Process Structuring

Abstraction is a means of avoiding unwanted complexity. Previously we indicated how very complex processes could be obtained from simple ones by combination, and noted that their separate correctness was insufficient to guarantee correctness of the combination. Abstraction plays a crucial role in mastering the complexity of such combinations. It allows, for example, a correctness proof for an entire system to be constructed from separate proofs for each process (under certain assumptions about its environment), plus a proof of cooperation (i.e., that all environmental assumptions are satisfied). The use of abstraction to establish properties of combinations of processes is not new [13]. It is tempting to assume that an abstraction of a combination of processes is the same as the combination of their separate abstractions [20]. Unfortunately this is not generally true. EXAMPLE: Consider a combined group of processes that are synchronized by means of P and V operations on semaphores, and an interpretation that "loses" the values of the semaphores. Now the abstraction (under this interpretation) of the combination will still reflect this synchronization, while the combination of the abstractions cannot be synchronized by the "lost" semaphores.
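The semaphore example can be made concrete with a small sketch; here Python's threading module serves as a modern stand-in for P (acquire) and V (release), and the producer/consumer roles and the `log` variable are our own illustrative names, not from the text.

```python
import threading

# A semaphore initialized to 0: V (release) signals, P (acquire) waits.
sem = threading.Semaphore(0)
log = []

def producer():
    log.append("produced")
    sem.release()          # V(sem): signal that the item is available

def consumer():
    sem.acquire()          # P(sem): block until the producer has signalled
    log.append("consumed")

t1 = threading.Thread(target=consumer)
t2 = threading.Thread(target=producer)
t1.start(); t2.start()
t1.join(); t2.join()

# The semaphore guarantees this ordering; an abstraction that "loses"
# sem cannot explain why "consumed" never precedes "produced".
print(log)  # ['produced', 'consumed']
```

The combined system always yields the same ordering, but a combination of the two processes' separate abstractions (with `sem` erased) would admit interleavings the real system forbids.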
Computing Surveys, Vol. 5, No. 1, March 1973

Recently, some attention has been devoted to the problem of finding restrictions which ensure that a combination of abstractions accurately models the abstraction of the combination, or, equivalently, that combinations of refinements actually are a refinement of the intended combination [24]. Suppose that we wish to establish the correctness and cooperation of a group of combined processes. First, we may use abstractions which select only the state variable sets of single processes; each image process now represents a single component of the combination, which may be studied separately. (The image will, of course, reflect nondeterministic changes in the input variables caused by other processes.) Next, we may study the abstraction which reduces each sequence of actions within a single process that does not involve input-output to a single action. Finally, if we ensure the mutual exclusion of the sequences of actions which constitute the input-output operations of separate processes, we can safely use the abstraction in which each such operation becomes a single action. There are two basic ways of achieving mutual exclusion of operations in a system involving asynchronous combination. Recalling the techniques discussed in Section 3.4, the availability of even fairly simple operations which are mutually exclusive may be used to ensure the mutual exclusion of operations consisting of arbitrarily many actions. Thus, this aspect of the correctness of a system can be treated as a recursive problem, with the mutual exclusion of operations on each level dependent on the achievement of mutual exclusion on a lower level. Of course, this recursion must terminate. It seems that the only technique for achieving mutual exclusion which is not based on a lower-level mutual exclusion involves an active clocking process which "polls" the processes it is clocking, and allows the critical operations to proceed one at a time.
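The recursive construction can be sketched as follows: a primitive mutually exclusive operation (here Python's `Lock`, whose own exclusion is achieved at a level below the program) makes an arbitrarily long sequence of actions behave as a single action. The account-balance scenario and all names are illustrative assumptions of ours.

```python
import threading

# The lock is the lower-level mutually exclusive primitive; its own
# exclusion is provided by a level beneath this program.
lock = threading.Lock()
balance = 0

def credit(amount, times):
    global balance
    for _ in range(times):
        with lock:                 # the whole sequence becomes one "action"
            current = balance      # read
            current = current + amount  # modify
            balance = current      # write

workers = [threading.Thread(target=credit, args=(1, 10000)) for _ in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(balance)  # 40000: no interleaving can lose an update
```

Without the lock, the three-action read-modify-write sequence could interleave with another process's and lose updates; with it, the higher-level operation may safely be abstracted to a single action.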
To date, practical applications of abstraction and combination in structuring complex systems have relied on informal conditions to assure that arguments about abstractions could be carried over to their refinements. This has, for example, been the case in the work of Dijkstra [13] and of Zurcher and Randell [37]. Both papers concern design methodologies in which the concept of levels of abstraction plays a central role. The former paper describes the design and structure of the "THE" multiprogramming system. The outstanding feature of this design methodology is the careful use of structure (in particular, levels) to enable the designers to satisfy themselves, a priori, as to the logical "correctness" of the system. The aim is to show that whenever a process is presented with a task, it will, under all circumstances, complete the task within a finite time and return to its "homing position," ready to accept a new task. The proof proceeds in three stages: first, no process, while performing a single task, can lead to the generation of an infinite number of further tasks; second, when all processes have returned to their homing positions, no uncompleted tasks remain; third, there is no possibility of deadlock, so all processes must ultimately return to their homing positions. The feasibility of proofs of conjectures about systems as complex as the "THE" system depends strongly on the degree to which reliance on enumerative reasoning can be minimized [14]. The concept of multilevel processes is very useful in this regard. One can represent a group of sequential processes by a single image process, and prove that if this can progress, so can each of the set of processes of which it is an image. In further arguments it is then sufficient to satisfy oneself that the image process will always be able to progress. This technique can substantially reduce the number of situations which must be considered at each stage of the proof.
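The reduction in the number of situations to be considered can be illustrated with a toy trace abstraction: project the combined state history onto one process's state variables, then collapse the steps that are invisible at that level. The trace, the variables `x` and `y`, and the helper names are our own assumptions, not from the paper.

```python
# Combined system trace: interleaved updates of two processes'
# variables (x belongs to one process, y to another).
trace = [
    {"x": 0, "y": 0},
    {"x": 1, "y": 0},   # process 1 acts
    {"x": 1, "y": 5},   # process 2 acts (invisible to the x-image)
    {"x": 2, "y": 5},   # process 1 acts again
]

def project(states, keep):
    """Abstraction: select only one process's state variables."""
    return [{v: s[v] for v in keep} for s in states]

def collapse(states):
    """Drop stuttering steps, i.e. actions invisible at this level."""
    image = []
    for s in states:
        if not image or image[-1] != s:
            image.append(s)
    return image

image = collapse(project(trace, ["x"]))
print(image)  # [{'x': 0}, {'x': 1}, {'x': 2}]
```

The image process has fewer states and transitions than the combination; arguments about its progress then need to examine far fewer situations than arguments about the full interleaved trace.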
Dijkstra also notes that this approach has significant advantages in testing a system as it is implemented. "It seems to be the designer's responsibility to construct his mechanism in such a way--i.e. so effectively structured--that at each stage of the testing procedure the number of relevant test cases will be so small that he can try them all and that what is being tested will be so perspicuous that he will not have overlooked any situation." [13]

J. J. Horning and B. Randell

The use of multilevel processes described by Zurcher and Randell [37], on the other hand, grew out of the desire to simulate the design of a complex system as the design took shape. Thus, the simulation would gradually evolve and grow, and possibly become the actual system. This naturally placed very severe demands on the understandability and modifiability of the simulation program, which were met, at least in part, by constructing it as a set of distinct levels. Each level represented, at an appropriate degree of abstraction, the state of the system and those actions of the system best described in terms of that particular abstraction. The number of sequential processes on a given level would be chosen independently of the number on any other level. (For example, one level might represent each of the dynamically varying number of jobs in the system as a sequential process; another level, each of the hardware processors as a sequential process.) There was a distinct methodological difference between these two efforts. Dijkstra's approach consisted of successively forming simpler images of lower-level operations, whereas Zurcher and Randell's approach consisted of successively forming refinements of higher-level actions. There is probably no single "correct" order in which to take a series of design decisions, though it can usually be agreed that some orderings are better than others.
Almost invariably, some early decisions (thought to have been clearly correct when they were made) will turn out to have been premature. A more extensive discussion of this topic is contained in [27].

6. APPLICATIONS OF STRUCTURE

"The fact, then, that many complex systems have a nearly decomposable, hierarchic structure is a major facilitating factor enabling us to understand, to describe, and even to see such systems and their parts. Or perhaps the proposition should be put the other way round. If there are important systems in the world which are complex without being hierarchic, they may to a considerable extent escape our observation and our understanding. Analysis of their behavior would involve such detailed knowledge and calculation of the interactions of the elementary parts that it would be beyond our capacities of memory or computation." [30]

Structuring techniques, and formalisms for their description, are of value only as they are applied. Our formalism has been developed because it facilitates the careful consideration of both combination and abstraction within a uniform conceptual framework; we have concentrated on these two techniques because they are of profound importance in structuring the design and implementation of complex systems. Previous sections have discussed applications of particular techniques; in this concluding section we turn to more general uses. The importance of structuring is a result of its usefulness in mastering complexity. This applies whether one is trying to understand an existing system, or to design a proposed new system. The goal is to profit from this "mastery" by finding better ways of producing better systems, and, as an almost automatic by-product, better methods of documenting systems. However, it is important to recognize that structuring in itself is not necessarily beneficial; bad or excessive structuring may be valueless or even harmful.
EXAMPLE: A program which has been divided into too many subroutines may not only be unreadable, but may also execute very inefficiently. The appropriate use of structure is still a creative task, and is, in our opinion, a central factor of any system designer's responsibility. "When we cannot grasp a system as a whole, we try to find divisions such that we can understand each part separately, and also under

[1] Peter Wegner. Data structure models for programming languages, 1971, SIGP.

[2] Maurice V. Wilkes, et al. Time-sharing computer systems, 1968.

[3] Peter J. Denning, et al. Third generation computer systems, 1971, CSUR.

[4] Per Brinch Hansen, et al. The nucleus of a multiprogramming system, 1970, CACM.

[5] Butler W. Lampson, et al. A scheduling philosophy for multi-processing systems, 1967, SOSP.

[6] Peter Bryant. Levels of computer systems, 1966, CACM.

[7] Hugh C. Lauer. Correctness in operating systems, 1973.

[8] Melvin E. Conway, et al. Design of a separable transition-diagram compiler, 1963, CACM.

[9] Kristen Nygaard, et al. SIMULA: an ALGOL-based simulation language, 1966, CACM.

[10] Jerome H. Saltzer, et al. Traffic control in a multiplexed computer system, 1966.

[11] Jack B. Dennis, et al. Programming semantics for multiprogrammed computations, 1966, CACM.

[12] Marvin Minsky, et al. Computation: finite and infinite machines, 1967.

[13] Edsger W. Dijkstra, et al. The structure of the "THE"-multiprogramming system, 1968, CACM.

[14] Wladyslaw M. Turski. SODA - a dual activity operating system, 1968, Comput. J.

[15] David A. Bridger. Comments on "Levels of computer systems", 1967, CACM.

[16] J. E. Thornton, et al. Parallel operation in the Control Data 6600, 1964, AFIPS '64 (Fall, part II).

[17] A. Turing. On computable numbers, with an application to the Entscheidungsproblem, 1937.

[18] Donald E. Knuth, et al. Additional comments on a problem in concurrent programming control, 1966, CACM.

[19] A. Nico Habermann. Synchronization of communicating processes, 1972, CACM.

[20] John B. Johnston, et al. Structure of multiple activity algorithms, 1969, SOSP '69.

[21] Stephen W. Smoliar, et al. Music theory—a programming linguistic approach, 1972, ACM '72.

[22] Stephen W. Smoliar, et al. A parallel processing model of musical structures, 1971.

[23] J. Horning, et al. Structuring complex processes, 1969.

[24] Edsger W. Dijkstra, et al. Solution of a problem in concurrent programming control, 1965, CACM.

[25] Butler W. Lampson, et al. A scheduling philosophy for multiprocessing systems, 1968, CACM.

[26] Edsger W. Dijkstra, et al. Notes on structured programming, 1970.

[27] Allen Newell, et al. Computer structures: readings and examples, 1971.

[28] A. N. Habermann. On the harmonious co-operation of abstract machines, 1967.

[29] W. J. Chandler, et al. Interference between communicating parallel processes, 1972, CACM.