A Brief History of Modula and Lilith
N. Wirth
Abstract
In the years 1978-1980 the workstation Lilith was developed at ETH Zurich. It featured a microprogrammed processor, a high-resolution display, and a mouse. The entire software was programmed in Modula-2, a language derived from Pascal within the Lilith project. The language featured the module construct with explicit interface specifications, and its implementation included the facility of separate compilation of modules with complete type checking.
Introduction
For a sizeable technical project to be undertaken, two prerequisites must be met. There must be a manifest need for a solution to a relevant problem, and there must exist an idea leading towards a solution of that problem. Such conditions emerged during a sabbatical leave which the author had the good fortune to spend at the Xerox Palo Alto Research Center (PARC) in 1977. The problem I am referring to was the need for a structured programming language for building large, complex systems, and the solution, or rather the feature needed, was a construct for expressing units of separately compiled system components, now known as modules.
At PARC I had been confronted with the language Mesa [1], which contained such a construct. However, I felt that this rather baroque descendant of Pascal could be molded more closely to the spirit of Pascal by a new attempt at defining a more frugal language [2, 3]. Its name, Modula, witnesses the dominance of the module concept in our concerns; the name was chosen – perhaps unfortunately – in preference to Pascal++.
The need for a structured language with a module facility was less pronounced in the software community at large than in our immediate environment. An explanation of this requires some digression. The computing facilities available in 1977 were essentially large-scale mainframes hosting sophisticated time-sharing systems, accessible only via terminals and remote satellites. The revolutionary concept of the powerful, personal workstation – the Alto computer developed at PARC – appeared to me like a revelation [4]. I was immediately convinced that there was no point in continuing development of software, except if based on and oriented towards this novel computing environment. However, such devices not being available on the market, there remained only one path to proceed, namely to design and build one on our own. Again, there was a recognized need and an idea of a solution. The project produced the workstation Lilith [5, 6].
There is no point in creating new hardware without new software. A basic operating system, utility programs, and first applications were to be developed concurrently, and therefore a programming language and its compiler were required as well. In fact, the primary incentive for designing Modula-2 was the need for a simple, all-round language capable of expressing the whole range of programs needed to turn Lilith into a powerful software development tool. The explicit goal was to use one and the same language for all Lilith software. Evidently, Modula and Lilith grew as a couple, and it would be futile to record the history of one without that of the other.
Although a preliminary document stating certain goals and concepts of the new language was written in 1977, the effective language design took place in 1978-79. Concurrently, a compiler implementation project was launched. The available machinery was a single DEC PDP-11 with a 64K byte store. The single-pass strategy of our Pascal compilers could not be adopted; a multipass approach was unavoidable in view of the small memory. It had actually been the Mesa implementation at PARC which proved possible what I had believed to be impracticable, namely building a complete compiler that operates on a small computer. The first Modula compiler, written by K. van Le (1977), consisted of 7 passes, each generating sequential output (intermediate code) written onto the (2 MB) disk. This number was reduced in a second design by U. Ammann to 5 passes (1979). The first pass, the scanner, generated a token string and a hash table of identifiers. The second pass (parser) performed syntax analysis, and the third the task of type checking. Passes 4 and 5 were devoted to code generation. This compiler was operational in early 1979.
Lilith
By this time the first prototype of Lilith, designed by R. Ohran and the author, was barely operational too. The primary design goal had been an architecture optimally tailored for interpreting M-code, the Modula compiler’s equivalent of Pascal’s P-code. It must be recalled that the era of unlimited memory still lay 10 years in the future. Hence, high code density was considered to be of paramount importance for complex system implementation on small workstations. Lilith was organized as a word-addressed 16-bit computer, M-code as a byte stream. Memory size was 2^16 words (128K bytes). The prototype was built with 4K x 1-bit memory parts.
The best choice for obtaining high code density was to define a fairly large variety of instructions, some of them rather complex, and to build the hardware as a micro-code interpreter. Each M-code instruction consisted of one or several bytes. The first byte was the operation code according to which a sequence of microcode instructions was selected for interpretation. The advantages of this scheme were manifold:
1. The actual hardware could be reasonably simple and therefore fast. The clock frequency being about 7 MHz, a cycle time of 140ns resulted. ALU-functions were addition and logical operations.
2. Simple M-code instructions were implementable with very short sequences of two or three micro-instructions only, requiring an execution time of 280 or 420 ns.
3. More complex instructions, such as multiplication and division, could be implemented by longer sequences of micro-instructions without requiring additional hardware (19 instr. for multiplication, 36 for division).
4. The same operation could be represented with different address or value parameter lengths of 1, 2, or 4 bytes, again without complicating hardware.
Two factors contributed primarily to the achieved high code density, which turned out to be superior to commercial processors by the remarkable factor of 2.5 (M68000) to 3.5 (I8086).
1. The use of several address and operand lengths of 4, 8, 16, and 32 bits. It turned out that more than 70% of all instruction parameters had values between 0 and 15, in which case the operand was packed together with the operation code in a single byte.
2. Lilith was equipped with a stack for intermediate results occurring in the evaluation of expressions. This stack was implemented as a 16-word, fast SRAM. M-code therefore contained no register numbers, as they were implied by the stack scheme.
The most visible innovation brought about by the personal workstation was the high-resolution, bit-mapped display, dramatically contrasting with the then customary displays of 24 lines of 80 characters. Together with the mouse, it opened a new era of interactive man-machine interfaces based on windows and cursors, scroll bars and title bars, pop-up menus and icons. The bit-mapped display with its data stored in main memory required special facilities to generate the video signal with sufficiently high bandwidth. Lilith featured a display of 768 x 592 pixels, which was improved in later versions to the vertical format with 704 x 928 pixels; 50 interlaced half frames required a bandwidth of almost 30 MHz.
The micro-code scheme proved to be most convenient for implementing a small set of raster operations for drawing lines, drawing characters, copying and replicating pixel-blocks. These operations were directly represented as single M-code instructions. The corresponding micro-code routines performed raster operations with spectacular efficiency. As far as the hardware was concerned, the only raster-op specific facility was a barrel shifter allowing an arbitrary shift amount in a single micro-cycle. No commercial microprocessor offered a comparable facility.
In the meantime, a new Modula compiler was designed by L. Geissmann and Ch. Jacobi with the PDP-11 compiler as a guide, but taking advantage of the features of Lilith. It consisted of four passes, code generation being simplified by the new architecture. Development took place on the PDP-11, followed by transporting it to Lilith. In the same manner, parts of the operating system Medos were implemented by S. Knudsen – a system essentially following in the footsteps of batch systems with program load and execute commands entered from the keyboard. Concurrently, display and window software was designed by Ch. Jacobi. It served as the basis of the first application programs, such as a text editor – mostly used for program editing – featuring the well-known techniques of a cursor/mouse interface and pop-up menus.
Modula-2 was now in daily use and quickly proved its worth as a powerful, efficiently implemented language. In December 1980, the first pilot series of 20 Liliths, manufactured in Utah under the supervision of R. Ohran, was delivered to ETH Zurich. Further software development proceeded with more than 20-fold hardware power at our disposal. A genuine personal workstation environment had successfully been established.
Modula
From the preceding account it should become clear under which conditions Modula came into existence. It is evident that our primary concern was not a language satisfying all wishes of all programmers, but the creation of a sufficiently powerful tool for implementing an entire software system in a short time with the limited manpower at hand. Modula-2 was, so to speak, a subordinate goal, a means to an end.
Modules and separate compilation
The decision to move from Pascal to a successor was primarily due to the high importance attributed to a clean formulation of the module concept and its implementation as a facility for separate compilation. Therefore, the solution adopted in several commercial extensions of Pascal, namely the merging of units in source form, was rejected. The notion of modules with explicit lists of imported and exported identifiers was adopted from Mesa. The symbol file, containing precompiled information about a module’s exported objects, also stemmed from Mesa. The module concept gave rise to a new view of Lilith’s operating system: the traditional division into system and application programs was replaced by a single hierarchy of modules which could be expanded or shrunk as momentary needs dictated.
Modula’s idea of a module was that of a syntactic unit encapsulated by a wall through which identifiers are invisible except if listed in the export or import list. Since syntactic constructs are usually defined recursively, modules are nestable with obvious consequences for identifier visibility. The inner, or local module turned out to be of limited usefulness in practice and was rarely used. Considering that implementations must handle global and local modules differently – the global ones are separately compilable – local modules should probably not have been introduced in the first place. Import lists could assume one of two forms. The simpler form consists of a list of module identifiers:
IMPORT M0, M1;
An object x exported from module M1, for example, is then referenced in a client of M1 by the qualified identifier M1.x. The second form of import list seeks to abbreviate program texts, letting the qualification be omitted by specifying the source of an identifier directly:
FROM M1 IMPORT x, y, z;
Like most abbreviations, its value is rather questionable. Not only does the compiler become more complicated, but a certain negative side-effect is unavoidable: identifier clashes. If two modules export an object with the same name, a clash occurs when both are imported in the second form, even if the identifier in question is never used.
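As a minimal sketch of such a clash – the module and identifier names here are purely illustrative – consider a client importing from two hypothetical modules M1 and M2 that both export an object x:

MODULE Client;
  FROM M1 IMPORT x;     (* x is now visible without qualification *)
  FROM M2 IMPORT x;     (* clash: a second x becomes visible, even if x is never used *)
BEGIN
  (* ... *)
END Client.

With the first, qualified form (IMPORT M1, M2) the two objects remain distinguishable as M1.x and M2.x, and no clash arises.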
On the other hand, critics of Modula expressed the opinion that our concept was too simple and restrictive. Their view of the module interface – i.e. of Modula’s definition part – was that of a selection of the module’s set of objects. Modula allowed only a single selection to be defined, whereas a multitude of different selections for different clients might be useful. This facility was implemented for the language Cedar [7]. It gives rise to further extensions and complications; an obvious desire is then to derive unions and intersections of selections.
A further point perhaps worth mentioning in this connection is the handling of exported enumeration types. The desire to avoid long export lists led to the (exceptional) rule that the export of an enumeration type identifier implies the export of all constant identifiers of that type. As nice as this may sound for the abbreviation enthusiast, it also has negative consequences, again in the form of identifier clashes. This occurs if two enumeration types are imported which happen to have at least one common constant identifier.
Furthermore, identifiers may now appear that are neither locally declared, nor qualified by a module name, nor visible in an import list; an entirely unexpected situation in a structured language.
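A hypothetical sketch of such a clash follows; the type and constant names are invented for illustration. Importing both enumeration types implicitly imports all their constant identifiers, and the two constants named undefined collide:

TYPE Color  = (red, green, blue, undefined);     (* exported, say, from a module Display *)
TYPE Status = (open, closed, undefined);         (* exported, say, from a module Files *)

A client importing Color and Status thereby also obtains red, green, blue, open, closed, and – twice – undefined: an identifier clash without any explicit mention of undefined in the import lists.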
Static type checking
Modula-2’s design followed uncompromisingly the rule – the dogma even – of strictly static typing, thereby empowering a compiler to perform all type consistency checks and ensuring that none of them are delayed until run-time.
In this respect, Pascal contained a few mistakes that had to be mended. One of them was the incompleteness of parameter specifications adopted from Algol 60 as exemplified by the following declarations:
PROCEDURE P(x, y: INTEGER); BEGIN … END ;
PROCEDURE Q(p: PROCEDURE);
VAR a, b: REAL;
BEGIN … p(a + b) … END ;
… Q(P) …
Here, the call p(a + b) – which stands for P(a + b) once Q(P) has been invoked – does not comply with the parameter specifications of P, but this fact cannot be discovered by a static analysis of the text. Modula mended this deficiency by requiring complete parameter specifications:
PROCEDURE Q(p: PROCEDURE (REAL, REAL));
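A minimal sketch of the effect – the procedure names are illustrative – shows how the complete specification lets the compiler reject a mismatched actual procedure at the call site:

PROCEDURE P(x, y: INTEGER); BEGIN (* ... *) END P;
PROCEDURE R(x, y: REAL);    BEGIN (* ... *) END R;

PROCEDURE Q(p: PROCEDURE (REAL, REAL));
  VAR a, b: REAL;
BEGIN
  p(a + b, b)     (* two REAL arguments, as the parameter type of p demands *)
END Q;

… Q(R) …          (* accepted: R matches PROCEDURE (REAL, REAL) *)
… Q(P) …          (* rejected at compile time: P expects INTEGER parameters *)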
Pascal permitted the use of integer as well as real operands in expressions (mixed expressions), postulating automatic conversion of number representation wherever necessary. The presence of a large number of basic types complicates the generation of code considerably, and hence it was decided that no automatic conversions would be generated. This, in turn, required that the language rules prohibit mixed expressions. This decision is indeed questionable, but perhaps understandable in view of our own immediate needs, which were very modest in the realm of arithmetic.
The type CARDINAL
How, then, can the introduction of the type CARDINAL – of natural numbers as a further basic data type – be explained? The need arose from the necessity to represent addresses as a numeric type. Lilith being a 16-bit machine with a store of 2^16 words, a useless sign bit could not be afforded. As unproblematic as this new type at first appeared, it gave rise to several problems. How, for example, is a comparison i < c, with i of type INTEGER and c of type CARDINAL, to be implemented efficiently?
The question is simply avoided if mixed expressions are prohibited. Two distinct sets of instructions are required for integer (signed) and cardinal (unsigned) arithmetic. Even if the same instruction can be used for addition, there exists a difference in the definition of overflow. The situation is even more problematic for division. In mathematics, the quotient q = x DIV y and the remainder r = x MOD y are defined by the relationships
x = q*y + r, 0 <= r < y
Integer division, in contrast to real division, is asymmetric with respect to 0. For example
10 DIV 3 = 3,      10 MOD 3 = 1
-10 DIV 3 = -4,    -10 MOD 3 = 2
Most computers, however, offer instructions that are wrong with respect to this established definition. They treat integer division like real division, symmetric with respect to 0:
(-x) DIV y = -(x DIV y)
(-x) MOD y = -(x MOD y)
If integer division is implemented correctly, multiplications and divisions by powers of 2 can be performed by simple shifts:
x * 2^n = left(x, n),    x DIV 2^n = right(x, n)
where left(x, n) and right(x, n) denote shifts of x by n bit positions. With the wrong division instructions, the second identity does not hold, and this optimization is not possible.
The following processors, among others, perform integer division contrary to its mathematical definition: Motorola M680x0, Intel 80x86, Intel 80960, DEC VAX, IBM Power. Even worse, the language Ada institutionalizes this mistake by declaring it as the standard.
We decided to adopt the wrong arithmetic in order to stay consistent with the mistaken practice that others demanded. In hindsight this was, of course, a grave mistake. The compiler uses shift instructions for multiplying and dividing by powers of 2 only for expressions of type CARDINAL.
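As an illustration of the difference – a sketch only, not the Lilith implementation – the mathematically correct quotient and remainder for a positive divisor can be derived from truncating operations by a simple adjustment:

PROCEDURE FloorDivMod(x, y: INTEGER; VAR q, r: INTEGER);
(* Assumes y > 0, and that DIV and MOD here denote the truncating
   operations offered by the processors listed above. *)
BEGIN
  q := x DIV y; r := x MOD y;
  IF r < 0 THEN                  (* correct the result so that 0 <= r < y *)
    q := q - 1; r := r + y
  END
END FloorDivMod;

For x = -10 and y = 3 the truncating operations yield q = -3 and r = -1; the adjustment produces q = -4 and r = 2, in accordance with the definition above.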
In summary, then, the type CARDINAL was justified by practical needs for 16-bit address handling. From the point of view of mathematicians (who invented negative numbers for the sake of algebraic completeness), however, it must be called a step backwards, as it introduced difficulties for the programmer that did not exist beforehand. Consider, for example, the following statements performing an operation Q for x = N-1, … , 1, 0:
x := N-1;
WHILE x >= 0 DO Q; x := x-1 END
If x is of type INTEGER, the program is correct; but if it is of type CARDINAL, it leads to an underflow which is avoided by the correct, alternative formulation
x := N;
WHILE x > 0 DO x := x-1; Q END
The advent of the 32-bit computer some 6 years later provided the welcome opportunity to forget the type CARDINAL. The episode shows how inadequacies in hardware may force the language implementor to accept complications and compromises if he wishes to fully exploit the available facilities. The lesson is that a memory size of 2^n requires arithmetic with (at least) n+1 bit integers.
Procedure types
An uncontroversial, fairly straightforward, and most essential innovation was the procedure type, also adopted from Mesa. In a restricted form it had also been present in Pascal, namely in the form of parametric procedures. Hence, the concept needed only to be generalized, i.e. made applicable to parameters and variables. Although the facility was used sparingly in Lilith software – with the exception of later editors and some instances in the operating system – it is nothing less than the basis for object-oriented programming, with record (object) fields having procedures as “values”. Nevertheless, Modula-2 cannot be claimed to support object-oriented programming.
The single facility missing is one for building new data types based on others, i.e. of types that are (upward) compatible with existing types, or – in object-oriented terminology – that inherit properties from existing types. This essential facility was introduced in Modula’s successor Oberon [8, 9], where it was appropriately called type extension [10].
Although Modula does not support object-oriented programming, it at least makes it possible through the use of the type ADDRESS and the rule that an address value may be assigned to any pointer variable. This implies of course that type checking be overruled, a sacrifice of security that weighs heavily. The “trick” lies in providing every record type that is to be extensible with an additional field of type ADDRESS. Record extensions then require a second block of storage, separately allocated, whose pointer is assigned to that field. Needless to say, the additional indirection in accessing fields of the extension, as well as the cumbersome formulation for breaching the type system, are unsatisfactory. Oberon’s neatly integrated concept of type extensions solves these problems admirably well; its price is a run-time check of type consistency in those cases where it is inherently unavoidable.
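The following sketch illustrates the technique described above; the type and field names are purely illustrative, and the hidden extension record must be allocated separately:

FROM SYSTEM IMPORT ADDRESS;

TYPE
  Figure = POINTER TO FigureDesc;
  FigureDesc = RECORD
    x, y: INTEGER;
    draw: PROCEDURE (Figure);   (* procedure-typed field acting as a "method" *)
    ext:  ADDRESS               (* points to a separately allocated extension *)
  END;
  Circle = POINTER TO CircleDesc;
  CircleDesc = RECORD radius: INTEGER END;

VAR f: Figure; c: Circle;

… NEW(c); c^.radius := 10;
  NEW(f); f^.ext := c;          (* an ADDRESS accepts any pointer value *)
  …
  c := f^.ext; …                (* and back again: type checking is overruled *)

Accessing the radius of a Figure thus always requires the indirection through ext, and nothing prevents assigning an address of the wrong kind – precisely the weaknesses that Oberon’s type extension later removed.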
Low-level facilities
Facilities that make it possible to express situations which do not properly fit into the set of abstractions constituting the language are called low-level facilities. In a way, their presence is a symptom of the incompleteness of the set of provided abstractions. Given our task of building an entire operating system for Lilith, however, they were unavoidable at the time. I nevertheless believe that they were introduced too light-heartedly, in the naive belief that programmers would use them only as a last resort.
In particular, the concept of the type transfer function was a major mistake. It allows a type identifier to be used in expressions as a function identifier: the value of T(x) is equal to x, where x is interpreted as being of type T. This interpretation inherently depends on the underlying (binary) representation of the type of x and of T. Therefore, every program making use of this facility is inherently implementation-dependent, a clear contradiction of the fundamental goal of high-level languages, i.e. of abstraction. The situation is particularly deplorable, because no hint of such dependence is given in a module’s heading.
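A small illustrative example; the outcomes noted in the comments presuppose Lilith’s 16-bit two’s complement representation:

VAR n: INTEGER; c: CARDINAL; s: BITSET;
…
n := -16;
c := CARDINAL(n);   (* the same 16-bit word, now read as an unsigned number: 65520 *)
s := BITSET(n);     (* the same word, now read as a set of 16 bits *)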
Into the same category of easily misused features belongs the variant record. The real stumbling block is the variant without a tag field. The tag field’s value is supposed to indicate the structure currently assumed by the record. If the tag is missing, no possibility exists for checking (or determining) the current variant. It is exactly this lack which can be misused to access record fields with “wrong” types:
VAR x: RECORD
      CASE : INTEGER OF
        0: a: INTEGER |
        1: b: CARDINAL |
        2: c: BITSET
      END
    END
After the assignment x.a := -16, the equalities x.b = 65520 and x.c = {4 .. 15} hold. Or might the latter be x.c = {0 .. 11}? Needless to say, such programs are ill-defined, hard to understand, and they defy the concepts of abstraction and portability.
After 1982, Modula-2 aroused interest in several industrial organizations. The first commercial compiler was built by Logitech S.A., and others followed [11]. IBM produced a compiler and programmed its AS-400 operating system in Modula, and DEC’s Systems Research Center adopted Modula-2 for its internal projects, at the same time extending the language into Modula-2+ [12]. Its creators found Modula-2 lacking in three areas: concurrency, storage management, and exception handling.
Concurrency
A project to investigate techniques and problems in multiprogramming had been undertaken at our Institute during the years 1974-76. It led to a small, experimental language implemented on the PDP-11 computer [13]. Its basis for multiprogramming was monitors and conditions (there called signals) as proposed by C.A.R. Hoare [14]. Monitors were generalized into modules; the concept of encapsulation was thereby disconnected from that of mutual exclusion. Interrupt handling was integrated into process scheduling in the sense that process switches could be triggered either by programmed signal operations (send, wait) or by interrupts. The latter were considered as external signals, thereby conceptually unifying declared signals and interrupts.
The project led to the conclusion that there existed no clearly preferable set of basic constructs for expressing concurrent processes and their cooperation, but that processes and threads, signals and semaphores, monitors and critical regions all have their advantages and disadvantages, depending on the particular application. Hence, it was decided that only the very basic notion of coroutines would be included in Modula-2, and that higher abstractions should be programmed as modules based on coroutines. This decision was all the more plausible because the demands of the Lilith software as a single-user operating system were easily met without sophisticated facilities for multiprogramming, which therefore did not merit attention with high priority. However, the belief that coroutines would also be a suitable basis for genuine multiprocessor implementations was mistaken, as was pointed out by the Modula-2+ authors.
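Such a higher abstraction could, for instance, take the form of a process/signal module built on the coroutine primitives NEWPROCESS and TRANSFER of module SYSTEM. The following definition part is a sketch in the spirit of the Processes module described in [2]; it is not the exact library text:

DEFINITION MODULE Processes;
  TYPE SIGNAL;                                   (* opaque type *)
  PROCEDURE StartProcess(P: PROC; n: CARDINAL);  (* start process P with a workspace of n storage units *)
  PROCEDURE SEND(VAR s: SIGNAL);                 (* resume one process waiting for s *)
  PROCEDURE WAIT(VAR s: SIGNAL);                 (* suspend the caller until s is sent *)
  PROCEDURE Awaited(s: SIGNAL): BOOLEAN;         (* is some process waiting for s? *)
  PROCEDURE Init(VAR s: SIGNAL);
END Processes.

The corresponding implementation part would maintain a queue of waiting coroutines per signal and perform the actual process switches with TRANSFER.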
We also abandoned the belief that interrupt handling should be treated by the same mechanism as programmed process switching. Interrupts are typically subject to specific real-time conditions. Real-time response is impaired beyond acceptable limits if interrupts are handled by very general, complicated switching and scheduling routines.
Storage management and garbage collection
Modula-2, like Pascal, features pointers and thereby implies dynamic storage allocation. Allocation of a variable x^ is expressed by the intrinsic procedure NEW(x), typically implemented by a system call.
The premise underlying the Modula-2 implementation was that programs would handle pools of dynamically allocated variables, one for each (record) type. Such individually programmed handlers could be tuned optimally to the specific conditions of an application, thus guaranteeing the most effective use of storage. This was of great importance at the time, considering the small storage size of computers and the absence of centralized storage management. Today’s very much larger memories render this strategy obsolete and make a flexible, centralized storage management indispensable.
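A minimal sketch of such an individually programmed pool – the names are illustrative – keeps released records of one type on a free list and reuses them, so that no centralized collector is needed:

TYPE Node = POINTER TO NodeDesc;
     NodeDesc = RECORD next: Node; key: INTEGER END;

VAR free: Node;   (* head of the free list of this one record type; must be initialized to NIL *)

PROCEDURE GetNode(VAR p: Node);
BEGIN
  IF free # NIL THEN p := free; free := free^.next   (* reuse a released node *)
  ELSE NEW(p)                                        (* otherwise ask the system allocator *)
  END
END GetNode;

PROCEDURE PutNode(p: Node);
BEGIN
  p^.next := free; free := p                         (* return the node to the pool *)
END PutNode;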
Were it not for the unfortunate and abundant features for breaching the type system, Modula-2 would still satisfy the needs imposed by centralized management, i.e. allow operation on the basis of global garbage collection. A garbage collector must, however, be able to rely on incorruptible type information at run-time. Type breaching must be impossible. In a way, Modula-2 was too “powerful”. This reminds us of the old wisdom that the power of a language is determined not only by what it allows us to express, but equally by what it prohibits us from expressing.
Exception handling
By handling an exception we generally understand the transfer of control to an implicitly designated statement sequence upon the rare occurrence of a certain condition. What distinguishes, then, “exception handling” from conventional if-statements, except the rarity of the condition? Three items come to mind:
1. The statement sequence (exception handler) resides in a module different from the one where the exception is detected (raised).
2. Several entered procedures need to be terminated exceptionally (aborted).
3. Separating the program text for handling the rare condition from the program specifying the “regular” behaviour is supposed to improve program clarity.
Only the second item calls for a facility that is not expressible by conventional constructs. The question then is: Does such a new facility correspond to an important abstraction that truly clarifies programs, or is it merely a convenient abbreviation for frequently occurring situations? And: Is an implementation possible that does not impair efficiency?
The first question can never be answered conclusively. It depends not only on personal preferences and programming style, but also on the applications considered. A proposal for the inclusion of an exception handling construct was already contained in a first draft of Modula-2, but was dropped later, because a need for it had not been convincingly established. The development of entire operating systems [15] at least showed that the omission of exception handling was not entirely mistaken. The emergence of systems with languages where exception handling was available and – as a consequence – heavily used and frequently even misused, suggests that the decision was right after all.
Nevertheless, I am aware that other people – no less experienced in system construction – do not share this view. One fact, however, is undisputed: a feature can always be added to, but never removed from, a language. It is therefore recommended to start out only with features of established indispensability.
Later developments
During 1984, the author designed a new compiler for Modula-2. I felt that many parts of compilation could be handled more simply and more efficiently if use were made of the now available larger store (which, by today’s standards, was still very small). Lilith’s 64K word memory and its high code density allowed the realization of a single-pass compiler. This implied a dramatic reduction of disk operations, which contributed the largest part of the compilation time. Indeed, the compilation time of the compiler itself was reduced from some 4 minutes to a scant 45 seconds.
The new compiler retained, however, the partitioning of tasks. Instead of each task constituting a pass – with sequential input from and output to disk – it constituted a module with a mostly procedural interface. Common data structures, such as the symbol table, were defined in a data definition module imported by (almost) all other modules. These modules represented a scanner, a parser, a code generator, and a handler of symbol files.
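To give a flavour of such a procedural interface – the module name, symbol names, and signatures here are hypothetical, not those of the actual compiler – a scanner’s definition part might look as follows:

DEFINITION MODULE Scanner;
  TYPE Symbol = (ident, number, lparen, rparen, becomes, semicolon, period, eof);
  VAR sym: Symbol;                      (* most recently read symbol *)
      id:  ARRAY [0..31] OF CHAR;       (* spelling, if sym = ident *)
  PROCEDURE GetSym;                     (* read the next symbol from the source text *)
  PROCEDURE Mark(msg: ARRAY OF CHAR);   (* report an error at the current position *)
END Scanner.

The parser then imports Scanner and calls GetSym wherever a pass-based design would have read the next token from an intermediate file.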
When a new compiler is developed, the temptation to include some changes in the language is seldom resisted. Also in the present case, some clarifications and changes were postulated and incorporated in a revised language definition. The only significant change, however, concerned definition modules (or rather: the definition part of a module). The change stipulated that all identifiers declared in a definition part are exported, which made explicit export lists superfluous.
On the side of the operating system a significant step forward was the implementation of a network. We adopted the experimental Ethernet technology from PARC, a bus topology with a 3 MHz bandwidth and Manchester encoding, yielding a transmission rate of 3 Mbits/s. Both hardware interfaces and the software were developed by J. Hoppe. The services were dominated by file transfer, and therefore the network was connected with the file system at a low level, such that access to both remote and local files was distinguished by a file-name prefix only.
The presence of a network connecting all workstations called for servers, i.e. for “impersonal workstations”. The first server that went into operation in 1982 was a print server connected to a laser printer (Canon LBP-10). Due to the flexibility and power of Lilith, a regular workstation could be used for generating the bitmap and for driving the printer. The hardware interface consisted merely of a direct memory access channel (DMA), effectively similar to a display driver, but suitably adapted to accept synchronization signals from the printer. Since memory was too small to hold an entire page (about 1 Mbit), bitmap generation and the transfer of the video data to the printer had to proceed concurrently. It fortunately turned out that Lilith was sufficiently powerful to allow bitmap generation on the fly. Thereby Lilith became the first computer in Europe to make full use of laser printing capabilities, printing texts with various fonts and styles, and graphics ranging from scanned pictures to circuit schematics.
The second server, operational in 1983, was a file server using, in contrast to regular workstations, not only a 10 MByte cartridge disk, but also a high-volume (500 MByte) Winchester drive (Fujitsu Eagle), which constituted leading-edge technology at the time. The large size required a different organization of the file system. Also, the concept of stable storage was implemented. This research was performed by F. Ostler.
A sizable effort extending over the years 1981-1985 went into the development of applications. In the forefront were document editors making use of all the novel facilities put at our disposal by Lilith, from bitmapped display and mouse to network and laser printer. The first attempt by J. Gutknecht and W. Winiger led to the editor Andra [16]. It employed the piece list technique for representing the edited text, pioneered by B. Lampson at PARC, as well as windows and pop-up menus for its user interface. It allowed the use of multiple fonts and various formatting modes, and it considered a document as being hierarchically structured. The follow-up project by J. Gutknecht and H. Schar led to the editor Lara [17] which became the chief application program for Lilith users for many years.
Other significant applications were a set of tools for interactive font design, implemented by E. Kohen, and a line-drawing graphics editor, developed by the author. The common characteristic of all these tools was that they went into immediate use by our own team, thus providing feedback not only about their own appropriateness, but also about the adequacy of Modula and of the entire system.
Lilith and Modula-2 were the backbone tools for education in programming at the Computer Science Department of ETH. About 60 Liliths were in use by 1982, offering a modern environment for programming and for document preparation about five years before similar tools were available commercially. The Lilith computers were decommissioned in 1990; they had been in daily use for 10 years.
References
1. Mesa Language Manual. Oct. 1976, Xerox PARC.
2. N. Wirth. Programming in Modula-2. Springer-Verlag, 1982.
3. J. Gutknecht. Tutorial on Modula-2. BYTE, Aug. 1984, 157-176.
4. C. P. Thacker, et al. Alto: A Personal Computer. CSL-79-11. Xerox PARC.
5. N. Wirth. Lilith: A Personal Computer for the Software Engineer. Proc. 5th Int’l Conf. on Software Engineering. San Diego, 1981. IEEE 81CH1627-9.
6. R. Ohran. Lilith and Modula-2. BYTE, Aug. 1984, 181-192.
7. W. Teitelman. A Tour through Cedar. IEEE Software, 1, 2 (April 1984) 44-73.
8. N. Wirth. The Programming Language Oberon. Software – Practice and Experience, 18, 7 (July 1988), 671-690.
9. M. Reiser and N. Wirth. Programming in Oberon. Addison-Wesley, 1992.
10. N. Wirth. Type Extension. ACM TOPLAS, 10, 2 (April 1988), 204-214.
11. P.H. Hartel and D. Starreveld. Modula-2 Implementation Overview. J. of Pascal, Ada, and Modula-2. July/Aug. 1985, 9-23.
12. P. Rovner. Extending Modula-2 to Build Large, Integrated Systems. IEEE Software, Nov. 1986, 46-57.
13. N. Wirth. Modula: A Language for Modular Multiprogramming. Software – Practice and Experience, 7 (1977), 3-35.
14. C.A.R. Hoare. Monitors: An Operating Systems Structuring Concept. Comm. ACM, 17, 10 (Oct. 1974), 549-557.
15. N. Wirth and J. Gutknecht. Project Oberon. Addison-Wesley, 1992.
16. J. Gutknecht, W. Winiger. Andra: The Document Preparation System of the Personal Workstation Lilith. Software – Practice and Experience, 14 (1984), 73-100.
17. J. Gutknecht. Concepts of the Text Editor Lara. Comm. ACM, 28, 9 (Sept. 1985), 942-960.