
Unix: A History and a Memoir

2025-03-31 22:51:08

Unix and its derivatives aren’t widely known outside a particular technical community,

2025-03-31 22:53:46

but any errors that remain are my fault, at least until I can safely blame them on someone else.

2025-03-31 22:54:12

an especially productive and formative time in the history of computing

2025-03-31 22:58:04

It’s a programming instance of an old strategy: divide and conquer. By breaking bigger tasks into smaller ones, each one becomes more manageable, and the pieces can be combined in unexpected ways.

2025-03-31 22:59:04

The Computing Science Research Center, the fabled “Center 1127,” or just “1127,” was unusually productive for two or three decades.

2025-03-31 23:00:35

joy of creation

2025-03-31 23:05:54

Early in its history, AT&T realized that it needed a research organization that would systematically address the scientific and engineering problems that the company encountered as it tried to provide a national telephone system. In 1925, it created a research and development subsidiary, Bell Telephone Laboratories, to attack these problems.

2025-04-01 22:56:45

Stable funding was a crucial factor for research. It meant that AT&T could take a long-term view and Bell Labs researchers had the freedom to explore areas that might not have a near-term payoff and perhaps never would. That’s a contrast with today’s world, in which planning often seems to look ahead only a few months, and much effort is spent on speculating about financial results for the next quarter.

2025-04-01 22:58:17

They eventually deduced that the noise came from the background radiation that was the residue of the cosmic Big Bang at the beginning of the universe. This discovery led to the 1978 Nobel Prize in physics for Penzias and Wilson. (Arno says that “Most people get Nobels for things they were looking for. We got one for something we were trying to get rid of.”) Another

2025-04-01 23:00:29

hope will give you some idea of the lucky accidents that led me to computing as a career

2025-04-01 23:09:21

Richard Hamming! My friendly next-door neighbor was famous, the inventor of error-correcting codes, and the author of the textbook for a numerical analysis course that I had just taken.

2025-04-01 23:09:46

but I enjoyed his company and over the years profited a great deal from his advice.

2025-04-02 22:27:22

“The purpose of computing is insight, not numbers,”

2025-04-02 22:28:37

“You and Your Research,” which you can find on the web. He gave the first version of that talk at Bellcore in March 1986; Ken Thompson drove me there so we could hear it. I’ve been recommending the talk to students for decades—it really is worthwhile to read the transcript, or to watch one of the video versions.

2025-04-02 22:31:34

The experience of teaching non-programmers turned out to be good fun. It got me over any fear of public speaking and made it easy to get into a variety of teaching gigs later on.

2025-04-02 22:40:56

When I got to Bell Labs as a permanent employee in 1969, no one told me what I should work on. This was standard practice: people were introduced to other people, encouraged to wander around, and left to find their own

2025-04-02 22:45:10

For the last decade of my time at the Labs, Ken Thompson and Dennis Ritchie’s offices were directly across the corridor from mine.

2025-04-02 22:54:01

For better or worse, I became head of a new department, 11276, with the carefully meaningless name “Computing Structures Research.”

2025-04-02 22:55:04

Understanding what each of them was working on well enough to explain it further up the chain was always a challenge, though it was also rewarding, and a surprising amount of what I learned then has stuck with me.

2025-04-02 22:56:24

The primary annual task for department heads was to assess the work of their department members in an elaborate ritual called “merit review.”

2025-04-02 22:58:23

Writing the assessment and feedback was hard work, and there was a strong tendency to leave the areas for improvement part blank, but one year we were told that it had to be filled in; evasions like leaving it empty or saying “N/A” were no longer acceptable. I came up with the phrase “Keep up the good work,” and got away with that for a year or two before being told that more critical comments were required, on the grounds that no one was perfect. Fortunately I didn’t have to do this for a star like Ken Thompson.

2025-04-02 23:00:41

The system did not seem to have much of a bias towards either practice or theory, at least for us in 1127—good programs and good papers were both valued.

2025-04-03 22:50:42

“At some point I realized that I was three weeks from an operating system.” Ken Thompson, Vintage Computer Festival East, May 4, 2019

2025-04-03 22:57:43

Operating systems today are big and complicated programs. Life was simpler in the 1960s but relative to the time, they were still big and complicated.

2025-04-03 22:58:38

further complicate matters, operating systems were written in assembly language, a human-readable representation of machine instructions, but very detailed and specific to the instruction repertoire of a particular kind of hardware. Each kind of computer had its own assembly language, so the operating systems were big and complicated assembly language programs, each written in the specific language of its own hardware

2025-04-03 22:59:35

This lack of commonality among systems and the use of incompatible low-level languages greatly hindered progress because it required multiple versions of programs: a program written for one operating system had to be in effect rewritten from scratch to move to a different operating system or architecture. As we shall see, Unix provided an operating system that was the same across all kinds of hardware, and eventually it was itself written in a high-level language, not assembly language, so it could be moved from one kind of computer to another with comparatively little effort.

2025-04-03 23:00:01

Most operating systems in that era were “batch processing.” Programmers put their programs on punch cards (this was a long time ago!), handed them to an operator, and then waited for the results to be returned, hours or even days later.

2025-04-03 23:00:37

Punch cards were made of stiff high-quality paper and could store up to 80 characters, typically a single line of a program, so the 6-line C program above would require 6 cards,
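
The six-line C program referred to isn’t reproduced in these notes; as a stand-in, a hello-world in C of about that size (a guess, not necessarily the book’s example) would take roughly one card per line:

    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");   /* one punch card per line of source */
        return 0;
    }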

2025-04-03 23:01:08

By contrast, CTSS programmers used typewriter-like devices (“terminals” like the Model 33 Teletypes in Figure 3.1 in the next chapter) that were connected directly or by phone lines to a single big computer, an IBM 7094 with

2025-04-03 23:02:11

The operating system divided its attention among the users who were logged in, switching rapidly from one active user to the next, giving each user the illusion that they had the whole computer at their disposal. This was called “time-sharing,” and (speaking from personal experience) it was indescribably more pleasant and productive than batch processing. Most of the time, it really did feel like there were no other users.

2025-04-03 23:03:38

the case with Multics. The phrase “over-engineered” appears in several descriptions, and Sam Morgan described it as “an attempt to climb too many trees at once.” Furthermore

2025-04-03 23:05:42

Multics was the source of many really good ideas, but its most lasting contribution was entirely unanticipated: its influence on a tiny operating system called Unix that was created in part as a reaction to the complexity of Multics.

2025-04-03 23:08:23

The PDP-7 was first shipped in 1964 and computers were evolving quickly, so by 1969 it was dated.

2025-04-03 23:13:18

Either way, UNICS somehow mutated into Unix, which was clearly a much better idea. (It was rumored that AT&T lawyers did not like “Unics”

2025-04-04 22:46:03

Ken and I retired from Bell Labs at the end of 2000. I went to Princeton, and he joined Entrisphere, a startup founded by Bell Labs colleagues. In 2006, he moved to Google, where with Rob Pike and Robert Griesemer, he created the Go programming language. I heard about his move from

2025-04-04 22:52:07

This was rejected too. A remark from Sam Morgan in Mike Mahoney’s 1989 oral history interview explains some of the reasoning: “The management principles here are that you hire bright people and you introduce them to the environment, and you give them general directions as to what sort of thing is wanted, and you give them lots of freedom. Doesn’t mean that you always necessarily give them all the money that they want. And then you exercise selective enthusiasm over what they do. And if you mistakenly discourage or fail to respond to something that later on turns out to be good, if it is really a strong idea, it will come back.”

2025-04-04 22:52:21

hindsight, being forced to work within constraints was a good thing.

2025-04-04 22:55:41

The deal was approved, a PDP-11 was purchased, and Ken and Dennis quickly converted the PDP-7 version of Unix to run on it. The PDP-11 was a limited machine, with only 24K bytes of primary memory and a half-megabyte disk. The implementation used 16K bytes for the operating system and the remaining 8K for user programs.

2025-04-05 22:58:18

Arguably, one reason why many command names on Unix are short is that it took considerable physical force to type on a Model 33, and printing was slow.

2025-04-05 23:05:20

hard to overstate how important the Unix room was for keeping up with what colleagues were doing, and for creating and maintaining a sense of community.

2025-04-07 23:40:01

Memory was often the most costly component of a computer. When every byte was precious, that scarcity imposed a certain discipline on programmers, who always had to be conscious of how much memory they were using, and sometimes had to resort to trickery and risky programming techniques to fit their programs into the available memory.

2025-04-07 23:41:18

It was only in the mid 1970s that new memory technology based on semiconductors and integrated circuits became widely available at a price where one could afford the moderate but measurable overhead of high-level languages like C.

2025-04-07 23:42:34

When a program failed badly enough, the operating system would notice and would try to help the programmer by producing a file of the contents of main memory—what was in the magnetic cores—from which comes the phrase “core dump,” still used though magnetic cores long ago left the scene. The file is still called core.

2025-04-07 23:45:24

The C programming language dates from early in the 1970s.

2025-04-07 23:46:03

It was based on Dennis’s experience with high-level languages for Multics implementation, but much reduced in size because most computers of the time had limited capacity; there simply wasn’t enough memory or processing power to support a complicated compiler for a complicated language. This enforced minimality matched Ken and Dennis’s preference for simple, uniform mechanisms. C was a good match as well for real computer hardware; it was clear how to translate it into good code that ran efficiently.

2025-04-07 23:46:33

made it possible to write the entire operating system in a high level language

2025-04-07 23:51:42

The first edition of Unix was up and running by late 1971,

2025-04-08 22:43:09

Multics, but was significantly simpler. Its clean, elegant design has over the years become widely used and emulated.

2025-04-08 22:44:07

A Unix file is simply a sequence of bytes. Any structure or organization of the contents of a file is determined only by the programs that process it; the file system itself doesn’t care what’s in a file. That means that any program can read or write any file. This idea seems obvious in retrospect, but it was not always appreciated in earlier systems, which sometimes imposed arbitrary restrictions on the format of information in files and how it could be processed by programs.

2025-04-08 22:46:34

Unix files are organized in directories. (Other operating systems often call these folders.) A Unix directory is also a file in the file system, but one whose contents are maintained by the system itself, not by user programs. A directory contains information about files, which may in turn be directories.
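
As a small illustration (using the modern POSIX opendir/readdir library interface, which is later than the systems described here, with a hypothetical directory argument), an ordinary program can list what a directory contains, and any entry may itself be a directory:

    #include <stdio.h>
    #include <dirent.h>

    int main(int argc, char *argv[])
    {
        const char *path = argc > 1 ? argv[1] : ".";   /* default: current directory */
        DIR *d = opendir(path);
        if (d == NULL) {
            perror(path);
            return 1;
        }
        struct dirent *entry;
        while ((entry = readdir(d)) != NULL)
            printf("%s\n", entry->d_name);   /* each entry names a file, possibly a directory */
        closedir(d);
        return 0;
    }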

2025-04-08 22:49:04

services like starting and stopping programs, reading or writing information in files, accessing devices and network connections, reporting information like date and time, and many others. These services are implemented within the operating system, and are accessible from running programs through a mechanism called system calls.
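
A minimal sketch of what that interface looks like from C, assuming a file named data.txt exists (the filename is only for illustration): each of open, read, write and close below is a system call, a request for a service from the operating system.

    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        char buf[512];
        ssize_t n;

        int fd = open("data.txt", O_RDONLY);        /* ask the system to open a file */
        if (fd < 0)
            return 1;
        while ((n = read(fd, buf, sizeof buf)) > 0) /* read up to 512 bytes at a time */
            write(1, buf, n);                       /* write them to standard output */
        close(fd);                                  /* release the file descriptor */
        return 0;
    }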

2025-04-09 21:08:32

One very important note: the shell is an ordinary user program, not some integral part of the operating system, another idea taken from Multics.

2025-04-09 21:11:40

These scripts are in effect new Unix commands, though highly specialized to me and this particular book. Such personal commands are a common use of shell scripts, a way to create shorthands for one’s own frequent computations. I still use some scripts that I wrote 30 or 40 years ago, and this is not at all unusual among long-time Unix users.

2025-04-09 21:13:17

If you find yourself doing the same sequence of commands over and over again, then you put them into a shell script and thus automate away some drudgery.

2025-04-09 21:17:48

The addition of pipes led to a frenzy of invention that I remember vividly.

2025-04-09 21:18:59

One of mine was based on the who command, which lists the currently logged-in users. A command like who isn’t terribly relevant today when most people work on their own computer, but since the essence of time-sharing was sharing the same computer, it was helpful to know who else was using the system at the same time. Indeed, who added to the sense of community: you could see who was also working, and perhaps get help if you had a problem, even if both parties were at their homes late at night.
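
A hedged sketch, in C, of roughly what the shell does to plumb a two-stage pipeline such as who | wc -l (counting logged-in users); the real shell is more general, but the ingredients are the same pipe, fork, dup2 and exec system calls:

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];
        if (pipe(fd) == -1) {                 /* one pipe connects the two commands */
            perror("pipe");
            exit(1);
        }
        if (fork() == 0) {                    /* first child runs "who" */
            dup2(fd[1], 1);                   /* its standard output goes into the pipe */
            close(fd[0]); close(fd[1]);
            execlp("who", "who", (char *)0);
            perror("who"); exit(1);
        }
        if (fork() == 0) {                    /* second child runs "wc -l" */
            dup2(fd[0], 0);                   /* its standard input comes from the pipe */
            close(fd[0]); close(fd[1]);
            execlp("wc", "wc", "-l", (char *)0);
            perror("wc"); exit(1);
        }
        close(fd[0]); close(fd[1]);           /* parent keeps no pipe ends open */
        while (wait(NULL) > 0)                /* wait for both children to finish */
            ;
        return 0;
    }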

2025-04-09 21:22:21

“The genius of the Unix pipeline is precisely that it is constructed from the very same commands used constantly in simplex fashion. The mental leap needed to see this possibility and to invent the notation is large indeed.”

2025-04-10 22:00:53

The name grep comes from a command in the ed text editor, g/re/p, that prints all lines that match the regular expression pattern re; the Oxford English Dictionary entry for grep

2025-04-10 22:04:03

In effect, a regular expression is a small language for describing text patterns.
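
A short sketch using the POSIX regex library (a much later, standardized descendant of the code in ed and grep) to show the point: the pattern below is a little program, in the regular-expression language, describing all strings shaped like a C identifier.

    #include <stdio.h>
    #include <regex.h>

    int main(void)
    {
        regex_t re;
        const char *pattern = "^[A-Za-z_][A-Za-z0-9_]*$";   /* a C identifier */
        const char *tests[] = { "grep", "g/re/p", "_unix7", "7unix" };

        if (regcomp(&re, pattern, REG_EXTENDED | REG_NOSUB) != 0)
            return 1;
        for (int i = 0; i < 4; i++)
            printf("%-8s %s\n", tests[i],
                   regexec(&re, tests[i], 0, NULL, 0) == 0 ? "matches" : "does not match");
        regfree(&re);
        return 0;
    }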

2025-04-10 22:06:14

Although wildcards are interpreted by the shell, because primary memory on the PDP-7 was so limited, the first implementation was a separate program called glob (for “global”) called by the shell, and the act of generating an expanded list of filenames from a pattern was called “globbing.”
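
A sketch of the same idea through the modern POSIX glob() library routine (a descendant of the separate glob program described here, not that program itself): the pattern *.c expands into a list of matching filenames.

    #include <stdio.h>
    #include <glob.h>

    int main(void)
    {
        glob_t g;
        if (glob("*.c", 0, NULL, &g) == 0) {       /* expand the wildcard pattern */
            for (size_t i = 0; i < g.gl_pathc; i++)
                printf("%s\n", g.gl_pathv[i]);     /* one matching filename per line */
            globfree(&g);
        }
        return 0;
    }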

2025-04-10 22:08:27

This is a common Unix story: a real problem from a real user, deep knowledge of relevant theory, effective engineering to make the theory work well in practice, and continuous improvement. It all came together because of broad expertise in the group, an open environment, and a culture of experimenting with new ideas.

2025-04-10 22:14:04

pointer is a value corresponding to an address, that is, a location in primary memory, and it has a type, the type of the objects that it will point to. If that location corresponds to an element of an array of that particular type of object, then in C, adding 1 to the pointer yields the address of the next element of the array.
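
A tiny C example of that scaling rule: p + 1 is the address of the next array element, not the next byte.

    #include <stdio.h>

    int main(void)
    {
        int a[4] = { 10, 20, 30, 40 };
        int *p = &a[0];                    /* p holds the address of a[0] */

        printf("%d %d\n", *p, *(p + 1));   /* prints 10 20 */
        printf("%d\n", p + 1 == &a[1]);    /* prints 1: same address */
        return 0;
    }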

2025-04-10 22:15:19

Ken tried three times in 1973 to write the kernel in C but it proved too difficult until Dennis added a mechanism for defining and processing nested data structures (struct) to the language. At that point, C was sufficiently expressive for writing operating system code, and Unix became mostly a C program. The 6th edition kernel has about 9,000 lines of C and about 700 lines of assembly language for machine-specific operations like setting up registers, devices and memory mapping.
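
A hypothetical fragment, only to illustrate the kind of nested data structures that struct made expressible; it is not the actual 6th edition kernel’s data layout.

    /* Nested structures of the kind an operating system needs:
       one structure embedded in, or pointed to from, another. */
    struct file {
        unsigned short mode;           /* permission bits */
        unsigned short nlinks;         /* how many directory entries refer to it */
        long size;                     /* length in bytes */
    };

    struct proc {
        int pid;                       /* process id */
        struct file *open_files[15];   /* per-process table of open files */
    };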

2025-04-10 22:17:14

wrote the first drafts of most of the tutorial material originally, but Dennis wrote the chapter on system calls, and of course provided the reference manual. We made many alternating passes over the main text, so that’s a blend of our styles, but the reference manual stayed almost exactly as it had been, a pure example of Dennis’s writing. It describes the language with what Bill Plauger once called “spine-tingling precision.” The reference manual is like C itself: precise, elegant, and compact.

2025-04-10 22:20:44

“C is quirky, flawed, and an enormous success.

2025-04-10 22:21:07

While accidents of history surely helped, it evidently satisfied a need for a system implementation language efficient enough to displace assembly language, yet sufficiently abstract and fluent to describe algorithms and interactions in a wide variety of environments.”

2025-04-11 22:16:19

By mid to late 1975, Unix had been publicly described at conferences and in journal papers and the 6th edition was in use at perhaps a hundred universities and a limited number of commercial operations.

2025-04-11 22:19:49

Ratfor was the first example of a language that based its syntax on C.

2025-04-11 22:20:15

Writing Fortran in Ratfor was, if I do say so myself, infinitely more pleasant than writing standard Fortran. Ratfor didn’t change Fortran semantics or data types—it had no features for processing characters, for instance—but for anything where Fortran would be a good choice, Ratfor made it better. Free-form input and C-like control flow made it feel almost like writing in C.

2025-04-11 22:25:13

Ken Thompson says that Doug is smarter than everyone else, which also seems accurate, though Doug himself says “I’ll leave to others to assess how smart I may be, but I know that many of BTL’s practicing mathematicians were much smarter.” Suffice it to say that there were many outstanding people at the Labs, the imposter syndrome was not unknown, and one was continuously stretched trying to keep up.

2025-04-12 23:10:14

No matter who is right here, Unix might not have existed, and certainly would not have been as successful, without Doug’s good taste and sound judgment of both technical matters and people.

2025-04-12 23:16:16

Doug was usually the first person to read drafts of papers or manuals, where he would deftly puncture rhetorical balloons, cut flabby prose, weed out unnecessary adverbs, and generally clean up the mess.

2025-04-12 23:18:29

In this chapter, we’ll see several threads of software development in 1127 that culminated in the 7th edition, which was released in January 1979, nearly four years after the 6th edition.

2025-04-12 23:22:01

The control-flow syntax of the new shell was unusual, since it was based on Algol 68, a language favored by Steve though not many others in 1127.

2025-04-12 23:22:28

For example, Algol 68 used reversed words as terminators, like fi to terminate if and esac to terminate case. But since od was already taken (for the octal dump command), do was terminated by done.

2025-04-13 21:48:44

We use language to communicate, and better languages help us to communicate more effectively.

2025-04-13 21:50:06

A good language lowers the barrier between what we want to say (“just do it”) and what we have to say to get some job done. A great deal of research in computing is concerned with how to create expressive languages.

2025-04-13 21:51:29

Computer languages are characterized by two main aspects, syntax and semantics.

2025-04-13 21:51:34

Syntax describes the grammar: what the language looks like, what’s grammatically legal and what’s not. The syntax defines the rules for how statements and functions are written, what the arithmetic and logical operators are, how they are combined into expressions, what names are legal, what words are reserved, how literal strings and numbers are expressed, how programs are formatted, and so on.

2025-04-13 21:52:40

Semantics is the meaning that is ascribed to legal syntax: what does a legal construction mean or do.

2025-04-13 21:53:28

compiler is a program that translates something written in one language into something semantically equivalent in another language. For example, compilers for high-level languages like C and Fortran might translate into assembly language for a particular kind of computer; some compilers translate from other languages, such as Ratfor into Fortran.

2025-04-13 21:55:43

expression := expression + expression
expression := expression * expression

and the corresponding semantic actions might be to generate code that would add or multiply the results of the two expressions together and make that the result. Yacc converts this specification into a C program that parses input and performs the semantic actions as the input is being parsed.

2025-04-13 21:57:48

“An unexpected spin-off from PCC was a program called Lint. It would read your program and comment on things that were not portable, or just plain wrong, like calling a function with the wrong number of arguments, inconsistent sizes between definition and use, and so on. Since the C compiler only looked at one file at a time, Lint quickly became a useful tool when writing multi-file programs. It was also useful in enforcing standards when we made V7 portable, things like looking for system calls whose error return was −1 (Version 6) instead of null (V7). Many of the checks, even the portability checks, were eventually moved into the C language itself; Lint was a useful test-bench for new features.”

2025-04-13 21:59:33

Today it lives on under its own name, in independent implementations like Bison that are derived from it, and in reimplementations in half a dozen other languages.

2025-04-13 22:01:10

“Lex was rewritten almost immediately by Eric Schmidt as a summer student. I had written it with a non-deterministic analyzer that couldn’t handle rules with more than 16 states. Al Aho was frustrated and got me a summer student to fix it. He just happened to be unusual.” Eric went on to a PhD at Berkeley, and was the CEO of Google from 2001 to 2011.

2025-04-13 22:01:27

Yacc and Lex work well together. Each time Yacc needs the next token while parsing, it calls on Lex, which reads enough input to identify a complete token and passes that back to Yacc.

2025-04-13 22:05:30

This is a good example of a general rule: if a program writes your code for you, the code will be more correct and reliable than if you write it yourself by hand. If the generator is improved, for example to produce better code, everyone benefits; by contrast, improvements to one hand-written program do not improve others.

2025-04-13 22:05:43

As Doug McIlroy says, “Anything you have to do repeatedly may be ripe for automation.”

2025-04-13 22:06:42

Steve Johnson complained about this to Stu Feldman (Figure 5.5) one day, after spending hours of fruitless debugging, only to realize that he had simply failed to recompile one of the files he had changed.

2025-04-13 22:07:28

He came up with an elegant idea, a specification language that describes how the pieces of a program depend on each other.

2025-04-13 22:09:30

One makefile could capture all the processing steps necessary to compile a new version of a program, and could also describe how to do related tasks like running Lint, making a backup and printing documentation.

2025-04-13 22:10:23

It’s also a nice example of a general problem that any successful program faces: if the program is good, it attracts users, and then it becomes hard to change the program in any incompatible way. Unix and most other systems are replete with examples of initial blemishes that are now too entrenched to fix.

2025-04-13 22:11:09

: rather than writing code or doing sequences of operations by hand, create a notation or specification that declares what has to be done, and write a program to interpret the specification. This approach replaces code with data, and that’s almost always a win

2025-04-14 21:52:42

Like so much of Unix, it’s a story of how the interactions among programs, programmers and users formed a virtuous cycle of innovations and improvements.

2025-04-15 21:33:02

was typographically rough in many places because we did not have a monospace font for displaying programs, but it was otherwise satisfactory.

2025-04-15 21:41:20

How did this relatively small group of researchers from industry manage to produce so many influential books? I can see several reasons. First and foremost, people took writing seriously, they took pains with their own writing, and they were great critical readers of what other people wrote.

2025-04-15 21:42:19

Of course Doug wasn’t the only critical reader. Everyone gave generously of their time; it was simply part of the culture that you provided detailed comments on what your colleagues wrote. This was unusual, and was one of the things that made the Labs a great place to be.

2025-04-16 21:46:44

One of the major simplifications of the Unix file system was its uniform treatment of files as sequences of uninterpreted bytes. There were no records, no required or prohibited characters, and no internal structure imposed by the file system—just bytes

2025-04-16 21:47:48

There was a similar simplification in the way that most Unix programs handled textual data. Text files were just sequences of bytes that happened to be characters in ASCII, the American Standard Code for Information Interchange.

2025-04-16 21:55:31

The pattern-action paradigm is a natural way to think about computations that are primarily a sequence of tests and actions.

2025-04-16 21:56:44

The real lesson is that wide-ranging interests, language expertise, and tools like Yacc and Lex made it possible for members of the Center to create new languages for new application areas relatively easily.

2025-04-16 21:58:11

I got to see the evolution of C++ from the beginning. At least in the early days I understood it, but it’s now a much bigger language and I’m barely literate in it

2025-04-16 21:59:17

C++ is often criticized for its size, and sometimes for some of the syntax that it inherited from C. I know from years of conversations that there isn’t anything in the language for which Bjarne didn’t have a good reason. It also was a sound engineering and marketing decision to make C++ a superset of C, even though that required including many of C’s syntactic and semantic rough spots. If Bjarne had not aimed at C compatibility, C++ would have had much less chance of success. It’s hard to establish a new language; making it compatible at both source level (for cultural familiarity) and object level (use of existing C libraries) was crucial, and at the time so was making it as efficient as C

2025-04-16 22:02:32

Spin can be used to verify that a particular system is logically correct, free of defects like deadlock where no progress can be made. (“After you.” “No, after you.”)

2025-04-16 22:05:35

In the early 1980s, Rob Pike (Figure 5.18) and Luca Cardelli experimented with languages for concurrency, especially for interactions with input devices like mice and keyboards; that led to the names Squeak and Newsqueak. The ideas from Newsqueak eventually found their way into the concurrent languages Limbo and Alef that were used in Plan 9, and a decade later into the Go programming language, which was created at Google in 2008 by Rob Pike, Ken Thompson and Robert Griesemer.

2025-04-16 22:06:58

might be expected for a scientific research operation, Bell Labs was involved very early in the use of computers for modeling and simulation of physical systems and processing, a natural extension of mathematical research

2025-04-16 22:18:13

When Ken and Dennis won the Turing Award in 1983, Ken’s prescient talk, “Reflections on Trusting Trust,” explained a series of modifications that he could make to a compiler that would eventually install a Trojan horse in the login program for a system.

2025-04-16 22:18:07

“You can’t trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code.”

2025-04-16 22:18:31

he noted, the same kinds of tricks can be applied to hardware, where they will be even harder to discover. Things have not gotten better in the interim, and the paper is still highly relevant today.

2025-04-16 22:22:17

The 5620 was a good graphics terminal, though physically heavy and bulky. I used it to write graphical programs such as a Troff previewer. It was also the environment in which Rob Pike wrote a series of mouse-based text editors, one of which I use by preference even today: this book was written with Sam.

2025-04-16 22:23:36

At the time, Bell Labs was using state of the art 3.5 micron technology; today’s circuits are usually 7 to 10 nanometers, an improvement of at least 300 in line widths, and thus about 100,000 in the number of devices in a given area.

2025-04-18 22:08:30

1973, AT&T began licensing Unix to universities for a nominal fee, though most licenses were for the 6th edition, which became available in 1975

2025-04-18 22:07:50

One of the most active license recipients was the University of California at Berkeley (UCB), where a number of graduate students made major contributions to the system that eventually became the Berkeley Software Distribution (BSD), one of two main branches that evolved from the original Research Unix.

2025-04-18 22:09:13

Bill Joy (Figure 6.2), a grad student at the time, modified the local version of Unix, and added some programs of his own, including the vi text editor, which is still one of the most popular Unix editors, and the C shell csh. Bill later designed the TCP/IP networking interface for Unix that is still used today. His socket interface made it possible to read and write network connections with the same read and write system calls as were used for file and device I/O, so it was easy to add networking functionality.
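
A hedged C sketch of that point about the socket interface, using the modern getaddrinfo helper and a placeholder host: once the connection is made, the descriptor is used with the very same write and read calls as a file.

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    int main(void)
    {
        struct addrinfo hints, *res;
        memset(&hints, 0, sizeof hints);
        hints.ai_family = AF_UNSPEC;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo("example.com", "80", &hints, &res) != 0)   /* placeholder host */
            return 1;

        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0)
            return 1;

        char req[] = "HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n";
        write(fd, req, strlen(req));                  /* same write() as for a file */

        char buf[512];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)   /* same read() as for a file */
            fwrite(buf, 1, n, stdout);
        close(fd);
        freeaddrinfo(res);
        return 0;
    }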

2025-04-18 22:11:42

don’t recall what I said at the time about the editor itself (though today vi is one of the two editors that I use most often), but I do remember telling Bill that he should stop fooling around with editors and finish his PhD. Fortunately for him and for many others, he ignored my advice. A few years later, he dropped out of graduate school to co-found Sun Microsystems, one of the first workstation companies, with software based on Berkeley Unix, including his fundamental work on the system, networking and tools (and his vi editor). I often cite this story when students ask me for career advice—older is not always wiser

2025-04-18 22:15:06

The first Unix user group meeting was held in New York in 1974, and user groups gradually sprang up all over the world. In

2025-04-18 22:15:47

The user groups evolved into an umbrella organization called “Unix User Groups,” which was renamed USENIX (Usenix from now on) after AT&T complained about misuse of the Unix trademark. Usenix now runs an extensive series of professional conferences and publishes a technical journal called “;login:”. Usenix played a significant role in spreading Unix, with conference presentations and tutorials on many subjects. It also distributed UUCP and ran the Usenet news system.

2025-04-18 22:19:36

far as I know, Ken always subscribed fully to the idea that good code doesn’t need many comments, so by extrapolation, great code needs none at all;

2025-04-18 22:21:38

“The cooperation of the nroff program must also be mentioned. Without it, these notes could never have been produced in this form. However it has yielded some of its more enigmatic secrets so reluctantly, that the author’s gratitude is indeed mixed. Certainly nroff itself must provide a fertile field for future practitioners of the program documenter’s art.”

2025-04-18 22:25:35

The workstation marketplace arose because technological improvements made it possible to pack serious computing horsepower into a small physical package and sell it for a modest price. The complete system price could be reasonable in part because software, including the operating system, was already available. There was no need for new manufacturers to create a new operating system—it was enough to port Unix and its accompanying programs to whatever processor the computer used. The workstation market was thus helped significantly by the availability of Unix.

2025-04-20 22:41:10

It had been argued that AT&T was prohibited from selling Unix commercially because as a regulated public monopoly, if it did so, it would be competing with other operating system vendors, using revenues from telephone services to cross-subsidize Unix development.

2025-04-20 22:46:13

Beginning in 1984 USL marketed Unix aggressively, and worked hard to make it a professional commercial product. The culmination was a version called System V Release 4, or SVR4.

2025-04-21 17:53:43

people inside Bell Labs were required to use the name correctly. In particular, it could not be used as a standalone noun (“Unix is an operating system”). It had to be identified as a trademark and also had to appear as an upper-case adjective in the phrase “the UNIX™ operating system,” which led to awkward sentences like “The UNIX™ operating system is an operating system.” Rob Pike and I had to fight this battle over the title

2025-04-21 18:00:49

There were internal versions like PWB that supported the Programmer’s Workbench tools, of course, but starting in 1975, external versions appeared as well, originally based on the 6th edition, and then later on the 7th edition, which appeared in 1979. The 7th edition was the last Research version of Unix to be released and widely used. Three more editions were developed and used internally (predictably called 8th, 9th, and 10th) but by the time the 10th edition was completed in late 1989, it was clear that the center(s) of gravity of Unix development had moved elsewhere.

2025-04-21 18:01:21

Two threads evolved from the 7th edition, one from Berkeley that built on the work of Bill Joy and colleagues, and another from AT&T as it tried to build a money-making business out of its Unix expertise and ownership.

2025-04-21 18:03:51

11/780. The VAX was a 32-bit machine with substantially more memory and computing horsepower than the PDP-11

2025-04-21 18:04:01

Research Group at the University of California, Berkeley, started with Reiser and London’s 32/V and added code to use virtual memory. This version quickly supplanted 32/V and the VAX itself became the primary Unix machine for most users as they outgrew the PDP-11. The Berkeley version was packaged and shipped to Unix licensees as BSD, the Berkeley Software Distribution. BSD descendants are still active, with variants like FreeBSD, OpenBSD and NetBSD all continuing development. NeXTSTEP, used for Apple’s Darwin, the core of macOS, was also a BSD derivative.

2025-04-21 18:05:29

base of SunOS, which was used on computers from Sun Microsystems, co-founded by Bill Joy. Others spun off a few years later into the BSD variants mentioned above. All of these eventually were reimplementations that provided the same functionality but with entirely new code. Once rewritten, they were free of AT&T code and thus did not infringe AT&T’s intellectual property. Another spinoff

2025-04-21 18:05:52

was created for NeXT, which was founded by Steve Jobs in 1985. The NeXT workstation had a variety of innovative features, and was an early example of the elegant and polished industrial design that Apple users are familiar with. I was in the audience at Bell Labs on December 11, 1990, when Jobs gave a demonstration of the NeXT. It was a very nice machine, and it was the only time that I can recall thinking “I want one of those” about any technological gadget. I had obviously been seduced by the famous Jobs “reality distortion field.” When he did another presentation at the Labs three years later, there was no such effect, and I don’t even remember what he was showing off.

2025-04-21 18:06:59

The timeline reveals another little known fact: in the 1980s Microsoft distributed a version of Unix called Xenix;

2025-04-21 18:07:17

One wonders how different the world would be today if Microsoft had pushed Xenix instead of its own MS-DOS, and

2025-04-21 18:09:46

In the late 1980s there were numerous vendors of Unix systems, all using the trademarked name, and purveying software at least originally based on Version 7 from Bell Labs research. There were incompatibilities, however, particularly between AT&T’s System V and the Berkeley distributions. All parties agreed that it would be highly desirable to have a common standard, but naturally disagreed on what it would be. X/Open, an industry consortium, was formed in 1984 to try to create a standard source-code environment so that programs could be compiled on any Unix system without change.

2025-04-21 18:10:24

Berkeley had made many changes in the AT&T code, and added much valuable material of their own, including the TCP/IP code that made the Internet accessible.

2025-04-21 18:13:51

AT&T’s licensing of Unix became more and more restrictive as the company tried to make money from the software. This included restrictions on how Unix could be used in universities, which gave an advantage to BSD, which had no such constraints. At the same time, the ongoing wars between AT&T and BSD encouraged others to try rolling their own Unix-like systems. Independently created versions were free of commercial restrictions, since they used only the system call interface, but no one else’s code.

2025-04-21 18:17:15

Disclosure: I’ve signed on to a couple of amicus briefs on the Google side here, since I do not believe that APIs should be copyrightable. If they were, we would not have had any of the Unix lookalikes, including Linux, since they are based on independent implementation of the Unix system call interface.

2025-04-21 22:32:45

Ken Thompson, Rob Pike, Dave Presotto and Howard Trickey—gathered together to work on a new operating system, which they called Plan 9 from Bell Labs after the 1959 science-fiction movie Plan 9 from Outer Space.

2025-04-21 22:33:46

Plan 9 the operating system was in part an attempt to take the good ideas in Unix and push them further. For example, in Unix, devices were files in the file system. In Plan 9, many more data sources and sinks were files as well, including processes, network connections, window-system screens and shell environments. Plan 9 also aimed for portability right from the beginning, with a single source that could be compiled for any supported architecture. Another outstanding feature of Plan 9 was its support for distributed systems. Processes and files on unrelated systems with different architectures could work together exactly as if they were in the same system.

2025-04-21 22:37:33

Ken Thompson and Rob Pike wrestled with this issue for Plan 9, since they had decided that Plan 9 would use Unicode throughout, not ASCII. In September 1992, they came up with UTF-8, a clever variable-length encoding of Unicode. UTF-8 is efficient in both space and processing time. It represents each ASCII character as a single byte, and uses only two or three bytes for most other characters, with a maximum of four bytes. The encoding is compact, and ASCII is legal UTF-8. UTF-8 can be decoded as it is read, since no legal character is a prefix of any other character, nor is any character part of any other character or sequence of characters. Almost all text on the Internet today is encoded in UTF-8; it is used everywhere by everyone.
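
A small C sketch of the encoding rules just described (the code point for Greek lambda is used as an example): ASCII fits in one byte, and the high bits of the first byte say how many bytes follow.

    #include <stdio.h>

    /* Encode one Unicode code point as UTF-8; returns the number of bytes. */
    int utf8_encode(unsigned long cp, unsigned char out[4])
    {
        if (cp < 0x80) {                     /* ASCII: 0xxxxxxx */
            out[0] = cp;
            return 1;
        } else if (cp < 0x800) {             /* 110xxxxx 10xxxxxx */
            out[0] = 0xC0 | (cp >> 6);
            out[1] = 0x80 | (cp & 0x3F);
            return 2;
        } else if (cp < 0x10000) {           /* 1110xxxx 10xxxxxx 10xxxxxx */
            out[0] = 0xE0 | (cp >> 12);
            out[1] = 0x80 | ((cp >> 6) & 0x3F);
            out[2] = 0x80 | (cp & 0x3F);
            return 3;
        } else {                             /* 11110xxx plus three continuation bytes */
            out[0] = 0xF0 | (cp >> 18);
            out[1] = 0x80 | ((cp >> 12) & 0x3F);
            out[2] = 0x80 | ((cp >> 6) & 0x3F);
            out[3] = 0x80 | (cp & 0x3F);
            return 4;
        }
    }

    int main(void)
    {
        unsigned char b[4];
        int n = utf8_encode(0x03BB, b);      /* Greek lambda */
        for (int i = 0; i < n; i++)
            printf("%02X ", b[i]);           /* prints CE BB */
        printf("\n");
        return 0;
    }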

2025-04-21 22:40:34

Lucent went through a boom and then a bust, with some questionable business practices en route. As it struggled to survive, it spun off its enterprise communications services business into a company called Avaya in 2000, and its integrated-circuit business into another called Agere in 2002.

2025-04-22 22:20:07

The important technical ideas from Unix have been discussed in the first few chapters of the book; this section is a brief summary. Not everything here originated with Unix, of course; part of the genius of Ken Thompson and Dennis Ritchie was their tasteful selection of existing good ideas, and their ability to see a general concept or a unifying theme that simplified software systems. People sometimes talk of software productivity in terms of the number of lines of code written; in the Unix world, productivity was often measured by the number of special cases or lines of code that were removed.

2025-04-22 22:21:24

Naturally, there are some irregularities. Devices appear in the file system, which is a simplification, but operations on them, especially terminals, have special cases and an interface that remains messy even today.

2025-04-22 22:22:01

The brilliance of Unix was in choosing an abstraction that was general enough to be remarkably useful, yet not too costly in performance.

2025-04-23 21:50:47

One of the reasons there were so many languages is the development of tools that enabled non-experts to create them. Yacc and Lex are the primary examples here, and they are specialized languages in their own right.

2025-04-23 21:54:42

The Unix philosophy certainly doesn’t solve all the problems of programming, but it does provide a useful guide for approaching system design and implementation.

2025-04-23 21:55:26

believe that another large component of the success of Unix was due to non-technical factors, like the managerial and organizational structure of Bell Labs, the social environment of 1127, and the flow of ideas across a group of talented people working on diverse problems in a collegial environment.

2025-04-23 21:56:34

The long-term Bell Labs goal of continuously improving telephone service meant that researchers could explore ideas that they thought were important, for long periods, even years, without having to justify their efforts every few months. There

2025-04-23 21:58:11

Certainly at some level someone did worry about such matters, but not the people doing the research. There were no research proposals, no quarterly progress reports, and no need to seek management approval before working on something.

2025-04-23 21:59:09

There were occasional periods when travel was scrutinized more carefully—we might be limited to one or two conferences a year, perhaps—but for the most part, if we needed to buy equipment or make a trip, money was available without much question.

2025-04-23 22:02:16

Secretive companies had a harder time attracting talent, something that appears to still be true today.

2025-04-23 22:10:54

Fun. It’s important to enjoy your work and the colleagues that you work with. 1127 was almost always a fun place to be, not just for the work, but the esprit of being part of a remarkable group. Since there were no local options other than the company cafeteria, lunchtime provided a mix of social and technical discussions.

2025-04-24 21:56:57

By far the most elaborate prank was played on Arno Penzias by a team of at least a dozen, led by Rob Pike and Dennis Ritchie, with the aid of professional magicians Penn and Teller. It’s too long for the book, but Dennis tells the Labscam story at www.bell-labs.com/usr/dmr/www/labscam.html, and the video is at www.youtube.com/watch?v=if9YpJZacGI. I’m in the credits at the end as a gaffer, which is accurate—much duct tape was involved.

2025-04-24 21:58:32

It takes effort to build and maintain an organization whose members like and respect each other, and who enjoy each other’s company. This can’t be created by management fiat, nor by external consultants; it grows organically from the enjoyment of working together, sometimes playing together, and appreciating what others do well.

2025-04-24 21:59:41

“The success of the Unix system stems from its tasteful selection of a few key ideas and their elegant implementation. The model of the Unix system has led a generation of software designers to new ways of thinking about programming. The genius of the Unix system is its framework, which enables programmers to stand on the work of others.”

2025-04-24 22:02:16

“Unix was also a major driving force behind the development of the Internet. University of California, Berkeley developed Berkeley Software Distribution (BSD), an extended version of Unix that was implemented with the Internet protocol suite TCP/IP. The development was based on the sixth edition of Unix that Bell Labs distributed along with its source code to universities and research institutions in 1975, which led to the beginning of an ‘open source’ culture. BSD Unix helped the realization of the Internet.”

2025-04-24 22:05:16

For example, the number of people contributing to Unix in the early days was tiny; arguably the core was a single person, Ken Thompson, who is certainly the best programmer I have ever met, and an original thinker without peer. Dennis Ritchie, whose name is linked with Ken as the co-creator of Unix, was a vital contributor, and his C programming language, central to the evolution of Unix in the early days, is still the lingua franca of computing.

2025-04-24 22:07:49

The big secret to doing good research is to hire good people, make sure there are interesting things for them to work on, take a long view, and then get out of the way. It certainly wasn’t perfect, but Bell Labs research generally did this well. Of course computing didn’t exist in a technological vacuum. The invention of the transistor and then integrated circuits meant that for 50 years computing hardware kept getting smaller, faster, and cheaper at an exponential rate. As hardware got better, software became easier, and our understanding of how to create it got better as well. Unix rode the technology improvement wave, as did many other systems.

2025-04-24 22:14:33

The May 2019 fireside chat with Ken Thompson at the Vintage Computer Festival East is on YouTube at www.youtube.com/watch?v=EY6q5dv_B-o.