Computerworld

Unix turns 40: The past, present and future of the OS

Forty years ago this summer, a programmer sat down and knocked out in one month what would become one of the most important pieces of software ever created.

In August 1969, Ken Thompson, a programmer at AT&T Bell Laboratories, saw the monthlong absence of his wife and young son as an opportunity to put his ideas for a new operating system into practice. He wrote the first version of Unix in assembly language for a wimpy Digital Equipment Corp. PDP-7 minicomputer, spending one week each on the operating system, a shell, an editor and an assembler.

Thompson and a colleague, Dennis Ritchie, had been feeling adrift since Bell Labs had withdrawn earlier in the year from a troubled project to develop a time-sharing system called Multics, short for Multiplexed Information and Computing Service. They had no desire to stick with any of the batch operating systems that predominated at the time, nor did they want to reinvent Multics, which they saw as grotesque and unwieldy.

After batting around some ideas for a new system, Thompson wrote the first version of Unix, which the pair would continue to develop over the next several years with the help of colleagues Doug McIlroy, Joe Ossanna and Rudd Canaday. Some of the principles of Multics were carried over into their new operating system, but the beauty of Unix then (if not now) lay in its "less is more" philosophy.

"A powerful operating system for interactive use need not be expensive either in equipment or in human effort," Ritchie and Thompson would write five years later in the Communications of the ACM (CACM), the journal of the Association for Computing Machinery. "[We hope that] users of Unix will find that the most important characteristics of the system are its simplicity, elegance, and ease of use."

So What Is 'Unix,' Anyway?

Unix, most people would say, is an operating system written decades ago at AT&T's Bell Labs, and its descendants. Today's major versions of Unix branched off a tree with two trunks: one emanating directly from AT&T and one from AT&T via the University of California, Berkeley. The stoutest branches today are AIX from IBM, HP-UX from Hewlett-Packard and Solaris from Sun Microsystems.

However, The Open Group, which owns the Unix trademark, defines Unix as any operating system it has certified as conforming to the Single Unix Specification (SUS). This includes operating systems that are usually not thought of as Unix, such as Mac OS X Leopard (which descended from BSD Unix) and IBM's z/OS (which descended from the mainframe operating system MVS), because they conform to the SUS and support SUS APIs. The basic idea is that it is Unix if it acts like Unix, regardless of the underlying code.
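
To make "SUS APIs" a little more concrete, the short C program below is an illustrative sketch added here (it is not drawn from The Open Group's materials). It calls a few interfaces the SUS standardizes -- getpid(), sysconf() and gethostname() -- which behave the same way on any certified system, whatever code lies underneath.

    /* A handful of interfaces standardized by the Single Unix Specification.
     * On any certified system -- whether its ancestry runs through AT&T,
     * BSD or a mainframe lineage -- this compiles and behaves the same way. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        char host[256];

        printf("process id: %ld\n", (long)getpid());
        printf("page size:  %ld bytes\n", sysconf(_SC_PAGESIZE));

        if (gethostname(host, sizeof host) == 0)
            printf("host name:  %s\n", host);

        return 0;
    }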

A still broader definition of Unix would include Unix-like operating systems -- sometimes called Unix "clones" or "look-alikes" -- that copied many ideas from Unix but didn't directly incorporate code from Unix. The leading one of these is Linux.

Finally, although it's reasonable to call Unix an "operating system," as a practical matter it is more than that. In addition to an OS kernel, Unix implementations typically include utilities such as command-line editors, along with APIs, development environments, libraries and documentation.

Apparently, they did. Unix would go on to become a cornerstone of IT, widely deployed to run servers and workstations in universities, government facilities and corporations. And its influence spread even further than its actual deployments, as the ACM noted in 1983 when it gave Thompson and Ritchie its top prize, the A.M. Turing Award for contributions to IT: "The model of the Unix system has led a generation of software designers to new ways of thinking about programming."

Of course, Unix's success didn't happen all at once. In 1971, it was ported to the PDP-11 minicomputer, a more powerful platform than the PDP-7. Text-formatting and text-editing programs were added, and it was rolled out to a few typists in the Bell Labs patent department, its first users outside the development team.

In 1972, Ritchie wrote the high-level C programming language (based on Thompson's earlier B language); subsequently, Thompson rewrote Unix in C, greatly increasing the operating system's portability across computing environments. Along the way, it picked up the name Unics (Uniplexed Information and Computing Service), a play on Multics; the spelling soon morphed into Unix.

It was time to spread the word. Ritchie and Thompson's July 1974 CACM article, "The UNIX Time-Sharing System," took the IT world by storm. Until then, Unix had been confined to a handful of users at Bell Labs. But now, with the Association for Computing Machinery behind it -- an editor called it "elegant" -- Unix was at a tipping point.

"The CACM article had a dramatic impact," IT historian Peter Salus wrote in his book The Daemon, the Gnu and the Penguin (Reed Media Services, 2008). "Soon, Ken was awash in requests for Unix."

Hackers' Heaven

Thompson and Ritchie were consummate "hackers" in the original sense of the word: people who combined creativity, brute-force intelligence and midnight oil to solve software problems that others barely knew existed.

Their approach, and the code they wrote, greatly appealed to programmers at universities, and later at start-up companies without the megabudgets of an IBM, a Hewlett-Packard or a Microsoft. Unix was all that other hackers, such as Bill Joy at the University of California, Berkeley, Rick Rashid at Carnegie Mellon University and David Korn later at Bell Labs, could wish for.

"Nearly from the start, the system was able to, and did, maintain itself," wrote Thompson and Ritchie in the CACM article. "Since all source programs were always available and easily modified online, we were willing to revise and rewrite the system and its software when new ideas were invented, discovered, or suggested by others."

Korn, an AT&T Fellow today, worked as a programmer at Bell Labs in the 1970s. "One of the hallmarks of Unix was that tools could be written, and better tools could replace them," he recalls. "It wasn't some monolith where you had to buy into everything; you could actually develop better versions." He went on to develop the influential Korn shell, essentially a programming language for directing Unix operations, which is now available as open-source software.

Author and technology historian Salus recalls his work with the programming language APL on an IBM System/360 mainframe as a professor at the University of Toronto in the 1970s. It was not going well. But on the day after Christmas in 1978, a friend at Columbia University gave him a demonstration of Unix running on a minicomputer. "I said, 'Oh my God,' and I was an absolute convert," says Salus.

Users: Unix Has a Healthy Future

If you're among those predicting the imminent demise of Unix, you might want to reconsider. Computerworld's 2009 Unix survey of IT executives and managers, conducted online in March and April, tells a different story: While demand appears to be down from our 2003 survey on Unix use, the operating system is clearly still going strong.

Of the 211 respondents, 130 (62%) reported using Unix in their organizations. Of the 130 respondents whose companies use Unix, 69% indicated that their organizations are "extremely reliant" or "very reliant" on Unix, with another 21% portraying their organizations as "somewhat reliant" on Unix.

Why are IT shops still so reliant on Unix? Applications and reliability/scalability (64% and 51%, respectively) were the main reasons cited by respondents. Other reasons included cost considerations, hardware vendors, ease of application integration/development, interoperability, uptime and security.

AIX was the most commonly reported flavor of Unix used by the survey base (42%), followed by Solaris/SPARC (39%), HP-UX (25%), Solaris/x86 (22%), "other Unix flavors/versions" (19%), Mac OS X Server (12%) and OpenSolaris (10%). Of the 19% who selected other Unix flavors, most said they used some kind of Linux.

Almost half of the respondents (47%) predicted that in five years, Unix will still be "an essential operating system with continued widespread deployment." Just 5% envisioned it fading away. Of those who said they were planning on migrating away from Unix, cost was the No. 1 reason, followed by server consolidation and a skills shortage.

Which of the following best describes your Unix strategy?

* Unix is an essential platform for us and will remain so indefinitely: 42%

* Unix's role in our enterprise will shrink, but it won't disappear: 18%

* We are increasing our use of Unix: 15%

* We expect to migrate away from Unix in the future: 12%

* None of the above: 8%

* We have already implemented a plan to migrate away from Unix: 5%

* Other: 2%

Which of the following best describes your vision of where Unix will be in five years?

* It will be an essential operating system with continued widespread deployment: 47%

* It will be important in some vertical market sectors, but it will not be considered an essential operating environment for most companies: 35%

* It will generally be seen as a legacy system warranting a non-Unix migration path: 11%

* Unix, as well as other operating systems, will fade in importance as we go to hosted (cloud, software-as-service, etc.) systems: 5%

* None of the above: 2%

* Other: 1%

Base: 130 IT managers who said their companies use Unix. Percentages do not add up to 100 because of rounding.

Source: Computerworld 2009 Unix Survey

Salus says the key advantage of Unix for him was its "pipe" feature, introduced in 1973, which made it easy to pass the output of one program to another. The pipeline concept, invented by Bell Labs' McIlroy, was subsequently copied by many operating systems, including all the Unix variants, Linux, DOS and Windows.
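
The mechanism is easy to see in code. The C program below is a rough sketch added for illustration (it is not from the article or from Bell Labs): it does roughly what a shell does for the command line ls | wc -l, creating a pipe and wiring one child process's standard output and another's standard input to its two ends.

    /* A minimal sketch of the Unix pipe mechanism: roughly what a shell
     * does for `ls | wc -l`. Error handling for fork() is omitted. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        int fd[2];                      /* fd[0] = read end, fd[1] = write end */

        if (pipe(fd) == -1) {
            perror("pipe");
            exit(EXIT_FAILURE);
        }

        if (fork() == 0) {              /* first child: its output feeds the pipe */
            dup2(fd[1], STDOUT_FILENO);
            close(fd[0]);
            close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            perror("execlp ls");
            _exit(EXIT_FAILURE);
        }

        if (fork() == 0) {              /* second child: its input comes from the pipe */
            dup2(fd[0], STDIN_FILENO);
            close(fd[0]);
            close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            perror("execlp wc");
            _exit(EXIT_FAILURE);
        }

        close(fd[0]);                   /* parent: close both ends, then wait */
        close(fd[1]);
        wait(NULL);
        wait(NULL);
        return 0;
    }

The same handful of calls -- pipe(), fork(), dup2() and exec -- still underpins pipelines on every Unix descendant and look-alike discussed in this story.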

Another advantage of Unix -- the second "wow," as Salus puts it -- was that it didn't have to be run on a million-dollar mainframe. It was written for the tiny and primitive DEC PDP-7 minicomputer because that's all Thompson and Ritchie could get their hands on in 1969. "The PDP-7 was almost incapable of anything," Salus recalls. "I was hooked."

Unix Offspring

A lot of others got hooked as well. University researchers adopted Unix in droves because it was relatively simple and easily modified, it was undemanding in its resource requirements, and the source code was essentially free. Start-ups like Sun Microsystems Inc. and a host of now-defunct companies that specialized in scientific computing, such as Multiflow Computer, made it their operating system of choice for the same reasons.

Unix grew up as a nonproprietary system because in 1956, AT&T had been enjoined by a federal consent decree from straying from its mission to provide telephone service. It was OK to develop software, and even to license it for a "reasonable" fee, but the company was barred from getting into the computer business.

Unix, which was developed with no encouragement from management, was first viewed at AT&T as something between a curiosity and a legal headache.

Then, in the late 1970s, AT&T realized it had something of commercial importance on its hands. Its lawyers began adopting a more favorable interpretation of the 1956 consent decree as they looked for ways to protect Unix as a trade secret. Beginning in 1979, with the release of Version 7, Unix licenses prohibited universities from using the Unix source code for study in their courses.

No problem, said computer science professor Andrew Tanenbaum, who had been using Unix v6 at Vrije Universiteit in Amsterdam. In 1987, he wrote a Unix clone for use in his classrooms, creating the open-source Minix operating system to run on the Intel 80286 microprocessor.

"Minix incorporated all the ideas of Unix, and it was a brilliant job," Salus says. "Only a major programmer, someone who deeply understood the internals of an operating system, could do that." Minix would become the starting point for Linus Torvalds' 1991 creation of Linux -- if not exactly a Unix clone, certainly a Unix look-alike.

Stepping back a decade or so, Bill Joy, who was a graduate student and programmer at UC Berkeley in the '70s, got his hands on a copy of Unix from Bell Labs, and he saw it as a good platform for his own work on a Pascal compiler and text editor.

Modifications and extensions that he and others at Berkeley made resulted in the second major branch of Unix, called Berkeley Software Distribution (BSD) Unix. In March 1978, Joy sent out copies of 1BSD priced at $50.

So by 1980, there were two major lines of Unix -- one from Berkeley and one from AT&T -- and the stage was set for what would become known as the Unix Wars. The good news was that software developers anywhere could get the Unix source code and tailor it to their needs and whims. The bad news was they did just that. Unix proliferated, and the variants diverged.

In 1982, Joy co-founded Sun Microsystems and offered a workstation, the Sun-1, running a version of BSD called SunOS. (Solaris would come about a decade later.) The following year, AT&T released the first version of Unix System V, an enormously influential operating system that would become the basis for IBM's AIX and Hewlett-Packard's HP-UX.

In the mid-'80s, users, including the federal government, complained that while Unix was in theory a single, portable operating system, in fact it was anything but. Vendors paid lip service to the complaint but worked night and day to lock in customers with custom Unix features and APIs.

In 1987, Unix System Laboratories, a part of Bell Labs at the time, began working with Sun on a system that would unify the two major Unix branches. The product of their collaboration, called Unix System V Release 4.0, became available two years later and combined features from System V Release 3, BSD, SunOS and Microsoft Corp.'s Xenix.

Other Unix vendors feared the AT&T/Sun alliance. The various parties formed competing "standards" bodies with names like X/Open; Unix International; Corporation for Open Systems; and the Open Software Foundation, which included IBM, HP, DEC and others allied against the AT&T/Sun partnership. The arguments, counterarguments and accomplishments of these groups would fill a book, but they all claimed to be taking the high road to a unified Unix while firing potshots at one another.

In an unpublished paper written in 1988 for the Defense Advanced Research Projects Agency, the noted minicomputer pioneer Gordon Bell said this of the just-formed Open Software Foundation: "OSF is a way for the Unix have-nots to get into the evolving market, while maintaining their high-margin code museums."

The Unix Wars failed to settle differences or set a true standard for the operating system. But in 1993, the Unix community received a wake-up call from Microsoft in the form of Windows NT, an enterprise-class, 32-bit multiprocessing operating system. The proprietary NT was aimed squarely at Unix and was intended to extend Microsoft's desktop hegemony to the data center and other places dominated by the likes of Sun servers.

Microsoft users applauded. Unix vendors panicked. The major Unix rivals united in an initiative called the Common Open Software Environment and the following year more or less laid down their arms by merging the AT&T/Sun-backed Unix International group with the Open Software Foundation. That coalition evolved into The Open Group, the certifier of Unix systems and owner of the Single Unix Specification, which is now the official definition of Unix.

As a practical matter, these developments may have "standardized" Unix about as much as possible, given the competitive habits of vendors. But they may have come too late to stem a flood tide called Linux, the open-source operating system inspired by Tanenbaum's Minix.

The Future of Unix

A recent poll by Gartner Inc. suggests that the continued lack of complete portability across competing versions of Unix, as well as the cost advantage of Linux and Windows on x86 commodity processors, will prompt IT organizations to migrate away from Unix.

"The results reaffirm continued enthusiasm for Linux as a host server platform, with Windows similarly growing and Unix set for a long, but gradual, decline," says the poll report, published in February.

"Unix has had a long and lively past, and while it's not going away, it will increasingly be under pressure," says Gartner analyst George Weiss. "Linux is the strategic 'Unix' of choice." Although Linux doesn't have the long legacy of development, tuning and stress-testing that Unix has seen, it is approaching and will soon equal Unix in performance, reliability and scalability, he says.

But a recent Computerworld survey suggests that any migration away from Unix won't happen quickly. In the survey of 211 IT managers, 69% of the 130 respondents who identified themselves as Unix users said their companies were "extremely reliant" or "very reliant" on Unix, and another 21% described their organizations as "somewhat reliant." Some 42% said that "Unix is an essential platform for us and will remain so indefinitely," and just 12% agreed with the statement "We expect to migrate away from Unix in the future." Cost savings, primarily via server consolidation, was cited as the No. 1 reason for migrating away.

Weiss says the migration to commodity x86 processors will accelerate because of the hardware cost advantages. "Horizontal, scalable architectures; clustering; cloud computing; virtualization on x86 -- when you combine all those trends, the operating system of choice is around Linux and Windows," he says.

"For example," Weiss continues, "in the recent Cisco Systems Inc. announcement for its Unified Computing architecture, you have this networking, storage, compute and memory linkage in a fabric, and you don't need Unix. You can run Linux or Windows on x86. So, Intel is winning the war on behalf of Linux over Unix."

The Open Group concedes little to Linux and calls Unix the system of choice for "the high end of features, scalability and performance for mission-critical applications." Linux, it says, tends to be the standard for smaller, less critical applications.

AT&T's Korn is among those still bullish on Unix. Korn says a strength of Unix over the years, starting in 1973 with the addition of pipes, is that it can easily be broken into pieces and distributed. That will carry Unix forward, he says: "The [pipelining] philosophy works well in cloud computing, where you build small, reusable pieces instead of one big monolithic application."

Regardless of the ultimate fate of Unix, the operating system born at Bell Labs 40 years ago has established a legacy that's likely to endure for decades more. It can claim parentage of a long list of popular software, including the Unix offerings of IBM, HP and Sun, Apple Inc.'s Mac OS X and Linux. It has also influenced systems with few direct roots in Unix, such as Microsoft's Windows NT and the IBM and Microsoft versions of DOS.

Unix enabled a number of start-ups to succeed by giving them a low-cost platform to build on. It was a core building block for the Internet and is at the heart of telecommunications systems today. It spawned a number of important architectural ideas, such as pipelining, and the Unix derivative Mach contributed enormously to scientific, distributed and multiprocessor computing.

The ACM may have said it best in its 1983 Turing Award citation in honor of Thompson and Ritchie's Unix work: "The genius of the Unix system is its framework, which enables programmers to stand on the work of others."

Anthes is a freelance writer in Arlington, Va.

Timeline: 40 Years of Unix

1969

AT&T-owned Bell Laboratories withdraws from development of Multics, a pioneering but overly complicated time-sharing system. Some important principles in Multics were to be carried over into Unix.

Ken Thompson at Bell Labs writes the first version of an as-yet-unnamed operating system in assembly language for a DEC PDP-7 minicomputer.

1970

Thompson's operating system is named Unics, for Uniplexed Information and Computing Service, and as a pun on "emasculated Multics." (The name would later be mysteriously changed to Unix.)

1971

Unix moves to the new DEC PDP-11 minicomputer.

The first edition of the Unix Programmer's Manual, written by Thompson and Dennis Ritchie, is published.

1972

Ritchie develops the C programming language.

1973

Unix matures. The "pipe" is added to Unix; this mechanism for sharing information between two programs will influence operating systems for decades. Unix is rewritten from assembler into C.

1974

"The UNIX Timesharing System," by Ritchie and Thompson, appears in the monthly journal of the Association for Computing Machinery. The article produces the first big demand for Unix.

1976

Bell Labs programmer Mike Lesk develops UUCP (Unix-to-Unix Copy Program) for the network transfer of files, e-mail and Usenet content.

1977

Unix is ported to non-DEC hardware, including the IBM 360.

1978

Bill Joy, a graduate student at UC Berkeley, sends out copies of the first Berkeley Software Distribution (1BSD), essentially Bell Labs' Unix v6 with some add-ons. BSD becomes a rival Unix branch to AT&T's Unix; its variants and eventual descendants include FreeBSD, NetBSD, OpenBSD, DEC Ultrix, SunOS, NeXTstep/OpenStep and Mac OS X.

1980

4BSD is released. With DARPA sponsorship, Berkeley begins the networking work that will make BSD the first widely used version of Unix to incorporate TCP/IP (delivered in 4.2BSD in 1983).

1982

Bill Joy co-founds Sun Microsystems to produce the Unix-based Sun workstation.

1983

AT&T releases the first version of the influential Unix System V, which would later become the basis for IBM's AIX and Hewlett-Packard's HP-UX.

1984

X/Open Co., a European consortium of computer makers, is formed to standardize Unix in the X/Open Portability Guide.

1985

AT&T publishes the System V Interface Definition, an attempt to set a standard for how Unix works.

1986

Rick Rashid and colleagues at Carnegie Mellon University create the first version of Mach, a replacement kernel for BSD Unix.

1987

AT&T Bell Labs and Sun Microsystems announce plans to co-develop a system to unify the two major Unix branches.

Andrew Tanenbaum writes Minix, an open-source Unix clone for use in computer science classrooms.

1988

The "Unix Wars" are under way. In response to the AT&T/Sun partnership, rival Unix vendors including DEC, HP and IBM form the Open Software Foundation (OSF) to develop open Unix standards. AT&T and its partners then form their own standards group, Unix International.

The IEEE publishes Posix (Portable Operating System Interface for Unix), a set of standards for Unix interfaces.

1989

Unix System Labs, an AT&T Bell Labs subsidiary, releases System V Release 4 (SVR4), its collaboration with Sun that unifies System V, BSD, SunOS and Xenix.

1990

The OSF releases its SVR4 competitor, OSF/1, which is based on Mach and BSD.

1991

Sun announces Solaris, an operating system based on SVR4.

Linus Torvalds writes Linux, an open-source OS kernel inspired by Minix.

1992

The Linux kernel is combined with GNU to create the free GNU/Linux operating system, which many refer to as simply "Linux."

1993

AT&T sells its subsidiary Unix System Laboratories and all Unix rights to Novell. Later that year, Novell transfers the Unix trademark to the X/Open group.

Microsoft introduces Windows NT, a powerful, 32-bit multiprocessor operating system. Fear of NT spurs true Unix-standardization efforts.

1996

X/Open merges with the OSF to form The Open Group.

1999

Thompson and Ritchie receive the National Medal of Technology from President Clinton.

2002

The Open Group announces Version 3 of the Single Unix Specification.