In order to define UNIX, it helps to look at its history. In 1969, Ken Thompson, Dennis Ritchie and others started work on what was to become UNIX on a "little-used PDP-7 in a corner" at AT&T Bell Labs. For ten years, the development of UNIX proceeded at AT&T in numbered versions. V4 (1973) was rewritten in C -- a major milestone for the operating system's portability among different systems. V6 (1975) was the first to become available outside Bell Labs -- it became the basis of the first version of UNIX developed at the University of California, Berkeley.
Bell Labs continued work on UNIX into the 1980s, culminating in the release of System V (as in "five," not the letter) in 1983 and System V, Release 4 (abbreviated SVR4) in 1989. Meanwhile, programmers at the University of California hacked mightily on the source code AT&T had released, leading to many a master's thesis. The Berkeley Software Distribution (BSD) became a second major variant of "UNIX." It was widely deployed in both university and corporate computing environments starting with the release of BSD 4.2 in 1983. Some of its features were incorporated into SVR4.
As the 1990s opened, AT&T's source code licensing had created a flourishing market for hundreds of UNIX variants by different manufacturers. AT&T sold its UNIX business to Novell in 1993, and Novell sold it to the Santa Cruz Operation two years later. In the meantime, the UNIX trademark had been passed to the X/Open consortium, which eventually merged to form The Open Group.1
While the stewardship of UNIX was passing from entity to entity, several long-running development efforts started bearing fruit. Traditionally, in order to get a BSD system working, you needed a source code license from AT&T. But by the early 1990s, Berkeley hackers had done so much work on BSD that most of the original AT&T source code was long gone. A succession of programmers, starting with William and Lynne Jolitz, started work on the Net distribution of BSD, leading to the release of 386BSD version 0.1 on Bastille Day, 1992. This original "free source" BSD was spun out into three major distributions, each of which has a dedicated following: NetBSD, FreeBSD, and OpenBSD, all of which are based on BSD 4.4.2
BSD wasn't the first attempt at a "free" UNIX. In 1984, programmer Richard Stallman started work on a free UNIX clone known as GNU (GNU's Not UNIX). By the early 1990s, the GNU Project had achieved several programming milestones, including the release of the GNU C library and the Bourne Again SHell (bash). The whole system was basically finished, except for one critical element: a working kernel.
Enter Linus Torvalds, a student at the University of Helsinki in Finland. Linus looked at a small UNIX system called Minix and decided he could do better. In the fall of 1991, he released the source code for a freeware kernel called "Linux" -- a combination of his first name and Minix, pronounced lynn-nucks.3 By 1994, Linus and a far-flung team of kernel hackers were able to release version 1.0 of Linux. Linus and friends had a free kernel; Stallman and friends had the rest of a free UNIX clone system: people could then put the Linux kernel together with GNU to make a complete free system. This system is known as "Linux," though Stallman prefers the appellation "GNU/Linux system."4 There are several distinct GNU/Linux distributions: some are available with commercial support from companies like Red Hat, Caldera Systems, and S.U.S.E.; others, like Debian GNU/Linux, are more closely aligned with the original free software concept.
The spread of Linux, now up to kernel version 2.2, has been a startling phenomenon. Linux runs on several different chip architectures and has been adopted or supported to varying extents by several old-line UNIX vendors like Hewlett-Packard, Silicon Graphics, and Sun Microsystems, by PC vendors like Compaq and Dell, and by major software vendors like Oracle and IBM. Perhaps the most delicious irony has been the response of Microsoft, which acknowledges the competitive threat of ubiquitous free software but seems unwilling or unable to respond with open-source software of its own.5
Microsoft has, however, struck blows with Windows NT (Windows 2000). During the late 1990s, vendor after vendor abandoned the UNIX server platform in favor of Windows NT, or wavered in its support. Silicon Graphics Inc., for example, has decided that Intel hardware and NT are the graphics platform of the future.
The phenomenon of old-line UNIX vendors jumping ship and the concurrent rush to Linux by vendors large and small brings us back to the question at the top of this section: What is UNIX? While one can abide by the legal definition as embodied in the trademark, I believe that this does a major disservice to the industry. As the base software of the Internet, UNIX technology is one of the significant achievements of 20th century civilization. To restrict it to a narrow legal or technical definition -- as formulated by some of the vendors now abandoning it -- is to deny its ongoing relevance and importance, which is most evident in the amazing popularity and strength of UNIX-like clones such as GNU/Linux and BSD.
UNIX, then, is a set of enabling technologies first developed at AT&T that have been incorporated into several legally distinct but closely related operating systems, each of which can be considered to be a "UNIX system." If it looks like UNIX, operates like UNIX, runs common UNIX utilities and programs, and is developed with UNIX as a model, it's UNIX.
Copyright © 1999 by The McGraw-Hill Companies. Used with permission.
HTML Copyright © 1999 Albion.com.