1998: If you cut your teeth on the Mac or even a Windows machine, count yourself fortunate. A graphical operating system lets you play around and figure out how things work. It’s user-friendly, which is why the Macintosh caught on and influenced the shape of the dominant PC operating systems.
The same concepts are playing a larger role in the Unix world, with the X Window System, NeXTstep, and BeOS offering graphical interfaces for the underlying operating system.
Today the Macintosh OS is tightly integrated from the kernel up through the interface. Next year, Mac OS X will pair that familiar interface with a kernel designed around very different principles.
In the Beginning
The first computers didn’t use keyboards, punch cards, or any kind of tape; they were hard-wired. Then came programmable computers and ways to save programs and run them again. Eventually we got keyboards, video displays, and disk drives.
Early computer operating systems would greet you with a blank screen. Well, nearly blank. There might be a cursor awaiting your input – a blinking cursor on some systems. But you couldn’t just sit down and use these computers. You had to know their language. You had to work to get them to work for you.
Unix was born at AT&T in 1969. In an age when most computers ran proprietary operating systems, Unix was designed for portability. Because it was written in the C programming language, putting Unix on another computer was largely a matter of writing a C compiler for that machine and recompiling Unix.
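A trivial sketch of that idea (not from the original column): the same C source builds unchanged anywhere a C compiler exists, which is exactly the property that made porting Unix a matter of porting the compiler.

```c
/* portable.c -- a minimal illustration of C's portability.
   The same source compiles unchanged on any system with a C compiler
   (for example: cc portable.c -o portable), which is the property
   that let Unix move to new hardware once a compiler existed for it. */
#include <stdio.h>

int main(void)
{
    printf("Hello from whatever machine compiled me.\n");
    return 0;
}
```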
In that era, computers did many jobs for many users and cost many, many dollars. They also generated many dollars by leasing computer time, so a top concern was a robust operating system – you wanted that machine earning its keep every minute that it was running.
Unix became a leading operating system because it was designed from the ground up to track users, track system resources, track time used, and keep the computer up and running. Today you hear stories of Unix computers that never crash, only going down for preventive maintenance, system updates, or natural disasters.
Enter Personal Computers
The first personal computers didn’t have enough resources to run Unix. With only a few kilobytes (KB) of memory, there was no reason to even consider multi-user support. Early PC operating systems such as TRS-DOS, Apple DOS, and CP/M assumed one CPU, one user, and one program running at a time. If the system crashed, only one person lost data. The key was making as much functionality as possible fit into a limited amount of memory. Stability was nice but not always foremost.
But they all greeted you with the same kind of blank screen waiting for input that the earlier mainframes and minicomputers had.
That didn’t change with the introduction of the IBM PC. Sure, it could handle an unimaginable 640 KB of memory, but MP/M-86 (an early multi-user OS for the PC) was never a hit. Users had become used to one CPU, one user, and one program running at a time.
Over time the paradigm shifted, thanks to utilities that let DOS machines keep two or more programs in memory and switch on the fly. Then came operating systems for personal computers (including Windows, OS/2, and the Mac OS) that let you run more than one program at a time, even allowing the background programs to keep working (although usually with reduced performance).
One CPU, one user, many programs has given way, over the past few years, to one or more CPUs, one user, many programs. Both the Mac OS and Windows support two or more CPUs, which can be very helpful for Photoshop filters and a handful of other intensive tasks.
Full Circle
Windows and the Mac OS grew out of the single-user, single-task, single-CPU paradigm to allow multiple tasks and multiple users. But they lost stability along the way.
It’s tiresome restarting a computer and waiting while it reloads. And it’s frustrating losing your work. Stability has become a top concern among PC users, whether we use Macintosh, Windows 98, or Windows NT.
From that perspective, Unix looks like the holy grail. Stable. Multitasking. Support for multiple CPUs. Even if we don’t need the multi-user capabilities, it looks like a much better platform to build upon than a hacked-at OS with roots going back to DOS 1.0 (1981, itself rooted in CP/M from the 1970s) or Macintosh System 1 (1984).
Really Full Circle
I’ve done a little experimenting with MacBSD, an implementation of Unix that runs on older Macs. I find it as opaque as MS-DOS was when I first set out to learn it. There’s a blank screen with a cursor.
What do you do?
For all its power, Unix is an operating system for gurus and wizards. For the rest of us, it will be enough to run a graphical shell that isolates us from Unix, just as Windows 98 isolates users from DOS.
From what I’ve seen of BeOS, it’s not difficult to hide the underlying OS behind an easy-to-use shell. For most of us, Mac OS X will look and act a lot like Mac OS 8.x – but it will keep running even if our applications crash.
I’m glad Steve Jobs is willing to lead Apple forward by finding roots even deeper in the past than the Apple II. Unix will give Mac OS X the stable, extensible foundation it needs to enter the 21st century.
And Jobs should know. He went down this road with NeXT, perhaps the first consumer computer designed to run Unix with a graphical shell.
Further Reading
- Operating Systems: Past, Present, and Future, Mac Musings
keywords: #unix #macosx #macbsd