The LINUX.COM Article Archive
Originally Published: Wednesday, 2 August 2000
Author: Phil Hughes
A UNIX Revival!
The idea of an operating system grew out of a desire to make more efficient use of computer hardware. Rather than waiting for a human to load some cards and push some buttons to start the next job, the OS could queue up that next job for you. Early forms of the OS would just read a job tape that was created on another, much less expensive, computer.
Next came systems that directly connected two computers (an IBM 7090 and IBM 7094 come to mind). One computer handled spooling the input and output and passed jobs to the other as it became available; it just acted as a traffic cop. More automated, yes, but still no real resource allocation--the main computer ran one job at a time.
As the OS became more sophisticated, a single computer could multiplex its own work, queueing input jobs, printing output and running one batch job at a time. In turn, as hardware became faster, it became possible to perform interactive processing on one machine, and thus real pre-emptive multi-tasking came of age. A good example of this was Univac's 494 message-switching system and their 1100 series of general-purpose mainframes. IBM, CDC and the other "big iron" makers developed and marketed similar solutions.
What does all this have to do with UNIX? Well, each of these vendors had to write an operating system for their hardware. An operating system was generally written in assembly language and was unique to a specific piece of hardware. See the problem? An OS not only couldn't run on hardware made by different vendors but, in many cases, it couldn't even run on different hardware from the same vendor!
Enter UNIX, which changed all this. While originally written in PDP-7 assembly language, it quickly became clear to its implementers that a more portable approach was in order. Enter the C language: UNIX was rewritten in C. This didn't make it totally portable, as there were still some machine-specific functions that had to be implemented (like how to deal with interrupts), but much of the code was now portable -- as computer hardware changed, it was possible to move UNIX along to the new hardware.
And the process was fairly simple: write the hardware-specific routines in assembler, write a new code generator for the C compiler targeting the new hardware, cross-compile UNIX, debug it and then recompile it natively on the new machine. Not trivial, but much easier than rewriting an entire OS for the new hardware.
At this point UNIX should have been ready to take over the OS market space. I say should, because it didn't. I have been working with UNIX since 1980. I always expected it to take over, but it didn't. Let's look at why.
The first issue was interest. Vendors such as IBM and Unisys had a huge investment in their own OS. They didn't want to abandon what they were doing in favor of UNIX. And besides the OS vendors, applications software developers had huge investments in products that only ran on IBM's OS, Univac's OS or whatever. They weren't interested in porting either.
Second was cost. First AT&T, then Novell and now SCO have charged substantial royalties to use their OS, and if you want to make modifications you need a source license--another big chunk of change. The big iron vendors saw their own OS as a sunk cost and couldn't justify paying for another one.
Finally, there was the issue of ownership. I feel there was a strong "not invented here" attitude going on in the OS groups of the hardware vendors. Whether UNIX was good or bad, it just didn't belong to the vendor.
But today things are changing. UNIX, in the form of Linux and other Open Source OSs, has made a serious comeback. So, what changed? Everything. Big iron often isn't the answer these days: computer architecture is changing rapidly, and porting an older OS from a mainframe to today's micro just doesn't make sense. So, if IBM and Unisys want to stay in the market, it is in their interest to support computers that don't run their proprietary OS. Further, they need to be able to establish product lines on newer, faster hardware--on a schedule that doesn't give them time to port their own OS.
And the economics changed. Linux and the *BSD family are all free, including source code. This change in price point also makes it really difficult to keep justifying the "not invented here" philosophy. After all, if it's free and you can read all the code, who cares who originally wrote it? Ownership in the traditional capitalist sense becomes passé.
If we look at the players who have jumped on the Open Source bandwagon (as opposed to those who built the Open Source bandwagon) we see, among others, Apple, Corel and IBM. Why them?
The new MacOS has BSD UNIX under the hood. Why? Their hardware continues to change and the UNIX code-base is very stable and easy to port. Good choice.
Corel was marginalized by Microsoft after Corel purchased WordPerfect from Novell. Microsoft saw WordPerfect as competition to Word and stopped giving Corel the inside information on Windows that was necessary to write killer MS-Windows applications.
IBM is the longest story. Microsoft developed PC-DOS for IBM's new PC but, with the advent of PC clones, turned it into a product of its own. Later, Microsoft began developing OS/2 with IBM, then walked away to build its own Windows. In other words, IBM got the short end of the stick at least twice.
So, one good answer to "why them?" is that they were all marginalized by Microsoft. This isn't an issue of intent. It is simply that they had a problem with the big OS player of today, went looking for an alternative and ended up in the Open Source camp.
And this trend continues. More and more vendors are jumping ship from their own proprietary OS and moving primarily into the Linux camp. HP is a good example. SGI is another. Each company will continue to support their proprietary OS for a few years but they also know that investing in Linux today will decrease their long-term costs for OS support.
One other area where UNIX--in the form of Linux--is making a serious comeback is what I would call "almost real-time". While the standard Linux scheduler isn't technically real-time, fast processors and reasonable estimates of the maximum scheduling latency make it possible to handle many tasks that would have required a real-time OS a few years ago.
There are examples of vendors using Linux in this almost real-time environment in the September 2000 issue of Linux Journal, which focuses on the embedded uses of Linux. In that magazine there are articles on the Axis video camera and the Aplio Internet telephone. Both of these systems run Linux on a proprietary processor chip to control the operation of the device. In each case it was much easier for the vendor to port Linux to the chip they wanted to use than to purchase a real-time OS and then write drivers for all the other hardware. And, on top of that, Linux is free.
More and more systems are using a POSIX-compliant Open Source OS such as Linux. Even in a company that has a policy against Open Source software (yes, there are such things), it is likely you will find a firewall, router or some other dedicated box running Linux or *BSD. Combine that with the web servers that might have a label like IBM on the outside and you can see why I say UNIX is making a comeback.