Originally published: Tuesday, 12 September 2000
Author: Rob Bos

The Time Machine: Linux Scales

One of the many buzzwords surrounding Linux, as it rapidly comes to dominate the small server arena, is "scalability" -- that is, the question of how well it can handle many thousands of users as opposed to a few score, or a few hundred. People need to know whether Linux can handle systems that must deal with mail in huge volumes, do it reliably, and so on. This sense of scaling, while important, is not as important as a more fundamental aspect of computing -- the ability of an operating system to scale with the user, not just with the hardware it runs on or the uses it is put to.


Linux scales in the classic sense across a great many kinds of hardware: from tiny embedded devices dedicated to one purpose and one purpose only, to Beowulf clusters of thousands of nodes, to the relatively mundane Intel or PowerPC machine on your desk. It can't run mainframes all by itself -- yet -- and it can't fill every need in every imaginable office context -- yet -- but it certainly has scaling in that very specific sense going for it.

One of the primary unstated goals behind Linux is to produce an operating system that scales with equal efficiency from the home user to the professional -- to create an operating system that lets you, the user, and you, the developer, and you, the administrator, work on a common platform where you need learn only as much as your job requires, or as much as curiosity can carry you.

It will be possible for the clichéd grandmother to pop a CD in her drive, answer a few questions, wait a bit, and then stick with the result for the rest of her computing career, merrily sending and receiving email, browsing Web pages, maybe writing the occasional document. All of this would happen in a secure, stable, usable system that won't pop up inexplicable errors or deteriorate with time, the way most "easy" computers today tend to do. Yet on the same computer, running Linux, her system administrator grandson could do SQL development, or her husband could write Perl applications, or whatever -- all with the knowledge that no one's activities will touch anyone else's files, in complete security and with minimal reconfiguration.
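That per-user isolation rests on the Unix permission model. As a minimal sketch (my illustration, not part of the original article; the mode bits shown are simply the usual setting for a private file), here is how the owner/group/other split keeps one user's files out of another's reach:

    import os
    import stat
    import tempfile

    # Create a scratch file standing in for a user's private document.
    fd, path = tempfile.mkstemp()
    os.close(fd)

    # Grant the owner read and write; deny group and others everything
    # (mode 0600), as a file in a home directory typically would be set.
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)

    mode = os.stat(path).st_mode
    print("owner may write: ", bool(mode & stat.S_IWUSR))   # True
    print("others may write:", bool(mode & stat.S_IWOTH))   # False

    os.remove(path)

The kernel checks those bits on every file access, which is why the grandson's experiments simply cannot open his grandmother's documents unless she chooses to share them.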

One of the ideals behind Linux, in short, is to create a system so modular, so well configured, that in one form or another it can fill virtually any niche. A user can stick to the few tasks they need, in a rock-solid, secure environment, without having to know much more than how to log in and click a few buttons. Or a person can delve under the hood, learn just that little bit extra, and eke out the rewards thereof.

Other operating environments strive for different goals. The Windows family, for instance, by catering to the lowest common denominator of user, sacrifices stability, security, and long-term usability on the altar of short-term usability and profits. Other environments might strive for total security, sacrificing usability and convenience for completely secure data and operations. Still other systems might strive for utter dependability; that is, they must never, ever break, remaining totally reliable under nearly any conceivable circumstance -- and again, usability, security, and convenience are sacrificed. These are often laudable goals; the environments created are quite often very well suited to their tasks and do a reasonably good job at them.

Linux, and free software in general, can be said to strive, as a whole, for a rather different purpose. Security, stability, and idiot-proofing are important goals, but not as important as the simple goal of Getting Stuff Done. The people making Linux and its surrounding programs are relatively unmotivated by profit, only secondarily concerned with security (though certainly a healthy degree of paranoia is involved), and concerned with stability only insofar as it serves the real goal of Linux: usability, plain and simple. Getting stuff done in the most efficient way possible, with the least effort, is the priority here.

So it should come as no surprise to assert that, in the long run, Linux, with its anarchistic development paradigm, could easily create software of far greater usability and utility than anything else on the market. Anyone who has become intimately familiar with Linux's workings is constantly impressed by its utility, its versatility, and its power.

A person can learn as much or as little as they like about Linux: starting with the friendly GUI, getting acquainted with the shell, then delving deeper and deeper into the system -- there's so much to explore. They don't have to, though; it's well designed enough that, sensibly set up, it will run indefinitely. This is the future of personal computing -- let specialised OSes have their ultra-secure, ultra-stable, or ultra-idiotproof niches; Linux will take the rest through sheer usefulness.

Another nice thing about learning Linux, if I may digress, is that the concepts and skills learned are effectively guaranteed to be useful on any computing platform, indefinitely. UNIX has a long, solid history; it's been around for thirty years and will be around for thirty more. It's just too good an idea. It has survived every one of its naysayers and become something its creators never imagined it could be. This security in knowing that the skills you learn today will still be valid years from now is something few other operating systems offer. Can you imagine someone intimately familiar with DOS and all its quirks immediately being able to use and tweak NT? Yet a BSD user of the early '80s could sit down in front of a modern BSD or Linux distribution and immediately feel at home. This is part of what makes Linux nifty for me -- the long-term investment of learning. Effort is rewarded not only with efficiency, but with a virtual guarantee that those skills will not be obsoleted next year by an arbitrary API change or a new OS release.

I may be accused of spreading FUD here, but can you really imagine that your Windows NT or Windows 95 skills will be useful in thirty years? Learning them is, in the long run, a waste of time -- a point that has been driven home to many people by any of a hundred tiny incidents along the way to learning Unix. To paraphrase a friend: "all of a sudden, computers make sense!" Many operating environments obfuscate, abstract, and delude to the point of unusability. Linux tries to avoid that as much as possible, accepting a short-term learning curve in exchange for the ability to use powerful software the way it was meant to be used, without anything holding your hand and telling you how to do things. Linux is the "Real World" of computing, abstracting where abstraction is helpful and simplifying only where power can be preserved. In the long term, this strategy will pay off -- we'll end up with a powerful core and decent interfaces, with a minimum of compromise and a maximum of efficiency.

World domination, fast. Join us now and share the software. These sound like trite, self-referential phrases, but sometimes they seem to hold a whole lot of truth.