Originally Published: Wednesday, 20 October 1999 Author: Aaron Fransen

Why Linux Will Hurt Microsoft

This article originated in osOpinion and is provided under the OpenContent License.

I have a problem. Actually, I have a few, but I think you'd all agree they all stem from the same root cause: I'm too honest.

Okay, that's not exactly a revelation, I'm sure, but it's a problem for Microsoft, in a sense, because you see I'm an MCP, or Microsoft Certified Professional (in non-slang terms, anyway; feel free to make up your own definition. Many have).

I've been an MCP for about three years. I started learning about Windows NT because I thought NetWare was on its last legs, and I wanted our shop to be on the latest and greatest. I don't want to get into an emotional debate here, but Windows NT was a good thing. It forced Novell to wake up, put some real improvements into its product, and adopt a pricing model that real companies could live with. From an industry standpoint, it was good. From a technical standpoint, well, it left a bit to be desired.

To Microsoft's credit, they made NT pretty easy to use. Its user and device management tools were way ahead of NetWare's for ease of use (NetWare would later surpass NT, and has superior tools to this day), and that made it a cinch for a bunch of simpletons raised on Windows to administer servers for the first time.

There was only one problem with NT. It crashed. A lot. Microsoft would go on to improve it, adding features such as a more fully functional GUI, a web server that was actually useful, drivers for just about every device under the sun, and so on. But it kept on crashing.

Now, I came from IBM's System/36 (and later AS/400) world, where logging wasn't just a fact of life, it was life. If something went wrong, there was so much information written that tracing exactly what had happened was trivial. If something went wrong, that is. It was rare. IBM doesn't call the AS/400 bulletproof for nothing.

On those rare occasions when something did go wrong, it was easy to trace the sequence of events that caused the problem. Executives were used to getting absolute explanations as to what went wrong and what was being done to fix it. Not so with NT. Explanations frequently amounted to "uh, we think it might have been the antivirus program." We were never sure.

Microsoft continued to add features. And NT continued to crash. It didn't matter, because NT was so easy to use and Microsoft was the "underdog", so it got lots of support. Then Microsoft got greedy (some argue they were this way all along) and changed little things. Like the Office license that said, no, you can't use concurrent licensing anymore. Like the Microsoft Word diskette that would only install once; if you ever had to rebuild your PC and wanted to install Word again, you paid Microsoft another $30.

A few years ago Citrix brought out a product called WinFrame, which gave Windows NT 3.5x functionality similar to X servers in the Unix world. Microsoft later declined to renew Citrix's NT source code license and introduced its own NT Terminal Server, the NT 4.0 successor to WinFrame. This gave NT the same functionality Unix has had since the introduction of X. Thirty-year-old technology indeed. But the licensing got more interesting. For functionality equivalent to WinFrame, you had to buy (1) a Windows NT Client Access license, (2) a Terminal Server license, and (3) a Citrix ICA license. That's about $500 per user without having installed a single application. And let's not forget that Office doesn't support concurrent licensing, so add another $400 to $600 for every single desk with either a PC or thin client on it. Call it $900 to $1,100 per seat before anyone does any actual work.

And this is where Linux comes into the picture, because it's my belief that the thin client movement played a big part in getting Linux into server rooms. The whole point of thin clients was that they were a cinch to administer compared to PCs, but a lot of businesses balked at the cost of entry. Sure, they could save money in the long term, but they wanted to know how to save money now.

Linux (and its more robust BSD brethren) was a godsend. Free? Stable? Versatile? Too good to be true! The biggest complaint was that 14.4 modems took an awfully long time to download it. Okay, maybe that was the second biggest complaint. Let's face it: until very recently, installing and administering Linux was an exercise in frustration, especially for Windows weenies like me.

Then one day I had a pressing need to install a cheap and robust firewall. Enter Red Hat 5.2. I'd installed other versions of Linux before, and it had taken hours upon hours upon... but this version wasn't bad. Ultimately, after reading all the HOWTOs and such, it took me about a day to configure a firewall. (Incidentally, the firewall running on a slow Pentium piece-of-junk clone PC was faster than the quad Pentium Pro IBM 704 server with its warp-speed RAID disks, running NT and Proxy Server, of course.)
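
If you're curious what that day of HOWTO reading buys you, the heart of it is just a handful of rules. Here's a minimal sketch in the style of the IP-Masquerade HOWTO, using ipchains (the 2.2-kernel tool; on the 2.0 kernels Red Hat 5.2 shipped, the equivalent was ipfwadm, with different syntax). The 192.168.1.0/24 network below is a placeholder for your own internal LAN, not a recommendation:

    # Let the kernel forward packets between interfaces
    echo 1 > /proc/sys/net/ipv4/ip_forward

    # Deny forwarded traffic by default
    ipchains -P forward DENY

    # Masquerade (NAT) anything coming from the internal LAN
    # (192.168.1.0/24 is a placeholder for your own network)
    ipchains -A forward -s 192.168.1.0/24 -j MASQ

The real rule set runs pages longer, of course. That's where the day goes.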

Administering it was still a bitch, but it was livable.

Enter Caldera OpenLinux 2.3. It installed in less than 15 minutes. That's faster than Windows NT, Windows 95 (and 98), and don't even talk about Windows 2000. Administration? Install a program like Webmin and it's all done through a web browser. Top that, Microsoft. I'm sure some of the other distributions are just as good as Caldera's; it just happens to be my personal preference.
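
To give you an idea of the effort involved (the package filename below is illustrative, but port 10000 is Webmin's default):

    # Install the Webmin package
    rpm -i webmin-0.73-1.noarch.rpm

    # Then administer the box from any browser on the network:
    #   http://yourserver:10000/

Users, Samba shares, cron jobs, the works, all from a browser.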

And here we come back to thin clients. Recently there's been a lot of talk about the $200 PC. With OEM versions of Windows selling for about $50, we all know that a $200 PC won't be running it. This is why companies like Corel and Caldera are making deals to make Linux run on low-end, affordable computers.

We're at the point where CPU speed is starting to mean less and less. Oh sure, some people say that those high-speed CPUs are going to be required for things like voice recognition, et al., but let's get real. Analog data streams such as voice will never be a good fit for digital CPUs, whether Pentium, Merced, or Athlon. That's going to take new neural-based processors, at which point your $2000 Pentium III will be useless anyway.

But I digress. The truth is Microsoft can't afford to be in the coming desktop business model. Linux can. The user interfaces have a ways to go, but they'll get there. Bob Metcalfe (yes, the guy who invented Ethernet) said in InfoWorld a few months back that the Open Source movement won't last, but he's wrong for a simple reason. Programmers love to program. (Even back in my coding days, I used my holidays to write a helpdesk app for the office. Doh!) Even more, they love to show off their wares (or is that warez?).

Finally, let's not forget about the server model. Despite Microsoft's evidence to the contrary, my personal experience has been that Linux is noticeably faster on slower hardware, never mind equivalent hardware. Has anyone else noticed how much faster login scripts run from Samba than from NT? Or how much easier it is to manage users' home directories? Hell, I'd move everyone I know to Linux just for that!
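
To make that concrete, here's a minimal smb.conf sketch of the kind of setup I mean: a netlogon share to serve login scripts, plus the [homes] section that maps every user's home directory automatically. The workgroup name and paths are placeholders.

    [global]
       workgroup = MYCOMPANY
       security = user
       ; serve logon scripts to Windows clients
       domain logons = yes
       ; each user gets a script named after their account
       logon script = %U.bat

    [netlogon]
       path = /home/netlogon
       read only = yes

    [homes]
       ; Samba synthesizes a share per user, pointed at their Unix home directory
       browseable = no
       read only = no

One text file, and every user has a home share and a login script. No per-user clicking around in a GUI.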

So here's my problem, and Microsoft's: I'm honest enough to admit to anyone who asks that Linux is faster, easier, and crashes less.

A lot less.

Aaron Fransen is a consultant working in Calgary, Alberta who's had the (mis)fortune of working with just about everything that ever contained an integrated circuit. He can be reached for response to this column by e-mailing him at: aaron_fransen@hotmail.com.