Originally Published: Thursday, 13 January 2000 | Author: Gary Lawrence Murphy
Prime Time Linux
What are our prospects for Linux in the mainstream market? Are we closing in on the days of Linux in "prime time?" I happen to think it is inevitable, but before you rush out to buy more shares, I'll add a caveat that, for very real reasons that may even be natural laws, it may not happen as soon as we might hope....
In this article, I want to take a highlighter to the facts as they stand, to the reasons why we cannot have our cake in Y2K, or even in 2001. Then, in the tradition of New Year's pundit reports, I'm going to stick my neck out with a prediction; although my prediction agrees with that of another, more famous pundit, I've arrived at mine by a different route, and there may be some intrinsic value in that. I'll let you decide.
Before anyone gets up in arms, let me clarify the scope of this article: this advice is not for everyone. It is irrelevant to those working on free software projects for their own internal use. I only want to speak to those who are, like myself, in the business of serving a computing community and interested in having that community on Linux and GNU systems.
The Rocky Road to World Domination
Right now, the pundits say we are dangerously close to saturating the North American market. Fully one third of American adults say they have no need of computers and no plans to buy one, and we are rapidly outfitting the remainder. They say it's coming to a crunch, and that for the first time in our careers, providers of computing services must deal with competition and the need to steal away customer loyalties to ensure our own survival.
Thus spake the other pundits. I don't buy it. I have been hearing comments like those from that missing third for 25 years, and every last one of them is now online. What it took was a change not in their own needs, but in the way technology serves those needs. What the pundits are measuring is not a saturation of potential computer users, but a saturation of the population who are tolerant of badly designed systems. As the Nike commercial with William S. Burroughs puts it, "Technology should serve the body, not enslave the mind." That market, the humanizing of computers, is the new frontier.
"Speak so I may see you!"
I owe Skip Heine for the following quote; however worded, it is true and it's endemic to our industry:
"Curious acronyms and hi-tech verbal smog"
In 1989, when my oldest boy was 6, he received a game from a friend on a floppy. I was reading the Sunday paper in another room. I'd set the computer with a BAT file that had a menu of all his programs; I told him "press Ctrl and C and that will put you at the DOS prompt, and you will know what to do from there."
Two minutes later came the call: "I'm stuck!"
"What happened?" I called back.
"I'm stuck" he said.
"What does it say?" I pressed, not wanting to budge from my chair.
"I don't know" came back.
"What do you mean you don't know? You can read can't you?"
"I can't read this!"
So I got up, went into the office and there, bold as life, sat the console prompt:
TERMINATE BATCH JOB? (Y/N)
I had to explain 30 years of the history of computers to help him see the sense of that statement.
The Babbling Appliance
No other consumer product has this problem. Auto manufacturers have no great need to label common auto parts with "curious acronyms," such as DCD (Directional Control Device) instead of a "steering wheel," just to make their product look superior. ICPP for Internal Combustion Power Plant instead of just an engine? (Thanks again to Skip.)
I put it down to Star Trek. People expect computers to have a science fiction edge to them, to put a divider between them what's in the know from them what's not. So in part, this is what the consumer expects from us, but let's be reasonable.
In my younger years, I would rail against people making useless or misleading metaphors about computers, following the advice of my mentor Professor Dijkstra: "If anyone says they are 'feeding' their computer data, I will fail them." (I never met Dijkstra, but his writings had a profound influence on my computer thinking.) Computers are simple; they are a glut of switches, nothing more. If we wrap them in metaphors we just introduce confusion. They are not 'Windows'; at best they are 'panels'; even the traditional mainframe language of 'screens' was more accurate. And it is certainly not a 'desktop.'
Of course, our industry doesn't want anyone to know that anyone with half a brain can program a computer. If the emperor is caught naked, the bottom falls out of the guru market. Thus I have now spent 35 hours of my client's time to make sense out of the techno-maze around directory servers (LDAP) and spent two months of my own time last summer making sense out of the emerging DocBook tech documentation standard (DSSSLs vs XSLs, five different syntax models just to print a chapter). If this happens to a seasoned professional, I can't imagine the burden of the learning curve we foist on the non-indoctrinated public.
It started honestly: When Unix was invented, they wanted to avoid the confusion of metaphors. Bell did their usual psycho-research and found people better at learning new nonsense names than at attaching (and distinguishing) new meanings for old words. They decided that all Unix commands would have new names only superficially like their metaphor counterparts: 'cp' instead of 'copy' (because it doesn't copy; it re-types a new page by reading the old one, and that can lose information if the reader skips things like control characters), and 'awk' (named for its authors Aho, Weinberger and Kernighan) instead of 'pattern matcher' because it does so much more.
Since computer science people are the most traditional people I know, the convention lives on. Projects I've been on have laboured long over the right string of words to make a nifty, pronounceable acronym, and then came Richard Stallman with his tradition of the recursive acronym (GNU = GNU's Not Unix), a habit others carried on (PINE = Pine Is Not Elm, a much-improved ELM, itself short for ELectronic Mail).
Machines Making Sense of Machines
The crypto-speak of which we are all guilty is fairly easy to fix. I see a need for several catalogs and directories of business problems matched to the software programs or hardware components that solve them. There are a few on the Internet, but, especially in open source circles, no one has the skill or the budget to become the standard site that commands the attention of the public. Back in 1994 it was easy to get your Web directory to the forefront of the search lists, but today, if all you want is a list of printers that work with Linux, you have a major investigation on your hands.
There is opportunity all over the place to implement what Bill Buxton (Alias Research) would call the "Self-revealing interface," only it goes beyond the screen manager, and it could be a guiding design principle from large Web directories all the way down to the little panels that float over a button when the pointer arrow crosses it. It's a simple requirement. We just keep asking, "Does this make sense?"
Of course, any O/S can do this, and all of them should, but with the Linux world, we have the advantage of a clean slate.
In the Old Fashioned Way
There is another aspect to the humanizing of computers which comes from psychology, or more accurately, from the "neurolinguistic programming" work of Richard Bandler and John Grinder. In humans, change does not typically occur as a catastrophic shift, but incrementally. People get into the most trouble when they resist incremental changes until their environment forces a catastrophic situation; that is how 99% end up in therapy. For the software engineer, this means quite simply that we need to use what people already know.
Like it or not, we owe the recent stellar success of Linux through the KDE interface to the widespread use and visibility of Windows(tm). Whether or not it is technically better, or even sound in a human-factors design sense, its familiarity is our greatest hook. You cannot attract anyone without first attracting their attention, and you cannot change them unless you can attract them.
Riding the Inevitable
Years ago, at a party in an advertising company office, I'd had a few and started explaining to some of their employees my theory of advertising: that it is a Skinner Box, but it is not the public being conditioned to press the bar for a food pellet, it was the advertisers themselves. One fellow, a copy writer, told me this was the most depressing news he had heard in his entire career and headed back to the bar.
Several days later, I was in their office again and ran into the copy writer. He was really excited. "I can disprove your theory!" He said. "Advertising can make a bad product fail even faster!"
With GNU on Linux systems, we have an opportunity, but it is just that. Without being truly ready, we may well do ourselves damage by pushing too hard too soon (just like childbirth, for those who know it). Until we not only have the goods, the word processor, the browser, the database and the spreadsheet, but also have these in a form that captures the experience and expectations people already possess, a big media splash is a dangerous thing.
Industrial Time Lag Theory
Should we persist in inventing and perfecting mass-appeal Linux systems even if we cannot 'succeed' in the market until all of us are ready? The answer, of course, is 'yes,' but with a mind to the principle of Industrial Time Lags. Fortunately, these lags are predictable, and for those planning to ramp up for a Linux service industry infrastructure, this news might save your life.
R. Buckminster Fuller, famous inventor of the geodesic dome, spent a lot of time charting the time lag between invention and mass acceptance, and he discovered there is apparently a fixed lag specific to each industry. In electronics, it is 18 months. In the building trades, it is 30 years.
Given that, he concluded an inventor should go ahead and build their better mousetrap even if the initial response is dead silence. If the invention is truly useful, mass adoption will come, but there is no way to hasten it; it comes in its own time. The important thing is that the process cannot start until the inventor announces the new invention.
My own informal observation is that the software industry is nowhere near as fast as the electronics industry. It took 4-5 years before CP/M was commonplace, 4-5 years for DOS (just in time for the Mac), 4-5 years for FSF GNU-C and Emacs, 5 years for Windows 3.1, and it has taken just under 5 years for Windows95 (it was still fringe in 1998).
Now, Linux was invented as a research tool, and by 1996, as Bob Young's survey found, it commanded a good share of that market. Bob introduced the Red Hat distribution of Linux for the server market in 1996, and he's right on schedule with his IPO ;). Mandrake launched its award-winning product in 1998, which means any infrastructure to capture the average-user market needs to be in place by 2003, and Corel's bid to capture the office desktop won't be viable until 2004.
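The arithmetic behind these dates is simple enough to sketch. Here is a toy Python illustration of the roughly five-year software adoption lag described above (the function name and the fixed five-year figure are my own assumptions for illustration, not anything from the original survey data):

```python
# Toy model of the ~5-year adoption lag observed in the software
# industry (cf. CP/M, DOS, Windows 3.1, Windows 95).
SOFTWARE_LAG_YEARS = 5  # assumed fixed lag for the software industry

def mass_adoption_year(introduced: int, lag: int = SOFTWARE_LAG_YEARS) -> int:
    """Predict the year a product reaches the mass market."""
    return introduced + lag

# Red Hat's server distribution, introduced 1996
print(mass_adoption_year(1996))  # 2001
# Mandrake's desktop product, introduced 1998
print(mass_adoption_year(1998))  # 2003
```

By the same reckoning, a desktop office bid launched in 1999 would not be viable until 2004, which is the schedule the dates above follow.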
If not now, when?
Creating Linux services for the mass market is like shooting at a moving target: you need to aim ahead of the flying bird to hit it.
The upshot of all this is that we cannot expect to see Linux in the mainstream today, or even next year, at least not in the same way that we see other platforms. This is not a prophecy of doom, it is not FUD; it is just the way it is in our industry. Those who push against the five-year lag are banging into a brick wall, which is fine so long as you know that this is what you are doing. Walls do fall, and they won't if no one bangs into them; it is only frustrating and depressing if you think you should be going faster.
Just as the automobile did not see mass acceptance until we had gas stations and service centers everywhere, just as the PC did not enjoy mass appeal until you could get support in every town and village, and commercial web sites did not occur until every graphic design shop morphed into self-styled webmasters, so it will be with Linux. There are, as we all know, initiatives afoot to do this, from corporate 'cathedral' consultancies such as Linuxcare to 'bazaar' affiliations such as Bynari International, but let this be a call to arms for more, of all shapes and sizes, and a heads-up that it is in all our own best interests to foster and nurture these businesses to ensure our own hinterland when the inevitable happens.
To carry my hunting metaphor forth, knowing the speed of our bird makes it a lot easier to design and launch our bow and arrow, and now we have both a checklist and a target. When this inevitable mass-adoption of Linux happens in 2003-2004, to hit our goal we will at the very minimum need to develop:
Fortunately, based on historical service lags in previous computing technologies, it will take about 2-3 years to get all this in place. So if we start today...
So... what needs doing now?
Gary Lawrence Murphy is CEO of TCI, the Canadian coordinator of Bynari International, and an author, tech editor and open source consultant to Macmillan Computer Publishing. He currently lives in the wilderness forests of Sauble Beach designing and developing next-generation Internet applications for Bell Canada and the CBC.