Originally Published: Saturday, 2 October 1999 Author: John Zedlewski
Published to: Featured Articles

Free Software and the Innovator's Dilemma

This article originated in osOpinion and is provided under the OpenContent License.


If you wanted to assemble a "must read" list for any businessperson looking at the Linux/Free Software industry, what would you include? Certainly "Open Sources" from O'Reilly is the most obvious answer, probably followed by Bob Young's upcoming "Under the Radar," which details the story of Red Hat's rise. But I would argue that a third book belongs in the top tier of that list as well: "The Innovator's Dilemma," written by Clayton Christensen and published by Harvard Business School Press.

"The Innovator's Dilemma" traces the histories of various industries, from disk drives and microprocessors to steamships and automobiles, in which established market leaders have been beaten out by smaller, more nimble competitors. This idea, that startup firms have noticeable advantages over their larger rivals, has been one of the cornerstones of the Internet era, not to mention Microsoft's antitrust defense, but Christensen is one of the few authors to actually address the specifics of these showdowns. In almost every case, he claims, the established industry leaders in his examples were well managed, by conventional standards. They listened to their customer base and constantly sought to increase their penetration in high-margin, high-end markets. These seemingly innocuous strategies become disastrous, however, when a "disruptive technology" enters the low end of the market. The established firms shy away from these new technologies to avoid undercutting their core, profitable businesses, but this ultimately leaves the market open for a new player to implement the disruptive technology, then slowly march up the food chain, overthrowing the old market leader. Minicomputer manufacturers in the 1980s, for instance, diligently followed their customers' demands to invest only in faster minis, while ignoring the PC market, which held little interest for the companies' established base of scientific and business customers. How many of those companies are still alive today?

This isn't, of course, a book review. Instead, I guess you could call it my attempt at a wake-up call to those software companies (you know who you are) who still think they can make a living on "business as usual" in the next millennium. Specifically, I want to focus on Linux, which might be the best example of a truly disruptive technology that we've seen since the advent of the internet. In fact, this theory gives us a guide to understand how established software firms risk missing the boat with respect to Linux, just as brick-and-mortar retailers were overtaken on the net by smaller, more daring startups.

Consider three statements that a software vendor looking at Linux in 1999 might make:

  • "That sounds like an interesting idea, but we asked our customers and they don't seem interested."

  • "That sounds like an interesting idea, but the profit margin sounds too slim."

  • "That sounds like an interesting idea, but it would eat into our more profitable core business."

Now take those same three statements and imagine them coming from an executive at an established retail vendor considering e-commerce in 1995. It's not much of a stretch, is it? In this case, though, we're hampered by our hindsight. We fail to appreciate that the executives who turned down a chance to take, say, Barnes & Noble to the Web in 1995 were in fact making a very reasonable decision based on the traditional financial bottom line. They would have spent millions to set up shop; they would have made it easier for customers to compare prices (and see that, in fact, the Barnes & Noble of 1995 was a fairly expensive bookstore); the customers who did buy online would have needed at least some discount to offset the cost of shipping; and the site, like Amazon, would have been a spectacular way to drive the parent company into the red. By not embracing the net, however, they opened the door to B&N's greatest threat in decades, and they ultimately had to spend even more money to play catch-up against a $20 billion Internet rival that wouldn't even have been created if the existing booksellers hadn't dropped the ball.

How have today's successful technology firms, then, fallen into the "innovator's dilemma" with respect to Linux? By focusing on the short run and the high end. When companies like Sun, SCO, and Microsoft dismiss the OS, they talk about its lack of scalability, its relative newness, and its lack of a journalling file system. These features, however, are irrelevant at the workstation and workgroup server level, and, more importantly, they're all being developed at an unbelievable pace to help the OS scale to enterprise levels. As commodity hardware becomes more and more powerful, while Linux and Windows NT continue to scale up to new heights, the traditional Unix vendors will find themselves increasingly marginalized to the very highest end of the computing spectrum, falling into what I call the "Silicon Graphics trap."

Silicon Graphics (now SGI) was notorious for its focus on high-end, sexy technologies: Cray supercomputers, 128-processor Origin servers, $10,000+ workstations, etc. While each unit sale at this level seems highly profitable, in reality the R&D costs involved in pushing the envelope at the microprocessor, OS, server, and applications levels were simply impossible to sustain, and the company spiraled deep into unprofitability as companies like Intergraph ate away at its core graphics business with low-cost graphics workstations. Now SGI has become the first traditional Unix company to truly embrace Linux, making it their platform of choice for the IA-64 architecture. Perhaps they were lucky to fall into the trap while they still had time to ride the first wave of the Linux movement.

But that leaves a question for the rest of the OS, hardware, and computer services worlds: are you willing to concede the entire low-to-mid-range server market to smaller, faster companies that position themselves as Linux early adopters? Do you want your future customers to think of you as a Linux pioneer that can accurately evaluate and deploy the OS, or as a second wave Johnny-come-lately that doesn't really understand the phenomenon at all?

Clayton Christensen's prognosis for these established players is unequivocal: unless they form new business units with the autonomy to embrace such disruptive technologies, "the probability that they will survive as industry leaders is zero" (Bloomberg Personal Finance, October 1999). In my opinion, some of these market leaders need even more fundamental changes in their structures to address this disruption; they need thorough, forward-looking reorganizations, such as SGI is undergoing with Linux and open source, as Microsoft undertook when it restructured its product lines to face the internet, and as I advocated that SCO should undergo in my last article.

John Zedlewski is a student of economics and computer science at Princeton. He can be reached for response to this column at: zedlwski@princeton.edu.




