
Japan’s largest national research organization has ordered an IBM eServer Linux supercomputer that will deliver more than 11 trillion calculations per second, making it the world’s most powerful Linux-based supercomputer. It will join other Linux supercomputers on the Top 500 List. The new supercomputer is planned to be integrated with other non-Linux systems to form a massive, distributed computing grid – enabling collaboration among corporations, academia, and government to support research in areas including grid technologies, life sciences, bioinformatics, and nanotechnology. The system, with a total of 2,636 processors, will include 1,058 IBM eServer 325 systems. The powerful new supercomputer will help Japan’s National Institute of Advanced Industrial Science and Technology (AIST), well known worldwide for its leading research in grid computing technologies. This is yet one more example of the rise of the Penguin.

Information technology is still much too hard – hard to create and hard to use. One factor that may play a big role in making things easier is Linux. It is hard to miss mention of Linux in the media – it has gotten a lot of attention, and for good reason. It has the potential to radically change how information technology gets created and used. A student in Finland named Linus Torvalds created Linux in August 1991. His goal was to build a Unix-like operating system that would work on a PC. Almost all PCs at the time used an operating system called DOS; only larger, more sophisticated computers used Unix. Unix was appealing to many students because of its sophistication – in particular its networking capability. What started as a hobby for Linus Torvalds turned out to appeal to many more than just the students, as major information technology companies including IBM, Hewlett-Packard, and Compaq have put their full support behind the software with the Penguin mascot.

I’ve seen three major shifts during my three and a half decades in the information technology industry. In the early 1980s it was the introduction of the PC. In the early 1990s it was the emergence of the Internet as a serious communications network. In the late 1990s it was Linux. All three existed before those particular timeframes, but those are when, from my perspective, the big shift started. Each had some things in common with the others. In all three cases smart people left their jobs at companies and universities to get involved with these new technologies – and venture capital followed them. In all three areas there was a lot of grassroots activity and the formation of a genuine community. They were not top-down initiatives of major companies or organizations. (Even the PC effort at IBM was led by an independent business unit that was a sort of skunk works.) All three areas were very standards oriented. They were either built on standards or actually created new standards. And one last thing that all three shifts had in common – some people in the information technology industry said, “Who needs it?” There is a lesson to be learned in this reaction.

In 1980, Digital Equipment Corporation had a number of “personal computer” projects underway (some may remember the Rainbow) but there was no real commitment. Their first PC-like product was called an “applications terminal and small system”. In effect the company said, “PC? Who needs it?” Although IBM introduced the first standards-based PC, there were those in the company who, when it came to serious computing needs, in effect said, “PC? Who needs it?”

By the mid-nineties it was clear that the Internet was going to take over the world as the networking standard. I was at an Internet Society meeting in Prague in June 1994, and a gentleman from Chrysler Corporation gave a presentation on how his company was going to standardize on TCP/IP for all networking. I am sure there were some at Chrysler who thought this was radical, and even some attendees of the Internet Society meeting thought so. At that time there were many networking standards out there – arguably many of them superior to the Internet standards. But it didn’t matter. The shift was underway, and the Internet standards are now used by virtually all companies in the world (often coexisting and interoperating with prior networking standards). Meanwhile, a number of companies that owned those other standards said in effect, “TCP/IP? Who needs it?”

And then came Linux. 1999 was the year that Linux began to look serious in spite of a number of shortcomings in scalability, reliability, security, and manageability. Sun Microsystems in effect said, “Who needs Linux? We have Solaris and it is better than Linux.” Microsoft in effect said, “Who needs Linux? We have Windows 2000 and it is better than Linux.” Along came IBM, which had questioned the need for both PCs and TCP/IP for serious business computing, and said in effect, “Everybody needs Linux!” Perhaps it goes to show that only the greatest sinners know how to repent!

The real power of Linux is not derived from IBM or any other company or organization; it is the power of the Linux community. Linux, just like the PC and the Internet, is built in an open fashion so that all can see how it works. The communities that emerged to support them added value to what was developed by the grassroots efforts, and then a whole industry grew up around them. That may be what some of the companies that did not embrace those shifts in the early days failed to realize. They thought it was all about comparing whether the PC, the Internet, and Linux were better than the proprietary approaches. In the early days, PCs were much inferior to minicomputers and mainframes; the Internet was much inferior to IBM’s Systems Network Architecture or Digital’s DECnet; and in its early days Linux was inferior in many ways to Windows and Solaris.

But it doesn’t matter, for two key reasons. First, when a major organization has a choice between proprietary offerings and offerings built around communities, the communities will almost always win. You could say it is the power of democracy. The second reason that proprietary offerings ultimately lose out is that there is no way a single vendor can compete against a well-organized community. In the early stages, when the community is not yet well organized and cannot make progress, individual vendors can step in and do very well, even establishing natural monopolies as they bring order to chaos. In fact, people have argued that this is the only model that works in information technology – that the economies of scale and the setting of de facto standards always result in monopolies. But once the community gets organized and can start making progress, the game is over. Darwinian evolution takes over; the best ideas survive and the weak ones fall by the wayside. There is just no way a single vendor, no matter how powerful, can have access to as much talent and as many skills as the community can bring to the effort from all over the world.

If you still have doubts about how real Linux may be, there are two tests that are easy to apply. The first is to visit a bookstore or the web and see what is available about Linux. Amazon now has more than 1,000 Linux books. The second test is to visit the campus of any college or university that teaches computer science and ask the students. You will find that virtually all of them know about Linux and are comfortable using it. There is a myth that Linux and other open source software is a cult – that it is 90% about culture and only 10% serious. It is just the opposite: it is 90% discipline and high quality, 10% culture. Developers who make high-quality contributions to the community rise in the unofficial hierarchy; those who contribute poor quality get sent to “programmer’s Hell”, never to be heard from again. Very high quality software is produced as a result of this self-managing process. That’s why people are interested in Linux – it is a community.

Linux has become a “movement”. Anyone can contribute to it. A contribution might be a large software subsystem from IBM or some software to enable a new gadget, written by a student in Eastern Europe. A system administrator at XYZ Company may be looking for a certain kind of software and make the need known on the Internet. Meanwhile, someone in another part of the world has just written such software and is happy to give it away to anyone who needs it. In theory such global collaboration shouldn’t work so well, but it does. Developers like the fact that if they find a bug in the software they can either fix it or report it to others in the community. In the end they know, since the software is open for all to see, that it can be inspected and it will be fixed. There’s a community behind it that is committed to it.

The other myth about Linux is that it is popular because it is free. There are free versions of Linux available, and this makes it easy for students to learn it, but any organization that is going to use Linux for serious purposes will buy it from a company that specializes in distributing and supporting Linux. In addition, companies such as IBM are enabling their own software to run on Linux platforms, and they will not be giving that software away for free.

The ultimate test, of course, is not what an information technology company or information technology user says about Linux but rather how they vote with real money and contributions of software to the Linux community. In late 1999 the 24th-largest supercomputer in the world was installed at the University of New Mexico. It was built using 256 Intel servers from IBM, linked together in what is called a “cluster”, running Linux. In early 2001 the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign announced that it would be installing the largest and fastest Linux cluster in academia. Its two IBM Linux clusters are able to perform two trillion operations per second and will be used by researchers to study some of the most fundamental questions of science, such as the nature of the gravitational waves first predicted by Albert Einstein in his theory of relativity. And already that is being eclipsed by Japan’s largest national research organization, whose IBM eServer Linux supercomputer will deliver more than 11 trillion calculations per second.
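For readers curious about what software for such clusters looks like, applications are commonly written against a message-passing library such as MPI, which spreads one computation across hundreds of Linux nodes. The short C sketch below is purely illustrative – it is not code from any of the systems mentioned – but it shows the basic pattern: every processor runs the same program and learns its own role when the job starts.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;

        MPI_Init(&argc, &argv);                /* join the cluster-wide job */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* which process is this one? */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* how many processes in total? */

        /* A real application would compute on its own slice of the data here. */
        printf("Hello from process %d of %d\n", rank, size);

        MPI_Finalize();                        /* leave the job cleanly */
        return 0;
    }

Launched with a command such as mpirun -np 256 ./program, the same binary runs on all 256 servers at once – which is how a room full of commodity Linux machines comes to behave like one very large computer.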

Linux has also moved into the commercial environment – for real production applications. Shell International Exploration and Production is using one of the world’s largest Linux supercomputers to find new energy sources. Linux clusters will become more and more common in e-business applications as the demand from large numbers of users and transactions expands. Major financial services companies are replacing large numbers of Windows and Unix servers with a single mainframe computer running Linux! The mainframe’s own operating system can in turn host tens of thousands of virtual Linux systems – thousands of servers all running on one computer. Linux is no longer just for students – it is firmly in the mainstream.

When we think of information appliances we may think of small things like MP3 players, personal digital assistants, and various wireless devices. There is another kind of appliance that is more significant in size and scope – server appliances. These specialized boxes do a subset of what normal information technology systems do. They provide printing services, manage large amounts of storage capacity, handle network security functions, and so on. General-purpose systems can do all of these things, but server appliances that do only these specific functions achieve much greater reliability. Server appliances using Linux will have extraordinary stability and reliability, and that translates into making things easier to manage.

At the other end of the spectrum, Linux is finding its way into very small computing devices. TiVo is a personal TV service that transforms your television-watching experience. It allows you to automatically record your favorite shows every time they air – without setting a timer or using videotape. You can then control your TV watching by pausing, rewinding, or instantly replaying any program, anytime. TiVo is an easy-to-use consumer device. Under the covers is a microprocessor running Linux. Similarly, Lasonic makes a great digital audio server for playing MP3 music. It too has Linux under the covers.

In Taiwan there is a flurry of activity going on in what is called “embedded Linux”. Embedded means that Linux is embedded “under the covers” so that the user doesn’t even know it is there. Taiwan has developed prominence based on manufacturing efficiency for industry-standard products in the information technology industry. It now plans to duplicate that prominence by putting its own designs into products built on Linux. I spoke at a Linux seminar at National Taiwan University in Taipei in June 2000. The seminar was focused on embedded Linux for all kinds of handheld and Internet-attached appliances. The opening keynote speaker was very bullish about Linux. He said that existing industry-standard operating systems for PCs were “big, expensive, and unreliable” but that Linux was “small, inexpensive, and reliable”. Since no one company controls Linux, it is available to all companies in the world to use, contribute to, and exploit. We can expect to see Asian countries approach this opportunity very aggressively.

The industry commitment to Linux is growing rapidly. IBM has bet its future on Linux with a large investment of money and top technical talent. HP, IBM, and NEC established an Open Source Development Lab in Portland, Oregon. This independent, non-profit center is providing the open source community a place to test enterprise-class Linux software. This will help ensure that Linux will be “hardened” and ready for serious e-business. Over time Linux will do for operating systems what the Internet did for networking and communications – make them truly open and interoperable.

A lot of effort is expended in organizations today on an activity called “porting”. Porting means moving an application from one software “platform” to another: from a mainframe to Windows, from Windows to Linux, from Unix to the Mac, from the Mac to the mainframe, and so on. This activity does nothing for the user, and yet it requires scarce skills to get done. As Linux becomes more and more prevalent, the need for porting can decline and those scarce resources can add more value for users. In addition, many information technology services companies, including IBM, see a big opportunity in making things easier for companies wanting to exploit Linux, and they are opening new services practices to capitalize on it. The combination of mainstream acceptance, continued contributions of software from many organizations around the world, the widening availability of skills and services, and the high quality of Linux software will all contribute toward making the next generation of the Internet easier.
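To make the porting burden concrete, here is a toy C sketch (hypothetical, not taken from any system mentioned above). Even a one-line pause has to be written twice because two platforms expose different APIs; multiply branches like this across an entire application and the appeal of one common platform such as Linux becomes obvious.

    #include <stdio.h>

    /* The same one-second pause needs different code on each platform. */
    #ifdef _WIN32
    #include <windows.h>
    static void pause_one_second(void) { Sleep(1000); }  /* Windows: milliseconds */
    #else
    #include <unistd.h>
    static void pause_one_second(void) { sleep(1); }     /* POSIX/Linux: seconds */
    #endif

    int main(void)
    {
        pause_one_second();
        printf("done\n");
        return 0;
    }

Each new target platform adds another branch like the one above, and someone with scarce skills has to write, test, and maintain it.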

So is Linux going to replace all the other operating systems anytime soon? No, but is Linux disruptive software (in the positive sense)? There’s no question about it. In the market for server operating systems, Linux is growing the fastest and steadily gaining market share. Windows still dominates the desktop operating system market but Linux is even making some inroads there with several open source initiatives creating personal productivity applications and making the Linux desktop easier to install and use.

Linus Torvalds picked the penguin as the Linux logo. He once took a trip to Australia and was captivated by a ten-inch-high penguin. Linus said it was “love at first sight”. A few years later, when the community was discussing what kind of logo Linux should have, many wanted a boring, commercial one. Linus decided on the penguin. “I’m much happier being associated with a fun and slightly irreverent logo than with something static and boring.”