Commoditization

We were going to change the world. I really felt that. Today we’re creating jobs, benefiting customers. I talk in terms of customer benefits, adding value. Back then, it was like pioneering.

–Gordon Eubanks, software pioneer

By the end of the 1980s, the personal-computer industry was big business, making billionaires and creating tremors in the stock market.

A Snapshot

On October 17, 1989, the 7.1-magnitude Loma Prieta earthquake that hit the San Francisco Bay Area also rocked Silicon Valley. When systems came back online, this was the state of the industry:

There was a renewed rivalry between the sixers, users of microprocessors from the Motorola/MOS Technology camp, and the eighters, users of Intel microprocessors. Intel had released several generations of processors that upgraded the venerable 8088 in the original IBM PC, and IBM and the clone makers had rolled out newer, more powerful computers based on them. Motorola, in the meantime, had come out with newer versions of the 68000 chip it had released a decade earlier. The 68000 was a marvel, and the chief reason the Macintosh could do processor-intensive things such as display dark letters on a white background and maintain multiple overlapping windows on the screen without grinding to a halt. Intel’s 80386 and Motorola’s 68030 were the chips that most new computers were using, although Intel had recently introduced its 80486 and Motorola was about to release its 68040. The two lines of processors battled for the lead in capability.

Intel, though, held the lead in sales quite comfortably. Its microprocessors powered most of the IBM computers and clones, whereas Motorola had one primary customer for its processors—Apple. (Atari and Commodore, with their 680x0-based ST and Amiga computers, had to wait in line for chips behind Apple, foreshadowing Apple supply-chain strategies that would be solidified under Tim Cook as COO.)

In 1989 “Moore’s law,” Intel cofounder Gordon Moore’s two-decade-old formulation that memory-chip capacity would double every 18 months, was still proving to be a rough predictor of growth across the technology, from memory capacity to processor speed. The industry seemed to be on an exponential-growth path, just as Moore had predicted.
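As a back-of-the-envelope illustration (a sketch added here, not a calculation from the original text), an 18-month doubling period compounds into roughly a hundredfold increase per decade:

    \[
      C(t) = C_0 \cdot 2^{\,t/1.5}
      \qquad\Longrightarrow\qquad
      \frac{C(10)}{C_0} = 2^{10/1.5} \approx 100
    \]

Here C_0 is the capacity in some starting year and t is measured in years.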

The best-selling software package in 1989 was Lotus 1-2-3; its sales ran ahead of those of WordPerfect, the leading word processor, and of MS-DOS itself. The top 10 best-selling personal computers were all various models of IBM, Apple Macintosh, and Compaq machines. Compaq was no mere clone maker; it was innovating, pushing beyond IBM in many areas. It introduced a book-sized IBM-compatible computer in 1989 that changed the definition of portability. Compaq also introduced a new, open, nonproprietary bus design called EISA, which was accepted by the industry, demonstrating the strength of Compaq’s leadership position. IBM had unsuccessfully attempted to introduce a new, proprietary bus called MicroChannel two years earlier. IBM was fast losing control of the market, and it was losing something else: money.

By the end of 1989 IBM would announce a plan to cut 10,000 employees from its payroll, and within another year Compaq and Dell would each be taking more profits out of the personal-computer market than IBM. In another three years, IBM would cut 35,000 employees and suffer the biggest one-year loss of any company in history.

ComputerLand’s dominance of the early computer retail scene was short-lived. During ComputerLand’s heyday, consumers wanting to buy a particular brand had to visit one of the major franchises, and distribution was restricted to a few chosen chains, the largest of which was ComputerLand. But in the late 1980s, the market changed. Price consciousness took precedence over brand name, and manufacturers had to sell through any and all potential distributors. The cost of running a chain store such as ComputerLand was higher than that of competing operations, which could now sell hardware and software for less.

Another line of computer stores, called Businessland, gained a foothold and became the nation’s leading computer dealer in the late 1980s by concentrating on the corporate market and promising sophisticated training and service agreements. But consumers were more comfortable with computers and no longer willing to pay a premium for hand-holding. Electronics superstores such as CompUSA, Best Buy, and Fry’s, which offered a wide range of products and brands at the lowest possible prices, eclipsed both ComputerLand and Businessland. Computers were becoming commodities, and low prices mattered most.

Bill Gates and Paul Allen were billionaires by 1989; Gates was the richest executive in the computer industry. Elsewhere in the industry, only Ross Perot and the cofounders of another high-tech firm, Hewlett-Packard, had reached billionaire status; most industry leaders had net worths in the tens of millions, including Compaq’s Rod Canion and Dell Computer’s Michael Dell. In 1989, Computer Reseller News named Canion the second-most-influential executive in the industry, deferentially placing him behind IBM’s John Akers. But perspective matters: in the same year, Personal Computing asked its readers to pick the most influential people in computing from a list that included Bill Gates, Steve Jobs, Steve Wozniak, Adam Osborne, and the historical Charles Babbage. Only billionaire Bill made everyone’s list.

There was a lot of money being made, and that meant lawsuits. Like much of American society, the computer industry was becoming increasingly litigious. In 1988, Apple sued Microsoft over Windows 2.01, and extended the suit in 1991 after Microsoft released Windows 3.0. Meanwhile Xerox sued Apple, claiming that the graphical user interface was really its invention, which Apple had misappropriated. Xerox lost, and so, eventually, did Apple in the Microsoft suit, although it was able to pressure Digital Research into changing its GEM graphical user interface cosmetically, making it look less Mac-like.

GUIs weren’t the only contentious issue. A number of lawsuits over the “look and feel” of spreadsheets were bitterly fought at great expense to all and questionable benefit to anyone. The inventors of VisiCalc fought it out in court with their distributor, Personal Software. Lotus sued Adam Osborne’s software company, Paperback Software, as well as Silicon Graphics, Mosaic Software, and Borland, over the order of commands in a menu. Lotus prevailed over all but Borland, where the facts of the case were the most complex, but the Borland suit dragged on until after Borland had sold the program in question.

Borland was also involved in two noisy lawsuits over personnel. Microsoft sued Borland when one of Microsoft’s key employees, Rob Dickerson, went to Borland with a lot of Microsoft secrets in his head. Borland didn’t sue in return when its own key employee, Brad Silverberg, defected to Microsoft, but it did when Gene Wang left for Symantec. After Wang left, Borland executives found email in the company’s system between Wang and Symantec CEO Gordon Eubanks—email that they claimed contained company secrets. Borland pressed criminal charges, threatening not just financial pain but also jail time for Wang and Eubanks. The charges were eventually dismissed.

Through essentially the whole of the 1980s, Intel and semiconductor competitor Advanced Micro Devices (AMD) were in litigation over what technology Intel had licensed to AMD.

Meanwhile, in the lucrative video-game industry, everyone seemed to be suing everyone else. Macronix, Atari, and Samsung sued Nintendo; Nintendo sued Samsung; Atari sued Sega; and Sega sued Accolade.

By 1989 the pattern was clear, and it persisted into the next decade—personal computers were becoming commodities, increasingly powerful but essentially identical. They were rendered obsolete every three years by advances in semiconductor technology and software, where innovation proceeded unchecked. Personal computers were becoming accepted and spreading throughout society; the personal-computer industry had become big business, with ceaseless litigation and the focused attention of Wall Street; and this technology, pioneered in garages and on kitchen tables, was driving the strongest, most sustained economic growth in memory.

During the 1990s, Moore’s law and its corollaries continued to describe the growth of the industry. IBM had become just one of the players in what originally had been called the “IBM-compatible market,” then was called the “clone market,” and later was called just the “PC market.”

Within two decades, the personal-computer market, launched in 1975 with the Popular Electronics cover story on the Altair, had surpassed the combined market for mainframes and minicomputers. As if to underscore this, in the late 1990s Compaq bought Digital Equipment Corporation, the company that had created the minicomputer market. Those still working on mainframe computers demanded Lotus 1-2-3 and other personal-computer software for these big machines. Personal computers had ceased to be a niche in the computer industry. They had become the mainstream.

Sun

As the center of the computing universe shifted toward PCs, other computer-industry sectors suffered. In particular, it was becoming a tough haul for the traditional minicomputer companies. Forbes magazine proclaimed, “1989 may be remembered as the beginning of the end of the minicomputer. [M]inicomputer makers Wang Laboratories, Data General, and Prime Computer incurred staggering losses.”

However, minicomputers were being squeezed out not by mainstream PCs but rather by their close cousins, workstations. These workstations were, in effect, the new top of the line in personal computers, equipped with one or more powerful, possibly custom-designed processors, running the Unix minicomputer operating system developed at AT&T’s Bell Labs, and targeted at scientists and engineers, software and chip designers, graphic artists, movie makers, and others needing high performance. Although they sold in much smaller quantities than ordinary personal computers, they sold at significantly higher prices.

The Apollo, which used a Motorola 68000 chip, had been the first such workstation in the early 1980s, but by 1989 the most successful of the workstation manufacturers was Sun Microsystems, one of whose founders, Bill Joy, had been much involved in developing and popularizing the Unix operating system.

Figure 9-1. Sun Microsystems founders. From left to right: Vinod Khosla, Bill Joy, Andreas “Andy” Bechtolsheim, and Scott McNealy (Courtesy of Sun Microsystems)

Riding the general PC-industry wave, Sun went public in 1986, exceeded $1 billion in annual sales within six years of its founding, and became a Fortune 500 company by 1992. In the process, it displaced minicomputers and mainframes, and made workstation an everyday term in the business world.

But Sun now cast its eye on the general PC market, at the same time that Microsoft was taking steps to threaten Sun on its own turf.

Gates’s firm had a new operating system called Windows NT, which was intended to give business PCs all the power of workstations. Sun’s CEO, Scott McNealy, decided to wage not only a technical war but also a public-relations war. In public talks and interviews, he ridiculed Microsoft and its products. Along with Oracle CEO Larry Ellison, he tried to promote a new kind of device, called a network computer, which would get its information and instructions from servers on the Internet. This device did not immediately catch on.

But Sun had a hidden advantage in the consumer market—its early, foursquare advocacy of networks. People were repeating its slogan, “the network is the computer,” and it seemed prescient as the Internet emerged.

Sun was a magnet for talented programmers who enjoyed the smart, free-spirited atmosphere of the Silicon Valley firm. In 1991, McNealy gave one of the company’s star programmers, James Gosling, carte blanche to create a new programming language. Gosling realized that almost all home-electronics products were now computerized, but a different remote device controlled each one, and few worked in the same way. The user grappled with a handful of remotes; Gosling sought to reduce them to one. Patrick Naughton and Mike Sheridan joined him, and they soon designed an innovative handheld device that let people control electronics products by touching a screen instead of pressing a keyboard or buttons.

The project, code-named Green, continued to evolve as the Internet and World Wide Web began their spectacular bloom. But more than the features evolved; the whole purpose of the product changed. The team focused on allowing programs in the new language to run on many platforms with diverse central processors. They devised a technical Esperanto, universally and instantly understood by many types of hardware. With the Web, this capacity became a bonanza.

Although the project took several years to reach market, Sun used the cross-platform programming concepts from Green, which became known as Java, to outmaneuver its competition. Sun promoted Java as “a new way of computing, based on the power of networks.” Many programmers began to use Java to create the early, innovative, interactive programs that became part of the appeal of websites, such as animated characters and interactive puzzles.

Java was the first major programming language to have been written with the Web in mind. It had built-in security features that would prove crucial for protecting computers from invasion now that this electronic doorway—the web connection—had opened them up to the world. And it could be used to write programs without the programmer knowing what operating system the user was running, which was typically unknowable for applications delivered over the Web.
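To make the idea concrete, here is a minimal sketch of the kind of small, cross-platform program described above, written against the applet interface that browsers of the era exposed to Java; the class name and the message it draws are hypothetical, added here for illustration only:

    // A minimal, illustrative applet (class name and message are hypothetical).
    import java.applet.Applet;
    import java.awt.Graphics;

    public class HelloWeb extends Applet {
        // The browser's Java runtime calls paint() whenever the applet's area
        // of the page needs to be redrawn, regardless of the user's
        // operating system.
        @Override
        public void paint(Graphics g) {
            g.drawString("Hello from a platform-independent program", 20, 20);
        }
    }

Compiled once to Java bytecode, a class like this could be embedded in a web page and run unchanged by any browser that included a Java runtime, whatever the underlying operating system.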

Java surprised the industry, and especially Microsoft. The software titan was slow to grasp the importance of the Internet, and opened the door for others to get ahead. But once in the fray, Gates made the Internet a top priority.

Gates was also initially skeptical about Java. But as the language caught on, he licensed it from Sun, purchased a company called Dimension X that possessed Java expertise, and assigned hundreds of programmers to develop Java products. Microsoft then tried to slip around its licensing agreement by adding to its version of Java capabilities that would work only on Microsoft operating systems. Sun brought suit. Gates saw Sun and its new language as a serious threat. Why, if Java was a programming language and not an operating system? Because the ability to write platform-independent programs significantly advanced the possibility of a browser supplanting the operating system. It didn’t matter whether you had a Sun workstation, a PC, a Macintosh, or what have you; you could run a Java program through your browser.

And Sun was serious about pursuing its “the network is the computer” mantra to challenge Microsoft’s hegemony. In 1998, Sun agreed to work with Oracle on a line of network server computers that would use Sun’s Solaris operating system and Oracle’s database so that desktop-computer users could scuttle Windows. Moreover, Sun began to sell an extension to Java, called Jini, which let people connect a variety of home-electronics devices over a network. Bill Joy called Jini “the first software architecture designed for the age of the network.” Although Jini didn’t take the world by storm, something like Sun’s notion of network computing and remotely connected devices would re-emerge in the post-PC era in cloud computing and mobile devices.

The NeXT Thing

As Apple Computer struggled to survive in a Microsoft Windows-dominated market, Steve Jobs was learning to live without Apple. When he left, he gathered some key Apple employees and started a new company.

That company was NeXT Inc., and its purpose was to produce a new computer with the most technically sophisticated, intuitive user interface based on windows, icons, and menus, equipped with a mouse, running on the Motorola 68000 family of processors. In other words, its purpose was to show them all—to show Apple and the world how it should be done. To show everyone that Steve was right.

NeXT and Steve Jobs were quiet for three years while the NeXT machine was being developed. Then, at a gala event at the beautiful Davies Symphony Hall in San Francisco, Steve took the stage, dressed all in black, and demonstrated what his team had been working on all those years. It was a striking, elegant black cube, 12 inches on a side. It featured state-of-the-art hardware and a user interface that was, in some ways, more Mac-like than the Mac. It came packaged with all the necessary software and the complete works of Shakespeare on disc, and it sold for less than the top-of-the-line Mac. It played music for the audience and talked to them. It was a dazzling performance, by the machine and by the man.

Technologically, the NeXT system did show the world. While the Mac had done an excellent job of implementing the graphical user interface that Steve had seen at PARC, the NeXT machine implemented much deeper PARC technologies. Its operating system, built on the Mach Unix kernel from Carnegie-Mellon, made it possible for NeXT engineers to create an extremely powerful development environment called NeXTSTEP for corporate custom software development. NeXTSTEP was regarded by many as the best development environment that had ever been seen on a computer.

Jobs had put a lot of his own money into NeXT, and he got others to invest, too. Canon made a significant investment, as did computer executive and occasional presidential candidate Ross Perot. In April 1989, Inc. magazine selected Steve Jobs as its “entrepreneur of the decade” for his achievements in bringing the Apple II and Macintosh computers to market, and for the promise of NeXT.

NeXT targeted higher education as its first market, because Jobs realized that the machines and software that graduate students use in school are the ones they will ask their bosses to buy after they leave. NeXT made some tentative inroads into this target market. It made sense for academics to buy machines for which graduate students, academia’s free labor force, could write the software. NeXTSTEP meant that you could buy the machine and not have to buy a ton of application software. Good for academic budgets, but not so good for building a strong base of third-party software suppliers.

The company had some success in this small market, and a few significant wins. But after its proverbial 15 minutes of fame, the black cube was ultimately a commercial failure. In 1993, NeXT finally acknowledged the obvious and killed off its hardware line, transforming itself into a software company. It immediately ported NeXTSTEP to other hardware, starting with Intel’s. By this time, all five of the Apple employees that Jobs had brought along to NeXT had left. Ross Perot resigned from the board, saying it was the biggest mistake he’d ever made.

The reception given to the NeXT software was initially heartening. Even conservative chief information officers who perpetually worry about installed bases and vendor financial statements were announcing their intention to buy NeXTSTEP. It got top ratings from reviewers, and custom developers were citing spectacular reductions in development time from using NeXTSTEP, which ran “like a Swiss watch,” according to one software reviewer.

But for all the praise, NeXTSTEP did not take the world by storm. Not having to produce the hardware its software ran on made NeXT’s balance sheet look less depressing, but NeXTSTEP was really no more of a success than the NeXT hardware. While custom development may have been made easy, commercial applications of the “killer app” kind, which could independently make a company’s fortune, didn’t materialize. NeXT struggled along, continuing to improve the operating system and serve its small, loyal customer base well, but it never broke through to a market share that could sustain the company in the long run without Jobs’s deep pockets.

The Birth of the Web

What one user of a NeXT computer did on his machine, though, changed the world.

Electronics enthusiasts in Albuquerque and Silicon Valley didn’t invent the World Wide Web, but its origin owes much to that same spirit of sharing information that fueled the first decade of the personal-computer revolution. In fact, it could be argued that the Web is the realization of that spirit in software.

The genesis of the Web goes back to the earliest days of computing, to a visionary essay by Franklin Delano Roosevelt’s science advisor, Vannevar Bush, in 1945. Bush’s essay, which envisioned information-processing technology as an extension of human intellect, inspired two of the most influential thinkers in computing, Ted Nelson and Douglas Engelbart, who each labored in his own way to articulate Bush’s sketchy vision of an interconnected world of knowledge. Key to both Engelbart’s and Nelson’s visions was the idea of a link; both saw a need to connect a word here with a document there in a way that allowed the reader to follow the link naturally and effortlessly. Nelson gave the capability its name: hypertext.

Hypertext was merely an interesting theoretical concept, glimpsed by Bush and conceptualized by Nelson and Engelbart, without a global, universal network on which to implement it. Such a network was not developed until the 1970s, at the Defense Advanced Research Projects Agency (DARPA) and at several universities. The resulting network, the ARPAnet, didn’t just link individual computers; eventually it linked whole networks together. As this network of networks expanded, it came to be called the Internet, a vast global network of networks of computers. The Internet finally brought hypertext to life. With DARPA’s programmers having developed a method for passing data around the Internet, and the personal-computer revolution having put the means of accessing it in the hands of ordinary people, the pieces of the puzzle were all on the table.

Tim Berners-Lee, a researcher at CERN, a high-energy physics laboratory on the French-Swiss border, created the World Wide Web in 1989 by writing the first web server, a program for putting hypertext information online, and the first web browser, a program for accessing that information. The information was displayed in manageable chunks called pages.

It was a fairly stunning achievement, and it impressed the relatively small circle of academics who could use it. That circle would soon expand, thanks to two young men at the University of Illinois. But the NeXT machine on which Berners-Lee had created the Web was no more.

Nevertheless, the NeXT saga was not over.
