Into the Cloud

I was always clear, from like 1970, that networked computers were going to be necessary for sustainability of communities. So, then it was a matter of figuring out how to implement that. And it’s not like I suddenly woke up and said, Oh, this is important stuff. No, it was always important stuff for me.

—Lee Felsenstein

By 2000 the Internet was an integral part of life for millions of people. Indeed, 350 million or more people were online, the majority of them in the United States. That figure would grow by an order of magnitude over the next decade and a half, and the geographic distribution of users would come to match the relative populations of countries more closely. By 2014, China was home to the most Internet users.

Robert Metcalfe’s 1995 prediction that the web browser would become the operating system for the next era had largely come true. If, as Sun Microsystems had famously claimed, the network was the computer, then the browser was the operating system, and single-computer-based operating systems were irrelevant. It wasn’t quite that simple, but websites were becoming more like applications, powered by JavaScript. But even if the network was (sort of) the computer, that didn’t mean the actual computer could be turned into a glorified terminal, as Larry Ellison had discovered back in 1996 (when Oracle and Netscape were unsuccessful with what they called a network computer, or NC). A new model of computing was emerging, but the NC model wasn’t it.

In the 21st century, that new model revealed itself in the ubiquity of e-commerce and social media, and it grew on top of new algorithms and technologies designed to process data in new ways. Beneath all that was the data—data being collected and stored and processed on a scale far beyond anything human beings had ever experienced.

E-commerce as presently known wasn’t possible until 1991, when the National Science Foundation lifted its restrictions on commercial use of the Internet. But with the launch of Amazon and eBay in 1995 and PayPal in 1998, online shopping was soon threatening the viability of brick-and-mortar stores.

The easiest product to sell online was software. Gone were the shrink-wrapped boxes on shelves in computer stores. And with the dominance of portable devices, the programs got smaller. Even the name got smaller: the biggest category of computer software used to be called application software, but now it was just apps. The average price dropped even more, from tens, hundreds, or even thousands of dollars to a couple of bucks.

What made it possible to write and sell two-dollar apps was the shift of processing to the Internet. Mobile apps often functioned simply as interfaces to web-based applications. And in that online market there were huge opportunities, with venture capitalists throwing money at every new instance of Gates’s bright young hacker. And while e-commerce was booming, the biggest opportunity to make your first billion was in the emerging market of social media.

Since the earliest days of personal computing, the online element had been at least in part about community. Ted Nelson had called this online community “the future intellectual home of mankind” in Computer Lib, and Lee Felsenstein worked to build it through projects like Community Memory. Later, online systems such as CompuServe and AOL succeeded because they provided that sense of community. With its eWorld, Apple went so far as to build an online service around a cartoon town.

The World Wide Web put an end to these isolated services and opened up the world, but it was not itself a single community. New web-based services emerged, like MySpace and Friendster, each acutely aware of the huge opportunity awaiting whoever could successfully re-create that sense of community.

The most successful of these was Facebook, started by a bright young hacker at Harvard in 2004. Within ten years its community would have more members than there were people on the Internet in 2000.

A spate of new companies, started by recent grads with major backing, defined themselves against Facebook, and many of them successfully identified communities of professionals, photographers, videographers, and devotees of various crafts. These overnight successes did not produce products in the sense that the iPod was a product. But just as Apple succeeded by designing products that people would want, these social-media companies succeeded by understanding key facts about how people connected. Their sites were highly personal.

The booming e-commerce and social-media sites had a serious problem, though: they made demands on storage and processing that existing hardware and software couldn’t deliver. In 2014 Amazon required some half a million servers to process its orders; Facebook had about half that many. The services had grown far beyond the point where a single computer of any size could do the job. Connecting any two individuals in a massive network, or keeping inventory current across so many orders, required new tools; the established ones simply did not scale. The work had to be distributed, and new programming tools emerged that were better suited to parallelizing work and allocating it among many widely distributed processors.
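The emblematic pattern behind these new tools was map-reduce: split the work into independent pieces, process the pieces in parallel, and merge the partial results. Here is a minimal sketch of the pattern in Python, a toy in which local worker processes stand in for the thousands of machines a production system such as Hadoop would coordinate:

```python
# Toy map-reduce: count words across chunks of text in parallel.
# Real systems distribute these tasks across racks of machines; here,
# multiprocessing.Pool just shows the shape of the idea on one computer.
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_count(chunk: str) -> Counter:
    """Map step: count words in one independent chunk of text."""
    return Counter(chunk.split())

def merge_counts(a: Counter, b: Counter) -> Counter:
    """Reduce step: fold one partial result into another."""
    a.update(b)
    return a

if __name__ == "__main__":
    chunks = [
        "the network is the computer",
        "the browser is the operating system",
        "the cloud is the new mainframe",
    ]
    with Pool() as pool:  # one worker process per CPU core
        partials = pool.map(map_count, chunks)
    totals = reduce(merge_counts, partials, Counter())
    print(totals.most_common(3))
```

The appeal of the pattern is that neither step cares where it runs, so the same program scales from a single laptop to a warehouse of servers.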

Big as Facebook and Amazon were, their needs were dwarfed by those of the biggest dot-com company. This was neither an e-commerce site nor a social-media site; it was a search engine. The Web was huge, and just searching it was an even bigger job than the task facing an Amazon or a Facebook. Many search engines had been developed, but one was dominant in the 21st century: Google. Its processing and storage needs were massive, requiring huge warehouse “server farms,” each housing hundreds of thousands of servers.

A new term began to gain popularity: cloud computing. It was an old idea, but the technology to make it work was now in place. The distributed-computing technology worked, and it scaled. Soon you didn’t have to be an Amazon or a Facebook or a Google to benefit from these innovations: Google, HP, IBM, and Amazon turned this in-house technology into a marketable service, making it possible for any business to distribute processing and storage with maximum efficiency. Amazon Web Services, or AWS, began renting out Amazon’s distributed-computing capability. You could rent processing power, storage, virtually any definable capability of a computer, on an as-needed basis. If your business grew rapidly and you needed ten times the capability tomorrow that you had today, you just bought more.
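What renting computing by the hour looked like in practice can be sketched with boto3, the current Python interface to AWS. The SDK postdates the service’s launch, the machine-image ID below is a placeholder, and running the code would start real, billable servers:

```python
# Sketch: provisioning on-demand servers from AWS EC2 via boto3.
# The AMI ID is a placeholder; credentials and billing are assumed.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Need ten times the capacity tomorrow? Ask for ten times as many machines.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t2.micro",
    MinCount=10,
    MaxCount=10,
)
for instance in response["Instances"]:
    print("launched", instance["InstanceId"])
```

When the surge passes, the instances can be terminated just as easily; capacity becomes an operating expense rather than hardware on the books.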

By 2014, seven out of eight new applications were being written for the cloud.

Meanwhile, continuing increases in computers’ storage capacity kept redefining what computer technology was capable of. “Big data” was the term applied to the new collections of data emerging from the gathering done by search engines, from online transactions with humans, and from Internet-connected sensors recording measurements like temperatures: collections so large that no traditional database tool could process them.
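One way to make the scale problem concrete: when a dataset will not fit in a database or in memory, a program can stream over it once and keep only a small running summary. The sketch below uses a hypothetical sensor_readings() generator as a stand-in for any feed too large to hold at once:

```python
# Toy single-pass processing: summarize a stream too large to store.
# sensor_readings() is a hypothetical stand-in for a feed of temperature
# data from networked sensors.
import random
from typing import Iterator

def sensor_readings(n: int) -> Iterator[float]:
    """Simulate n temperature readings (degrees Celsius)."""
    for _ in range(n):
        yield 15.0 + random.gauss(0.0, 5.0)

count, total, hottest = 0, 0.0, float("-inf")
for temp in sensor_readings(1_000_000):  # far more than we keep in memory
    count += 1
    total += temp
    hottest = max(hottest, temp)

print(f"readings={count} mean={total / count:.2f} max={hottest:.2f}")
```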

Computing had changed. People were now interacting with a growing variety of devices, only a fraction of which were personal computers as traditionally understood—devices that more often than not simply mediated interaction with an e-commerce or social-media site elsewhere, which in turn stored its data and executed its code in a virtual cloud, the physical location of which was not only shifting and hard to pin down but, in practice, unimportant. The model of a personal computer on your desk or lap had become quaint. The personal-computer era was giving way to something else: a post-PC era.

Paralleling this shift away from the personal computer was a general changing of the guard among the players.
