Preamble

Opening doors and expanding horizons

Modern life without technology would be unthinkable, so much so that we never even consider the possibility. Rarely do we give a moment’s thought to where all the technological marvels came from, how they materialized, and what policies enabled them. By the time you finish reading this book, you will be able to answer those questions. More important, you will be able to use your knowledge to help shape the future of the country and the world, improve your chances of success professionally, and achieve a comfort level with today’s increasingly complex society. But first, let’s take a quick tour of technology in modern life.

Let’s begin with some things that almost seem like bodily appendages—smart phones, tablets, and computers. We use them for communication, entertainment, and business. They connect us to the Internet and satisfy myriad needs: sending and receiving emails and texts; making travel arrangements; getting tickets for events; buying all manner of merchandise, from the largest appliance to the smallest token; comparison shopping to get the best deals; reading the latest news (real and, unfortunately, sometimes fake); communicating with our friends and family on social media—all of it online, all from the comfort of our home or our car, or as we walk along the street.

More often than not we use plastic—credit or debit cards—instead of cash for purchases. And if we opt for cash, we get it from an ATM instead of a bank teller. At checkout counters, we rely on laser scanners to speed us on our way and to make sure we haven’t been overcharged. (Stores use those scanners to keep track of their inventory and finances in real time.)

When we purchase a car, we expect it to spend little time in the repair shop and not break the bank guzzling gas. And when we drive, we rarely use paper maps anymore, relying on GPS—the acronym for Global Positioning System—to get us to our destination quickly with few miscues. We value accident avoidance systems, and we assume airbags, seat harnesses, and the integrity of cabin construction will protect us in case an accident does occur. Finally, if we're really tuned in, we can see the day coming when autonomous vehicles will free us from many of the mundane tasks of driving.

On an airplane, we pay far more attention to the personal entertainment system than to the safety instructions, because technology has made accidents a rarity. Over a 10-year span, from 2004 to 2013, U.S. commercial airlines registered a scant 139 fatalities, flying a total of 78.7 billion miles during that period—an amazing average of only one death for every 566 million miles of travel.1 Technology has made that remarkable record possible with the use of radar guidance systems, computers, communication satellites, flight-training simulators, lightweight but strong materials, and durable jet engines.

When we shop for food, we expect the products to be safe, affordable, and plentiful. When we drink water, we assume it’s free of harmful contaminants, and when we learn that it isn’t, as was the case in Flint, Michigan2 in 2016, we demand that public officials be held accountable for scientific malfeasance.

If we get sick, we know that modern medicine has extraordinary diagnostic tools at its disposal, from sophisticated blood tests and ultrasound imaging to CT scans, MRIs, and PET scans. If we need cures, antibiotics and antiviral medications for infections are available. LASIK surgery and lens replacements correct eye disorders. Radiation, proton therapy, chemotherapy, immunotherapy, and stem cell transplants fight cancers. Stents, artificial valves, and artificial hearts address cardiac disease. Artificial joints replace damaged ones. Laparoscopy, arthroscopy, and video-assisted thoracoscopy minimize invasiveness in surgical procedures.

We spend a good fraction of our lives in our homes, but few of us ever pause to consider how much money energy-efficient devices save us. If we did, we'd be happily surprised. Refrigerators today consume only a third of the electricity they did 40 years ago.3 And LED lights, which are rapidly becoming the norm, operate on a fifth of the power of comparable incandescent lamps.4 They last more than 40 times longer, to boot.

These are just a few compelling examples of how technology beneficially affects our lives as individuals. But on a larger scale, technology and its scientific underpinnings have had perhaps an even more profound impact. The industrial and service sectors of our economy have changed dramatically in the last quarter century. The change is happening today at warp speed, with many disruptive consequences, some good and others socially and economically challenging.

Take the case of manufacturing. Today more than 12 million Americans work in industrial manufacturing plants, according to the Bureau of Labor Statistics, but that’s eight million fewer than in 1980. Manufacturing accounted for 19% of the workforce in 1980. Today, it’s scarcely 8%.

The causes of the decline, according to the populist mantra that helped elect Donald Trump president in 2016, are terrible trade pacts, lack of trade enforcement, currency manipulation, and cheap Asian and Mexican labor. While all of them contributed to the workforce falloff, another instigator, one whose full effects are still unfolding, didn't make the list—automation.

While the data and economic analyses are still incomplete, there can be little doubt that, at the very least, automation has transformed the nature of the manufacturing workforce, altering it from less skilled to more skilled and making greater demands on worker competencies. Automation is also very likely to be the principal driver of increased productivity—output per worker—in the future, increasing corporate profits and making companies more competitive globally.

As the United States began shedding manufacturing jobs more than a half century ago, the service sector picked up the slack. Even in the 1980s, professional, business, education, and health service jobs seemed immune to the outsourcing exodus to other countries that the manufacturing sector was experiencing. And economists spoke glowingly about a service-dominated American economy that would lead the world for years to come. But that was before the information technology revolution enabled globalized commerce.

Cheaper and more powerful computers connected to the Internet via communications satellites made it possible for service companies to outsource their work to regions of the world where labor was cheaper. Spanning many time zones, the global network also enabled companies to provide 24-7 service to their customers. But the benefits to the companies, and their customers, came at a cost to American workers—loss of jobs that were once sheltered from outsourcing to other countries.

There are two other arenas that technology has transformed remarkably. The first should come as no surprise: national security. World War II was arguably the first major conflict resolved by technology, principally the atomic bomb and radar—although cryptography and codebreaking should be on the list, as well. After the Allied victory, the United States embarked on a major expansion of military research and development (R&D) that has carried through to the present.

The products of defense R&D have been transformative: thermonuclear weapons, ballistic missiles, missile defense systems, laser-guided weaponry, stealth aircraft, GPS, drones, spy satellites, and night-vision goggles are just a few of the many examples. Although they were developed for military applications, a number of the technologies have found their way into the consumer marketplace, prime among them GPS and the Internet—its defense progenitor was known as ARPANET—which are now ubiquitous.

The second arena that has undergone a revolution caused by disruptive technology is probably less obvious, except to the people who are employed by it: banking and finance. Without computers and global communication tools, Wall Street, the City of London, and financial centers throughout the world would be a shadow of the industry they have come to represent.

Trading of complex, often opaque, derivatives developed by “quants”—as quantitative analysts are commonly called—would be virtually impossible without the use of powerful computers. Once financial trading floors were filled with shouts and paper slips. No more.

Today, the trading floors of banks and hedge funds are filled with computer monitors, keyboards, and bits. Stocks, bonds, commodities, and currencies are bought and sold at almost the speed of light, and the buyers and sellers might be half a world away from each other. High-speed trading has become the stock-in-trade of the best arbitrageurs, who take advantage of small spreads in prices that come and go faster than the blink of an eye.

Technology has generated tremendous wealth for many of the financial players, but it has also enabled some of the riskiest practices pursued by banks that are “too big to fail.” Those practices produced the global financial crisis of 2007–2008, and the great recession that followed. Wall Street has recovered—with a big boost from the federal government—but Main Street and large parts of the world are still reeling from the fallout.

Arguably, the populist rebellion that began in the United States as the Tea Party movement in 2010 and ultimately propelled Donald Trump into the White House in 2017 has its roots in the great economic meltdown and the uneven recovery that followed it. But beneath the surface, the populist movement is a reflection of the impact of technology on American life and the challenges the average person faces in adjusting to the changes that are transpiring at an ever-increasing pace.

We've taken a bird's-eye view of how technology and its science partner have transformed America and the world in the span of a little more than half a century. And there is no evidence that these transformations will abate anytime soon.

Science and technology are crucial cogs in 21st century life. But their impacts are not accidental. They are the result of policy decisions made in the public and private arenas. If those decisions properly take into account the roles of science and technological innovation, they can have tremendously beneficial effects. If they don’t, the impacts can be devastating.

The decision we made at the end of World War II, for example, to invest public money in “basic” research—which has no goal other than expanding fundamental knowledge—led eventually to the development of the laser and medical diagnostic tools such as CT scanners and MRI machines. Today, laser-enabled technologies—among them fiber-optic communication, precision manufacturing devices, and DVD players—account for a staggering one-third of the American economy.5 Without access to CT scanners and MRI machines, doctors often would be basing their diagnoses on information and observations that are far from complete, as they did decades ago.

The policy decision to make basic research a federal priority traces its lineage to Vannevar Bush, an engineer who headed the wartime Office of Scientific Research and Development (OSRD) during World War II. As the war was nearing its end, President Franklin Roosevelt asked Bush to consider what could be done to bolster American science, which had been so instrumental in the Allies’ win. By the time Bush finished his report, Science, The Endless Frontier,6 in July of 1945, Roosevelt had died, but President Harry Truman accepted Bush’s principal recommendation: that the federal government expand its support of scientific research. It led to the creation of the National Science Foundation—although the path was tortuous—and the establishment of major research portfolios in other federal agencies.

Bush drew heavily on his wartime experience and his professional career in industry. He also recognized that the political climate was ripe for his policy proposal. He knew that Roosevelt trusted both his knowledge and his judgment, and he assumed correctly that Truman would try to follow in Roosevelt's footsteps. The result was a game changer that kept the United States at the forefront of scientific discovery and technological innovation for more than half a century, spurring economic growth and safeguarding the nation militarily.

The financial policies advanced by President Bill Clinton and strongly supported by many members of Congress in the 1990s offer a study in contrast. In essence, the president and congressional Republicans, especially, accepted the proposition that banking had undergone a major transformation. No longer restricted to operating within individual states, banks not only were involved in financial transactions across state lines, but they had become global players. In order to compete internationally, the argument went, they had to be allowed to grow, and one way to grow was to allow them to engage in both commercial (retail) and investment operations.

The Glass-Steagall Act,7 which had been in place since 1933, prohibited them from doing so, because the combination of both activities under one roof was widely seen as having led to the risk-taking that culminated in the 1929 financial crash and the Great Depression that followed. But by 1999, most policymakers in Washington believed that market transparency and federal banking oversight were sufficient to keep risky behavior in check. They accepted Wall Street's argument that globalization of finance had altered the banking calculus. Without Glass-Steagall repeal, American banks could not compete with other securities firms at home, and especially abroad. At least, that's what the banks asserted.

The Gramm-Leach-Bliley Act,8 formally known as the Financial Services Modernization Act of 1999, passed the Senate on November 4 of that year, and President Clinton signed it into law 8 days later. The arguments that led to the passage of the bill, which tore down many of the Glass-Steagall barriers, sounded persuasive, but policymakers failed to take into account how much computers had changed the trading landscape.

Collateralized Debt Obligations (CDOs) packaged mortgage-backed securities (MBS), themselves bundles of individual mortgages, in such an opaque way that risk assessments became difficult. Synthetic CDOs bundled many CDOs into even more opaque securities. Credit Default Swaps (CDS), essentially insurance policies without any asset backing, were bought and sold in the darkness of private trades. Derivatives used complex mathematical algorithms—often understood only by the quants who had devised them—to combine a wide assortment of financial instruments into a single trading package.

Could all of these instruments have been created and traded without the available computer technology? Perhaps, but it would have been far more difficult. Certainly, the speed and globalization of the trading would have been nearly impossible. When the American subprime mortgage market collapsed, American banking faced the prospect of another 1929, but with technology-driven globalization, so too, did the rest of the world.

Many factors contributed to the 2007–2008 financial crisis, some of them completely unrelated to technology. But technology did play a role that policymakers seemingly never recognized when they began loosening the regulatory strictures on the finance industry.

Science and technology policy intersects modern life in countless ways. Once you understand those intersections and where they fall on the scientific, economic, business, and political landscapes, you will have a firmer grasp of the challenges facing 21st century America and the road to effective solutions. You will become more successful in your professional life and more engaged as a citizen. You will be able to open doors and expand your horizons.
