Chapter 4

A Glimpse at the History of Technology

Abstract

In this chapter, technology will be discussed, as the cyber security officer must understand technology, which includes hardware, software, firmware, and all related aspects.

The revolution in technology has caused nation-states, corporations, and individuals to become more technology-driven, technology-supported, and technology-dependent.

It is not the intent here to provide a detailed history of technology; the intent is to provide a brief overview. This overview is provided because it is important for those involved in cyber security to understand their working environment as thoroughly as possible. It may seem obvious, yet it is surprising how many cyber security officers have little knowledge of technology and of how we got to where we are.

Keywords

Advanced Research Project Agency (ARPA); Gopher; Hacker tools; High-Tech; Internet protocols; Internet service providers (ISPs); Microprocessor; Processor serial number (PSN); Technology; World Wide Web

What hath God wrought?

Samuel F.B. Morse (When the first telegraph message ever was sent, 1844)


What Is Technology?

According to one dictionary,1 technology is defined as follows:
tech·nol·o·gy [tek näl′ ə jē] (plural tech·nol·o·gies) noun
1. Application of tools and methods: the study, development, and application of devices, machines, and techniques for manufacturing and productive processes • recent developments in seismographic technology
2. Method of applying technical knowledge: a method or methodology that applies technical knowledge or tools • a new technology for accelerating incubation “…Maryland-based firm uses database and Internet technology to track a company’s consumption of printed goods….” Forbes Global Business and Finance, November 1998.
(Early seventeenth century. From Greek tekhnologia, literally “systematic treatment,” from tekhnē “art, craft.”)

From Cave Man to Cyber Security Professional and Information Warrior

The world is rapidly changing. We humans have passed through a hunter–gatherer period, an agricultural period, and an industrial period, and now, in the era of the modern nation-state, our society is in an information-based and information-dependent period. Some are saying that we are approaching the Knowledge Age—not to be confused with a “smarter age”!
Our global society can no longer function without the aid of automated information and high technology—computers and networks. With computers and global networks such as the Internet come opportunities to make life better for all of us. However, they also make each of us more vulnerable, and they increase the risk to the high technology we depend on, as well as the risks to cyber security, our personal freedoms, and our privacy.
Throughout human history, technology has played a role in the development of our species, and it has played a major role in our lives. Even the making of fire was probably seen as a technological wonder in the early history of the human race—and also used as a weapon of war such as by setting fire to the enemy’s fortifications, houses, and crops. It was also used to help forge tools as weapons of war.
A short look back at that history is appropriate, for as someone once said: “If you don’t know where you’ve been, you don’t know where you are going”—and one might add, “you don’t even know where you are.” And if you do not know where you are, your survivability in a cyber security environment is not good.
Technology drives change.

Andrew Grove, CEO, Intel Corporation

Revolutions and Evolutions in High Technology

As was previously mentioned, one cannot address the issue of cyber security without first addressing the changes brought on by technology and its impact on businesses, government agencies, societies, global and economic competition, and the world in general. Technology obviously has a major impact on cyber security and the cyber security officer’s ability to successfully protect information and networks.
Technology has many uses, and over the centuries it has driven how we humans work, live, and interact. In a speech televised on the program BookTV, as far back as April 4, 2002, Michael Eisner, Chairman and CEO, The Walt Disney Company, discussed the impact of technology on the world and used the following timeline of the beginnings of communication-related “devices”—which is as relevant today as it was then:
• 1455: Gutenberg Bible
• 1689: Newspapers
• 1741: Magazines
• 1892: Movies
• 1907: Radio broadcasts
• 1927: TV
• 1975: Microsoft
Look how far we’ve come in the last 40-plus years. All these forms of technology and communication systems have had a major impact on our lives throughout history. They not only entertain us, but also provide us with information. Some of the information processed, stored, and transmitted will be sensitive information that a company or government agency may want to keep private and not release to the general public. Consider this as a cyber security officer: If that private, sensitive information communicated to the public is about your company, how that information is obtained may indicate a vulnerability in an information protection process. If so, you have a serious problem. The freer a society is, the freer the news media will be, and consequently, the more challenging your job to protect sensitive business information. However, with that said, remember that as a cyber security professional, your job is also to protect the privacy of individuals in your company.
Some day, on the corporate balance sheet, there will be an entry which reads “Information”; for in most cases the information is more valuable than the hardware which processes it.

RADM Grace Murray Hopper, U.S. Navy

From the Twentieth Century to Today: Technology and the Advent of High Technology

The use of technology during the agricultural and industrial periods saw great numbers of new inventions and improvements in old technologies. This was also the time of the building of the great cities of the world, as well as their total destruction in global wars. Thus, technology for warfare had truly come of age. With the advent of the atomic bomb and its successors, the entire world could now literally be destroyed. The period also saw older technologies greatly improved and new ones invented, such as the telegraph, telephone, air transportation, and computers. This period saw increases in education, mass transportation, and exponential growth in communications—the sharing of information.
During this period, the sharing of information became easier owing to the improvement of communications systems, new communications systems, and increased consolidation of people into large cities. This also made it easier to educate the people in the needed skills for working in the more modern factories and offices of the period and for developing, improving, and implementing technologies.
The transition period from the Industrial Age to the Information Age in world history varies with each nation-state. In the United States, the well-known authors the Tofflers estimated the transition to take place about 1955, when the number of white-collar workers began to outnumber the blue-collar workers. Some nation-states are still in various phases of transition from the agricultural period to the industrial period to the information period.
No matter when a nation experiences this technology-driven transition, however, it will see, as the United States and other modern nation-states have seen, the most rapid changes in all aspects of human existence since humans first walked on this Earth—including how wars are prosecuted.
The twentieth century saw the rapid expansion and use of technology more so than all past centuries combined. It was also the beginning of the concentrated development of technology specifically aimed at building new and improved networks on a massive scale. This ushered in the era of modern warfare, an era sponsored primarily by governments and globally committed businesses that had the will and the means for such development, and these entities were able to use the new technologies to further their agendas on a global scale.
Thus, the twentieth century was the true beginning of technology-based warfare. Owing to the technological improvement of older inventions (e.g., submarine, machine gun) and new inventions such as nuclear weapons, never before could so many be killed by so few. There were also the tanks, hand grenades, poison gases, and land mines that gave way to the chemical/biological/nuclear weapons, carpet bombings, smart bombs, and the beginning of true cyber security.
In 1962 … the CIA quietly contracted the Xerox Company to design a miniature camera, to be planted inside the photocopier at the Soviet Union’s embassy in Washington. A team of four Xerox engineers … modified a home movie camera equipped with a special photocell that triggered the device whenever a copy was made. In 1963, the tiny Cold War weapon was installed by a Xerox technician during a regular maintenance visit to the Soviet embassy.2

2 From an article by Dawn Stover in the January 1996 issue of Popular Science, entitled “The CIA’s Xerox Spy-cam.” Although dated, this indicates how far back government agencies have been involved in covert cyber operations. Imagine the progress they have made since then.

This period included many significant technology-driven inventions too numerous to mention here in their entirety. In the medical field alone, we have seen the rapid invention of literally thousands of new drugs, procedures, and devices, many of which saved possibly millions of lives over the years. Some other significant technologically driven inventions during this century include:
• Zeppelin
• Radio receiver
• Polygraph machine
• Airplane
• Gyrocompass
• Jet engine
• Synthetic rubber
• Solar cell
• Short-wave radio
• Wirephoto
The twentieth century saw the development and improvement of our modern era’s amazing electronic inventions leading to the computer and its peripherals:
• Electronic amplifying tube (triode)
• Radio tuner
• Robot
• Digital computer
• UNIVAC I
• Sputnik
• Explorer I satellite
• Laser
• OS/360 IBM operating system
• Minicomputer
• Optical fiber
• Cray supercomputer
• Space shuttle
• IBM personal computer
• Videotape recorder
• Graphic user interface
• Cathode ray tube
• Television
• FM radio
• Voice recognition machine
• Photocopier
• Computer
• Integrated circuit
• BASIC language
• FORTRAN
• Compact disk
• Computer mouse
• Computer with integrated circuits
• RAM, ROM, EEPROM
• ARPANET
• Daisy wheel printer
• Floppy disk
• Dot-matrix printer
• Liquid-crystal display
• Computer hard disk
• Modem
• Mobile phone
• Transistor
• World Wide Web
• Browsers

Other Significant Twentieth-Century Technological Developments and Events

Some of the other significant technological events and inventions that took place in the twentieth century and have led to our rapidly changing information-based societies and information dependency, and assisted in the development of new methods of prosecuting warfare, include the following:3
1924: The Computing–Tabulating–Recording Company is renamed IBM.
1936: Burack builds the first electric logic machine.
1937: Shannon’s master’s thesis shows that electrical switching circuits can implement Boolean logic.
1940: Atanasoff and Berry design a computer with vacuum tubes as switching units.
1943–1946: Mauchly, Eckert, and von Neumann build the ENIAC, the first all-electronic digital computer.
1947: The transistor is perfected.
1956: Shockley Semiconductor Laboratory is founded in Mountain View, California; Bardeen, Shockley, and Brattain share the Nobel Prize for the transistor.
1957: Fairchild Semiconductor is founded.
1962: Tandy Corporation buys chain of RadioShack electronic stores.
1964: Kemeny and Kurtz, Dartmouth College, develop the BASIC computer language.
1968: Intel is founded.
1969: Intel produces integrated circuits for Japanese calculators; Data General releases Nova.

High-Tech: A Product, a Process, or Both?

There is no universally accepted definition of “high-tech,” nor is there a standard list of industries considered to be high-tech. Today nearly every industry contains some element of technology, and even the most technologically intensive industry will include low-tech elements.
Nevertheless, several groups have developed lists of industries they consider high-tech using U.S. Standard Industrial Classifications (SIC).
The breadth of these lists depends on two factors: (1) the goals of the organization and its customers and (2) whether the organization subscribes to the argument that only industries that produce technology can be considered high-tech or to the argument that industries that use advanced technology processes can also be categorized as high-tech.
Any industry-based definitions of high-tech will be imperfect, but none of the definitions discussed here should be considered incorrect. The important factor to consider is the perspective from which any list is derived.
Most high-tech industry classifications have common elements, yet may vary significantly in scope. Let’s consider four classifications of high-tech industries developed by the following respected and often quoted organizations: the American Electronics Association (AEA), RFA (formerly Regional Financial Associates), One Source Information Services, Inc. (formerly Corp Tech), and the U.S. Bureau of Labor Statistics (BLS).
The different missions of these four organizations influence how they define high-tech. The AEA is a trade association made up of mostly electronics and information technology companies. Its members generally produce technology and subscribe to the limited definition of high-tech based only on the nature of an industry’s product rather than its process. RFA is a national consulting firm. Its clients include builders and contractors, banks, insurance companies, financial services firms, and government. The industries with the greatest growth potential and those reflective of their clients’ interests are included in RFA’s list of high-tech industries. While both the AEA and RFA have narrowly defined high-tech, One Source and the BLS use broader definitions that include industries with both high-tech products and processes.
One Source gathers and sells corporate information on technology firms for use in sales and marketing. As it has built its database of firms, One Source has expanded its list of what should be considered a high-tech industry. The BLS is a federal agency responsible for collecting and analyzing data on the national labor force. It has defined those industries with the highest concentration of technology-based occupations, such as scientists and engineers, as high-tech industries.

The Trade Association: AEA

The AEA released Cyberstates 4.0, its annual report on technology employment, based on the AEA’s limited definition of high-tech industries, which fall into three categories: (1) computers, communications, and electrical equipment; (2) communication services; and (3) computer-related services. The AEA’s list is the most restrictive of the four classifications. Absent from the list are areas such as drug manufacturing, robotics, and research and testing operations.

The Consulting Group: RFA

RFA’s high-tech sectors are similar to those selected by the AEA. However, RFA does not include household audio and video equipment or telephone communications, but adds drugs and research and testing services.

Information Provider: One Source

Unlike the short lists compiled by the AEA and RFA, the One Source list classifies 48 sectors as high-tech. Major additions include a number of manufacturing industries, such as metal products and transportation equipment, and several service industries.

The Research Group: BLS

BLS has further refined its high-tech industry definition by separating sectors into two groups. Those industries with a high concentration of research-oriented occupations are labeled intensive, while those with a lower concentration are considered nonintensive. The differences shown here illustrate why knowing how data are defined is essential to understanding what the data mean. Once again, those wishing for a simple answer will be frustrated. It is not the data that have failed them, but the reality of a complex system (the economy) and the human factor that must determine how to best reflect that system using data.
As we have found, trying to get a handle on this thing called technology, any kind of technology, is like grabbing air. Even low technology was once considered high technology in its day. For example, when the first plow was invented, it was probably considered a technological wonder. Then, after being hooked up to a horse or water buffalo, it increased the productivity of the farmers and it certainly drastically changed farming methods. When the wooden plow was integrated with a steel blade, certainly that was considered high technology in its day. One must remember that high technology today will undoubtedly be considered low technology 25–50 years from now. So, high technology is also based on a reference point and that reference point is time—perception and time are also key factors in cyber security.
As we see, it is not easy to come to grips with this phenomenon called high technology. For our purposes, a narrowly focused definition is better. In today’s world, the microprocessor drives the technological products that drive the Information Age and cyber security. So, we will define high technology based on the microprocessor. High technology is defined as technology that includes a microprocessor.

The Microprocessor

In 1971, Intel introduced the Intel 4004 microprocessor. This was the first microprocessor on a single chip and included a central processing unit, input and output controls, and memory. This made it possible to program “intelligence” into inanimate objects and was the true beginning of the technology revolution that has caused so many changes in the world and ushered in the beginnings of the age of cyber security.
The microprocessor was developed through a long line of amazing inventions and improvements on inventions. Without these dramatic and often what appear to be new, miraculous breakthroughs in microprocessor technology, today’s cyber security phenomenon would still be only in the dreams of science fiction writers, the likes of Jules Verne and George Orwell. However, because of the amazing developments in the microprocessor, cyber security is beginning to come to the forefront in modern-day governments and businesses.
Today, because of the microprocessor and its availability, miniaturization, power, and low cost, the world is rapidly developing new high-technology devices, procedures, processes, networks, and, of course, cyber security and conventional warfare weapons. The global information infrastructure (GII) is just one example of what microprocessors are making possible. The GII is the massive international connections of world computers that carry business and personal communication as well as that of the social and government sectors of nations. Some say that GII will connect entire cultures, erase international borders, support “cyber economies,” establish new markets, and change our entire concept of international relations.
The GII is based on the Internet and much of the growth of the Internet is in developing nations. The GII is not a formal project but it is the result of thousands of individuals’, corporations’, and governments’ need to communicate and conduct business by the most efficient and effective means possible. The GII is also a battlefield in the cyber security arena.

Moore’s Law

No discussion of high technology and cyber security weapons would ever be complete without a short discussion of Moore’s Law. In 1965, Gordon E. Moore, Director of Research and Development Laboratories, Fairchild Semiconductor, was asked by Electronics magazine to predict the future of semiconductors and its industry during the next 10 years. In what became known as Moore’s Law, he stated that the capacity or circuit density of semiconductors doubles every 18 months or quadruples every 3 years.4
The interesting thing about Moore’s comments is that they became something of a high-technology driver for the semiconductor industry and, even after all these years, the prediction has remained largely on track as semiconductors have improved. A chip’s power, of course, depends on how many transistors can be placed in how small a space. The mathematical version of Moore’s Law is:
Bits per square inch = 2^(time − 1962).5
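As a quick sanity check, the simplified formula can be evaluated directly. The sketch below is ours, not part of any standard library, and merely restates the equation above in code:

```python
def moores_law_density(year: int) -> int:
    """Bits per square inch predicted by the simplified Moore's Law
    formula quoted above: density = 2 ** (year - 1962)."""
    return 2 ** (year - 1962)

# The formula implies that density doubles every year:
print(moores_law_density(1965))   # 8
print(moores_law_density(1975))   # 8192
print(moores_law_density(1975) // moores_law_density(1974))   # 2
```

Note that this exponent corresponds to doubling every year, a more aggressive pace than the 18-month doubling commonly quoted; both readings of Moore’s prediction circulate in the literature.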
Some of the high-technology “inventions” of the twentieth century that depended on the microprocessor include the following:
Ethernet (1973)
Laser printer (1975)
Ink-jet printer (1976)
Magnetic resonance imager (1977)
VisiCalc (1978)
Cellular phones (1979)
Cray supercomputer (1979)
MS-DOS (1981)
IBM personal computer (PC) (1981)
Scanning tunneling microscope (1981)
Apple Lisa (1983)
CD-ROM (1984)
Apple Macintosh (1984)
Windows operating systems (1985)
High-temperature superconductor (1986)
Digital cellular phones (1988)
Doppler radar (1988)
World Wide Web/Internet protocol (HTTP); HTML (1990)
Pentium processor (1993)
Java computer language (1995)
Digital versatile disk or digital video disk (1995)
Web TV (1996)
The Pioneer 10 spacecraft used the 4004 microprocessor. It was launched on March 2, 1972, and was the first space flight and microprocessor to enter the Asteroid Belt.

Other Significant Twentieth Century High-Technology Developments and Events

Some of the significant high-technology computer events and inventions that took place in the twentieth century and led to our rapidly changing methods of prosecuting a war include:6
1971: Intel develops the 8008; Wozniak and Fernandez build the “Cream Soda Computer.”
1972: Kildall begins work on PL/M, the first high-level programming language for Intel’s microprocessors; Gates and Allen form “Traf-O-Data”; Wozniak and Jobs begin selling Blue Boxes.
1973: Wozniak joins HP; Kildall and Cooper build “astrology forecasting machine.”
1974: Intel invents the 8080; Xerox releases the Alto; Torode and Kildall begin selling microcomputers and disk operating systems.
1975: Microsoft (previously known as “Traf-O-Data”) writes BASIC for the Altair; Heiser opens the first computer store in Los Angeles.
1976: Kildall founds Digital Research; work on the first RadioShack microcomputer is started by Leininger and French; the first sale of the CP/M operating system takes place.
1977: Apple introduces the Apple II; TRS-80 developed.
1978: Apple ships disk drives for the Apple II and begins development of the Lisa computer; Intel introduces the 8086 chip.
1979: Intel introduces the 8088 chip, later chosen for the IBM PC.
1980: HP releases the HP-85; the Apple III is announced; Microsoft and IBM sign an agreement for IBM’s PC operating system.
1981: The Osborne I is developed; Xerox comes out with the 8010 Star and the 820 computers; IBM presents the PC.
1982: DEC develops a line of personal computers (e.g., DEC Rainbow 100).
1983: Apple introduces the Lisa; IBM develops the IBM PC Jr.; Osborne files for Chapter 11 as the microcomputer market heats up.
1984: Apple announces the Macintosh microcomputer.
1980s–1990s: Intel, already the leader in microprocessors, follows with the 286, 386, and 486 chips and then the Pentium line, reaching speeds of 1.7 GHz as we enter the twenty-first century.
Moore’s Law is still holding true although some believe we will soon hit the silicon wall, based on the laws of physics. Some of these doomsayers have been saying such things for years. Others are more optimistic and believe that other materials will be found to replace silicon or that silicon will be somehow enhanced to “defy” the laws of physics. If the past is any clue to the future, the future of high technology will not be impaired by such minor impediments as the laws of physics.

The Internet

The real issue is control. The Internet is too widespread to be easily dominated by any single government. By creating a seamless global-economic zone, anti-sovereign and unregulatable, the Internet calls into question the very idea of the nation-state.7

John Perry Barlow


7 John Perry Barlow, “Thinking Locally, Acting Globally,” Time, January, 1996, p.57; as quoted on p. 197, The Sovereign Individual, by James Dale Davidson and Lord William Rees-Mogg, published by Touchstone, New York, 1999.

It is in the context of this phenomenal growth of high technology and human knowledge that the Internet arises as one of the mechanisms to facilitate sharing of information and as a medium that encourages global communications. The Internet has already become one of the twenty-first century’s cyber security battlefields.
The global collection of networks that evolved in the late twentieth century to become the Internet represents what could be described as a “global nervous system,” transmitting from anywhere to anywhere facts, opinions, and opportunity. However, when most people think of the Internet, it seems to be something either vaguely sinister or of such complexity that it is difficult to understand. Popular culture, as manifested by Hollywood and network television programs, does little to dispel this impression of danger and out-of-control complexity.
The Internet arose out of projects sponsored by the Advanced Research Project Agency (ARPA) in the United States in the 1960s. It is perhaps one of the most exciting legacy developments of that era. Originally an effort to facilitate sharing of expensive computer resources and to enhance military communications, it has, since about 1988, rapidly evolved from its scientific and military roots into one of the premier commercial communications media. The Internet, which is described as a global meta-network, or network of networks,8 provides the foundation on which the global information superhighway has been built.
However, it was not until the early 1990s that Internet communication technologies became easily accessible to the average person. Prior to that time, Internet access required mastery of many arcane and difficult-to-remember programming language codes. However, declining microcomputer prices combined with enhanced microcomputer performance and the advent of easy-to-use browser9 software as key enabling technologies created the foundation for mass Internet activity. When these variables aligned with the developing global telecommunications infrastructure, they allowed a rare convergence of capability.
E-mail. Although e-mail was invented in 1972, it was not until the advent of the “modern Internet system” that it really began to be used on a global scale. In 1987, there were approximately 10,000 Internet computer hosts and 1000 news messages a day in 300 newsgroups. In 1992, there were more than 1,000,000 hosts and 10,000 news messages a day in 1000 newsgroups. By 1995, the number of Internet hosts had risen to more than 10 million, with 250,000 news messages a day in over 10,000 newsgroups.10 By 2014, the majority of e-mail traffic originated from the business world, which accounted for more than 108.7 billion e-mails that were sent and received every day.11
Internet protocols. In the 1970s, the Internet protocols were developed to be used to transfer information.
Usenet newsgroup and electronic mail. Newsgroups and electronic mail were developed in the 1980s.
Gopher. In 1991, personnel at the University of Minnesota created the Gopher as a user-friendly interface that was a menu system for accessing files.
World Wide Web. In 1991, Tim Berners-Lee and others at the Conseil Européen pour la Recherche Nucléaire (CERN) developed the Web. In 1993, the Web had approximately 130 sites; in 1994, about 3000 sites; by April 1998, this had grown to more than 2.2 million; and by January 2015 it had reached 1,169,228,000.12
The most commonly accessed application on the Internet is the World Wide Web (WWW). Originally developed in Switzerland, the Web was envisioned by its inventor as a way to help share information. The ability to find information concerning virtually any topic via search engines, such as Google, Bing, Alta Vista, HotBot, Lycos, InfoSeek, and others, from among the rapidly growing array of Web servers is an amazing example of how the Internet increases the information available to nearly everyone. One gains some sense of how fast and pervasive the Internet has become as more TV, radio, and print advertisements direct prospective customers to visit their business or government agency Web site. Such sites are typically named www.companyname.com, where the business is named “companyname,” or www.governmentagency.gov for government agencies.
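The www.companyname.com convention described above is simply the host portion of a Web address; Python’s standard library can split such an address into its parts. The URLs below are hypothetical examples following that convention, not real sites:

```python
from urllib.parse import urlparse

# Hypothetical addresses following the naming convention described above.
for url in ("http://www.companyname.com/products/index.html",
            "http://www.governmentagency.gov/"):
    parts = urlparse(url)
    # scheme = protocol, netloc = host name, path = resource on that host
    print(parts.scheme, parts.netloc, parts.path)
```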
From the past century until now, the Internet has rapidly grown from an experimental research project and tool of the U.S. government and universities into a tool for everyone in the world with a computer. It is the premier global communications medium. With the subsequent development of search engines and, of course, the Web, the sharing of information has never been easier. Google, for example, stated that in 2013 its search index covered some 30 trillion Web pages!
It has now become a simple matter for average people—even those who had trouble programming their VCRs—to obtain access to the global Internet and, with that access, to search the huge volume of information it contains. Millions of people around the world are logging in, creating a vast environment often referred to as cyberspace and the GII, which has been described as the virtual, online, computer-enabled environment, as distinct from the physical reality of “real life.”
By the end of the twentieth century, worldwide revenues via Internet commerce had reached perhaps hundreds of billions of dollars, an unparalleled growth rate for a technology that has been really effective only since the early 1990s! The “electronic commerce” of the early twenty-first century already includes everything from online information concerning products, purchases, and services to the development of entirely new business activities (e.g., Internet-enabled banking and gambling).
An important fact for everyone to understand, and which is of supreme importance to those interested in cyber security, is that the Web is truly global in scope. Physical borders as well as geographical distance are almost meaningless in cyberspace; the distant target is as easily attacked as the local one.
The annihilation of time and space makes the Internet an almost perfect environment for cyber crime and warfare. When finding a desired adversary’s13 server located on the other side of the planet is as easy and convenient as calling directory assistance to find a local telephone number, information warriors have the potential to act in ways that one can only begin to imagine. Undeterred by distance, borders, time, or season, the potential bonanza awaiting the information warrior is a chilling prospect for those who are responsible for safeguarding and defending the assets of a business or government agency.
Because of religious beliefs in many faiths, Internet access to material considered pornographic is generally not acceptable. One of society’s struggles will be how to provide access to the world’s information without causing some moral decay of society. This will be a struggle for many countries and it is believed that the information warriors will have a major impact on the society of such developing countries.
The Internet is the latest in a series of technological advances that are being used not only by honest people to further their communication, but also by miscreants, juvenile delinquents, and others for illegal purposes. As with any technological invention, it can be used for good or for illegal purposes. It is really no different from other inventions such as the handgun. The handgun can be used to defend and protect lives or to destroy them. It all depends on the human being who is using the technology.

The High-Technology-Driven Phenomenon

There are thousands of Internet service providers (ISPs) operating and connected all across the globe. It is hoped that we all know by now that our e-mails do not go point to point, but hop around the Internet. They are susceptible to being gleaned by all those with the resources to read other people’s mail or steal information to commit crimes (e.g., identity theft, competitive intelligence information collections, and, of course, useful information for information warriors).
So, what is the point? The point is that there are ISPs all over the world with few regulations and absolutely no protection and defensive standards. Some ISPs may do an admirable job of protecting our information passing through their systems, while others may do nothing. Furthermore, as we learn more and more about “Netspionage” (computer-enabled business and government spying), we learn more and more about how our privacy and our information are open to others to read, capture, change, and otherwise misuse.
In addition, with such programs as SORM in Russia, Internet monitoring in China and elsewhere, global Echelon, and the U.S. FBI’s Carnivore (still Carnivore no matter how often they change the name to make it more politically correct), we might as well take our most personal information, tattoo it on our bodies, and run naked in the streets for all to see. Well, that may be a slight exaggeration; the point is that we have no concept of how well ISPs are protecting information belonging to governments, businesses, individuals, or associations. Through your ISP, how susceptible are you to the threats of cyber security? Do you know if your ISP is protecting or monitoring you? If it is monitoring you, for whom?

Faster and More Massive High-Technology-Driven Communications

We are quickly expanding into a world of instant messages (IMs) through ISPs. After all, the more rapidly our world changes, the more rapidly we want to react, and we want everything now! A 2014 report by Juniper Networks stated that instant messaging apps would account for 75% of mobile messaging traffic, or 63 trillion messages, by 2018. Furthermore, IMs can be used to transfer files and send graphics, and, unlike the telephone and normal e-mail, with IM one knows whether the person being contacted is there. The ramifications are interesting: check to see whether a person is online; if not (after already setting up a masquerade or spoof), take over that person's identity and contact someone while posing as the other, instantly. Of course, there are perhaps hundreds, if not thousands, of examples of ISPs being penetrated or misused. As far back as November 1995, for example, the Wall Street Journal ran a story entitled "America Online to Warn Users about Bad E-mail." We all know about the basic issues of viruses and other malicious code being sent via ISPs. So, the problem has existed for quite some time.
Solar Storms Could Affect Telecommunications. Intense storms raging on the sun … could briefly disrupt telecommunications …. The eruptions triggered a powerful, but brief, blackout Friday on some high-frequency radio channels and low-frequency navigational signals … forecast at least a 30 percent chance of continuing disruptions …. In addition to radio disruptions, the charged particles can bombard satellites and orbiting spacecraft and, in rare cases, damage industrial equipment on the ground, including power generators and pipelines.14

High technology is vulnerable to nature and the universe in general. What a great time to launch a cyber security attack on an adversary, perhaps even a competitor. Is it sunspots or an adversary causing these outages? By the time the adversary discovers it was you and not three days of sunspots, the war could be over.

The Beneficial Effect of Hacker Tools and Other Malicious Software on Network Security with Dual Roles as Cyber Security Tools

The following examples of malicious software were selected as a representative sample of those that are available, for their range of functionality, and for their range through time, from 1991 to the present. These tools can be and are being adapted and adopted for use in cyber security.15
Hacker tools. The intentions of the originators of the hacker tools that were reviewed were mixed, some malicious and some well intentioned, but all of the tools can be used to strengthen the security of a network or to monitor the system for illicit activity. This can be achieved if the system owner uses hacker tools to identify the weaknesses that exist in the security of the system, and to identify appropriate remedial action, before a person with malicious intent attempts to exploit those weaknesses. A number of the tools can also be used to monitor the system for illicit activity, even before software patches are available, so that the system owner can make informed decisions on appropriate action to prevent or minimize damage to his or her system. As a cyber security officer, how will you defend against such attacks?
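The defensive use described above, probing your own system for weaknesses before an attacker does, can be sketched with the basic technique behind many hacker tools: a TCP connect scan. The minimal Python sketch below scans only a listener that the script itself opens on localhost; the host and port choices are illustrative, and a real assessment would of course be run only against systems you are authorized to test.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`.

    This is the same technique a port scanner uses to map a system's
    exposed services, used here defensively against a host we control.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Demonstration against a listener we open ourselves: bind an ephemeral
# port on localhost, then confirm the scanner reports it as open.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))        # port 0 = let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

found = scan_ports("127.0.0.1", [port])
print(found == [port])                 # the known-open port is reported
listener.close()
```

The same handful of lines, pointed at a range of ports on a production host, is the core of both an attacker's reconnaissance tool and a defender's self-audit, which is exactly the dual role this section describes.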
Viruses. Viruses have no direct beneficial effect on the security of a system except to provide a visible indication that there has been a breakdown in procedures for the transfer of software or data between systems. The negative effect of viruses is the cost in terms of time and the antivirus software to check data and software being imported or exported to and from the system, as well as the cost of rectifying a problem when an infection has occurred, which can be considerable.
In an abstract way, the advent of the virus has actually been beneficial to the cyber security officer because the impact of a virus on the user is a visible and constant reminder of the need to observe good cyber security practices.
In the majority of cases, the virus is detected before it can activate its payload, so the damage is normally limited to the inconvenience and cost of cleaning up the system to remove the virus. As a cyber security weapon, the virus is valuable and cheap and can cause devastating results against unprepared information systems.
Worms. The release onto the Internet on November 2, 1988, of the Internet worm written by Robert T. Morris, Jr., quickly caused widespread disruption and the failure of a large proportion of the network that existed at that time. The problem was compounded by the fact that some of the servers that had not been affected were taken offline to prevent them from becoming infected, thus placing a higher load on already-affected sections of the system and denying those elements of the network that had gone offline access to the patches that would protect them, as the normal distribution method for patches was over the Internet itself. To date, there have been no security benefits derived from worms, other than, in the case of the Robert T. Morris worm, to highlight the urgent need for effective and early communication of information on incidents.
The potential for the use of this type of program in a way that would aid the security of systems has been postulated, in the form of autonomous intelligent agents that would travel through the system and report back predefined information, such as the system assets, the condition and identity of system elements, and the presence or absence of specific types of activity. As a weapon for prosecuting cyber security, worms have excellent potential and may even be considered a “weapon of mass destruction” because of the damage they can cause a high-technology, information systems-dependent adversary. Of course, we now have many “colored” worms being written and traveling around the GII, NIIs, and other networks.
Easter eggs. Easter eggs have no beneficial effect other than to highlight that even proprietary software can contain large sections of code that are redundant to its intended functionality, and that the quality control procedures for the production of software by well-known organizations are poor if the Easter eggs were not detected during production. Can you think of any way to use these "eggs" in a cyber security battle?
Trojan horses. The Trojan horse, by definition, carries out actions that are normally hidden from the user while disguising its presence as a benign item of software. Trojan horses are difficult to detect because they appear to be legitimate elements of the operating system or applications that would normally be found on the system. Given that the purpose of a Trojan horse is to hide itself and its functionality from legitimate users, there have been no beneficial effects derived from them, unless you are an information warrior. As a cyber security officer, you must defend against them.
Logic bombs. Logic bombs, as with Trojan horses, carry out actions that are unexpected and undesirable. Some may cause relatively minor damage, such as writing a message to a screen, while others are considerably more destructive. They are normally inserted by disaffected staff or by people with a grudge against the organization. Again, they are difficult to detect before they have been activated and, as a result, can be expensive to rectify. Logic bombs are correctly named as they can have the same effect against the system of an adversary as a physical bomb might have against a building—Boom! It is gone!
The clear implication of the issues discussed above is that some hacker tools can have a beneficial effect on the security of computer systems, if the system staff use them to identify shortcomings or flaws in the operating system or applications software, the configuration of the system, or the procedures used to secure it, before personnel within or outside the organization do so. Viruses, while providing no direct benefit, do provide a detectable indication that there has been a breach in the security of the system, either through exploitation of a flaw in the security procedures or through a shortcoming in the system software that allowed a virus past any barriers created to prevent access to the system.
Worms currently have no beneficial effect on system security management. However, the concept that was used to disseminate the Robert T. Morris worm may have an application in the mapping of large networks if applied to autonomous agents. The Trojan horse and the logic bomb, which, by their very nature, are covertly inserted into the system without the owner’s knowledge, have no beneficial effect and have only malicious applications.

Other High-Technology Tools in Cyber Security

Cyber wars (information warfare) through technology are being fought on many fronts—on the personal privacy, corporate Netspionage,16 and nation-state battlefields of the world. Even such innocent-sounding words as “cookies” take on new meaning in the cyber security arena.
These cookies—the computer kind, not the ones you eat—are beneficial, except when they are used to profile customer habits and gather an individual’s private information, which is then sold. High-technology cookies are files that a Web site can load onto a user’s system. They are used to send back to the Web site a user’s activity on that Web site, as well as what Web sites the user has previously visited. They are also a potential tool of the information warrior.
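The mechanism is simple enough to show with Python's standard library (the cookie name and value below are invented for illustration): a site sends a Set-Cookie header, the browser stores it, and the browser returns the value on every later request to that site, which is what makes profiling a user's activity possible.

```python
from http.cookies import SimpleCookie

# Server side: build a Set-Cookie header that tags this visitor.
# "visitor_id" is a hypothetical tracking identifier, not any real site's.
cookie = SimpleCookie()
cookie["visitor_id"] = "abc123"
cookie["visitor_id"]["path"] = "/"
header = cookie.output()  # e.g. 'Set-Cookie: visitor_id=abc123; Path=/'

# Browser side: the stored value is sent back on each later request,
# letting the site recognize (and profile) the returning user.
returned = SimpleCookie()
returned.load("visitor_id=abc123")
print(returned["visitor_id"].value)  # abc123
```

Nothing in the exchange identifies the user by name; the profiling power comes entirely from correlating everything sent alongside that one persistent identifier.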
Intel included a unique processor serial number (PSN) in every one of its new Pentium III chips. Intel claimed that the PSN could identify an individual surfing through electronic commerce and other Internet-based applications. By providing a unique identifier that can be read by Web sites and other application programs, the PSN could also make an excellent cyber security tool. Although the number was designed to link user activities on the Internet for marketing and other purposes, one can easily imagine other uses, from a cyber security perspective, for this high-technology application. And as for Microsoft's new operating system, XP, imagine the IW possibilities.
Steganography is another use of high technology that can be used in cyber security:17
Hiding information by embedding a file inside another, seemingly innocent file is a technique known as “steganography.” It is most often used with graphics, sound, text, HTML, and PDF files. Steganography with digital files works by replacing the unused bytes of data in a computer file with bytes that contain concealed information.
Steganography (which translated from Greek means covered writing) has been in use since about 580 B.C. One technique was to carve secret messages into wooden objects and then cover the etched words with colored wax to make them undetectable to an uninitiated observer. Another method was to tattoo a message onto the shaved messenger’s head. Once the hair grew back, the messenger was sent on his mission. Upon arrival, the head was shaved, thus revealing the message—obviously not time-dependent. The microdot, which reduced a page of text to the size of a typewriter’s period so that it could be glued onto a postcard or letter and sent through the mail, is another example.18
Two types of files are typically used when embedding data into an image. The innocent image that holds the hidden information is a “container.” A “message” is the information to be hidden. A message may be plaintext, ciphertext, other images, or anything that can be embedded in the least significant bits of an image.19
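The least-significant-bit replacement described above can be sketched in a few lines of Python. A real tool would operate on the pixel bytes of an image file; here the container is plain bytes and the hidden message is invented, which keeps the bit manipulation visible.

```python
def embed(container: bytes, message: bytes) -> bytes:
    """Hide `message` in the least significant bit of each container byte."""
    # Unpack the message into individual bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(container):
        raise ValueError("container too small for message")
    out = bytearray(container)
    for pos, bit in enumerate(bits):
        out[pos] = (out[pos] & 0xFE) | bit  # clear the low bit, then set it
    return bytes(out)

def extract(container: bytes, length: int) -> bytes:
    """Recover `length` hidden bytes by reading back the low bits."""
    bits = [b & 1 for b in container[: length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[n * 8 : (n + 1) * 8]))
        for n in range(length)
    )

cover = bytes(range(256)) * 4          # stand-in for an image's pixel data
stego = embed(cover, b"meet at dawn")  # 12-byte message needs 96 cover bytes
print(extract(stego, 12))              # b'meet at dawn'
```

Because each cover byte changes by at most 1, the altered image is visually indistinguishable from the original, which is precisely what makes steganography so hard to detect.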
Steganographic software has some unique advantages as a tool for Netspionage agents. First, if the agents use regular cryptographic software on their computer systems, the files may not be accessible to investigators but will be visible, and it will be obvious that the agents are hiding something. Steganographic software allows agents to “hide in plain sight” any valuable digital assets they may have obtained until they can transmit or transfer the files to a safe location or to their customer. As a second advantage, steganography can be used to conceal and transfer an encrypted document containing the acquired information to a digital dead drop. The agents could then provide the handler or customer with the password to unload the dead drop but not divulge the steganographic extraction phrase until payment is received or the agents are safely outside the target corporation. As a final note, even when a file is known or suspected to contain information protected with steganographic software, it has been almost impossible to extract the information unless the passphrase has been obtained.

Welcome to the Twenty-First-Century Technology

As we left the twentieth century and began the twenty-first, our dependence on technology continued to increase, as did our interconnectivity on a global basis, our integration of devices (or platforms), and our use of wireless, mobile technology. This has increased our vulnerability to successful attacks on a global scale. It has also made protection of our systems, information, etc., much more difficult, maybe even impossible.
As we progress into the twenty-first century, we continue to fall behind in our defenses and in our ability to react quickly and successfully to attacks from around the world. As the sophistication of attacks continues to increase, so do the vulnerabilities of our vital information infrastructures.
Top cyber security experts echoed a dire warning from a top intelligence chief on the vulnerability of the U.S. power grid, with one telling FoxNews.com that state-sponsored hackers could send America’s nerve centers on an “uncontrollable, downward spiral.”20

Defending our information has been made more difficult by advances in technology and also in social networks of all kinds, through which users continue to innocently provide information that is very useful to competitors and other adversaries and that leaves individuals, groups, corporations, and governments more open to attacks.
Let’s Look at Some of the Major Technology Advances Thus Far in the Twenty-first Century:
The power of cell and Wi-Fi phones, which have become not only telephones but all-in-one communication devices: voice, text, e-mail, storage, and video and digital cameras. Not far behind are the tablets, which offer the same mobility as cell phones but bigger screens and often more power, storage, memory capacity, and speed.
Twitter, Facebook, YouTube, blogs, and others offer social connectivity as never before, through which individuals, businesses, and governments share information on a global, mobile scale, including accidentally or purposely posted sensitive or maybe even classified information, as users go unchecked. These platforms are also great for blackmail, marketing, and spreading false information or propaganda, and of course for collecting information useful in GIW and conventional wars and battles.
More sophisticated game machines and games that can be used to help train info-warriors and in fact are being used to do so.
Driverless vehicles, including trams, trains, and cars, are turning into computers on wheels, loaded with technology. Imagine: once taken over and controlled by a terrorist, they can easily be turned into weapons, a new status for car bombs in which the drivers do not have to sacrifice their lives.
Electric vehicles will become more prevalent over time. Since we are unable to store electricity as well as we can gasoline, what would happen to our ability to use electric vehicles, especially in emergencies, once our power grids go down and the vehicles cannot be recharged? As we race to be "eco-friendly," are we considering how to mitigate this up-and-coming vulnerability? No, of course not.
We are also approaching the time when we will truly be able to use artificial intelligence, and possibly become dependent on it. What happens when it is taken over and changed by info-warriors and turned into weapons support?
The use of nanotechnology will continue to be enhanced, and as it is, it can be embedded in our infrastructures to destroy them or injected into our bodies. Also, as we depend more on robotics, from manufacturing to medical devices and even surgeries, what happens when they are taken over by info-warriors?
Looking back at what has been accomplished just in our short lifetimes, imagine the twenty-first-century technology and the cyber security-related implications coming in the future.

Summary

If you are involved in any activity in which technology is used as a tool to help you accomplish your work, you are aware of the tremendous and very rapid advances that are being made in that arena. It is something to behold. We are in the middle of the most rapid technological advances in human history, but this is just the beginning. We are not even close to reaching the potential that technology has to offer, nor its impact on all of us—both good and bad.
It is said that there have been more discoveries in the past 50 years than in the entire history of mankind before that time. We have only to read the newspapers and the trade journals to look at every profession and see what technology is bringing to our world. There are new discoveries in medicine, online and worldwide information systems, the ability to hold teleconferences across the country and around the globe, and hundreds of other examples that we can all think of.
High technology is the mainstay of both our businesses and our government agencies. We can no longer function in business or government without it. Pagers, cellular phones, e-mail, credit cards, teleconferences, smart cards, tablets and notebook computers, networks, and private branch exchanges (PBXs) are all computer based, and all are now common tools for individuals, businesses, and public and government agencies. Information warriors are also relying more and more on computers. As computers become more sophisticated, so do the information warriors. As international networks increase, so does the number of international information warriors.
Networking and embedded systems, those integrated into other devices (e.g., automobiles, microwave ovens, medical equipment), are increasing and drastically changing how we live, work, and play. According to a study financed by the U.S. ARPA and published in the book Computers at Risk:
Computers have become so integrated into the business environment that computer-related risks cannot be separated from normal business risks or those of government and other public agencies.
Increased trust in computers for safety-critical applications (e.g., medical) leads to the increased likelihood that attacks or accidents can cause deaths. (Note: It has already happened.)
Use and abuse of computers are widespread with increased threats of viruses and credit card, PBX, cellular phones, and other frauds.
An unstable international political environment raises concerns about government or terrorist attacks on information and high-technology-dependent nations’ computer and telecommunications systems.
Individual privacy is at risk owing to large, vulnerable databases containing personal information, thus facilitating increases in identity theft and other frauds.
If I want to wreak havoc on a society that, in some cases, has become complacent, I am going to attack your quality of life.

Curt Weldon, R-PA. U.S. House, Armed Services Committee21


21 Speaking at an InfoWar Conference in Washington, D.C., in September 1999.

Personal computers have changed our lives dramatically and there is no end in sight. High technology in general has improved the quality of life for societies and made life a little easier, and yet it makes an information-dependent way of life more at risk than ever before. The use of modems has become commonplace, with all newly purchased microcomputer systems22 coming with an internal modem already installed and ready for global access through the Internet or other networks. Wireless networks are being increasingly used and there are now millions of Wi-Fi “hot spots” to which people can connect their phone, laptop, or tablet wherever they are. Therefore, these devices and the networks that they are using potentially represent some of the most serious and complex crime scenes of the Information Age. This will surely increase as we begin the twenty-first century.
… it is computerized information, not manpower or mass production that … will win wars in a world wired for 500 TV channels. The computerized information exists in cyberspace—the new dimension created by endless reproduction of computer networks, satellites, modems, databases, and the public Internet.23

Neil Munro


23 Neil Munro, “The Pentagon’s New Nightmare: An Electronic Pearl Harbor,” Washington Post, July 16, 1995, p. C3.

High-technology development continues to play a dual role in information-based nation-states. High-technology devices have been turned into tools used to determine the adequacy of cyber defenses, and have been adopted and adapted by global hackers, terrorists, and other miscreants, who now use those tools for probing and attacking systems, especially through the Internet interfaces of corporations and nation-states, as well as the GII and the NIIs of nation-states. These same hacker techniques have been readily adopted and enhanced by the information warriors of nation-states and others.

1 Encarta World English Dictionary, 1999, Microsoft Corporation. All rights reserved. Developed for Microsoft by Bloomsbury Publishing Plc.

3 See P. Freiberger and M. Swaine’s book, Fire in the Valley: The Making of the Personal Computer, Osborne/McGraw, Berkeley, CA, 1984.

4 Schaller, Bob, “The Origin, Nature, and Implications of ‘MOORE’S LAW’: The Benchmark of Progress in Semiconductor Electronics,” September 26, 1996, http://research.microsoft.com/en-us/um/people/gray/moore_law.html.

5 Winfred Phillips, “Chapter 2 - Computers and Intelligence,” The Mind Project, http://www.mind.ilstu.edu/curriculum/extraordinary_future/PhillipsCh2.php?modGUI=247&compGUI=1944&itemGUI=3397.

6 See P. Freiberger and M. Swaine’s book, Fire in the Valley: The Making of the Personal Computer, Osborne/McGraw, Berkeley, CA, 1984, and http://www.swaine.com/wordpress/tag/mike-swaine/ for additional details of computer history.

8 Ibid., p. 11.

9 Software that simplifies the search and display of World Wide Web-supplied information.

10 Internet Guide by Microsoft Personal Computing, http://www.microsoft.com/magazine/guides/internet/history.htm.

11 “Email Statistics Report, 2014–2018,” The Radicati Group, http://www.radicati.com/wp/wp-content/uploads/2014/01/Email-Statistics-Report-2014-2018-Executive-Summary.pdf.

12 Internet live stats, http://www.internetlivestats.com/total-number-of-websites/.

13 The term “adversary” is used more often these days to describe an enemy than the word “enemy” because it seems it is not as harsh a term, although the intent is still to disable or kill them.

15 A number of other tools were reviewed but contained no obvious property or functionality that was considered to be both beneficial and a potential cyber security weapon; that is, they modified the system to exploit vulnerabilities or they were purely malicious and caused a denial of service. These are tools that are “pure” cyber security tools.

16 See the book, Netspionage: The Global Threat to Information, published by Butterworth–Heinemann in September 2000.

17 Excerpt taken from the book, Netspionage: The Global Threat to Information, published by Butterworth–Heinemann in September 2000, and reprinted with permission.

18 Steganography, http://www.webopedia.com/TERM/S/steganography.html.

19 Steganography, http://www.jjtc.com/Steganography/.

22 Microcomputers had been a term used to differentiate them from minicomputers and mainframe computers. The computers’ power and what the manufacturers decided to call them differentiated these systems. However, with the power of today’s microcomputer equaling that of larger systems, the issue is unclear and basically no longer very relevant. What these systems are called, coupled with notebooks, PDAs, workstations, desktops, etc., is not that important because they all basically operate the same way.
