Appendix A: What Is Quantum Computing?

Philip L. Frana, Associate Professor of Interdisciplinary Liberal Studies & Independent Scholars, James Madison University

Quantum computing is a fundamentally different way of processing information and calculating solutions to problems. Quantum computers are capable of operating in an extremely large number of states simultaneously, whereas classical computers can occupy only one state at any given moment. Because they work in this different way, many scientists believe that quantum computers can deliver exponential speedups and solve problems that elude classical computers.

Classical computers ushered in the current Information Age, with all of its revolutionary digital advances: personal computing, Internet communication, smartphones, machine learning, and the knowledge economy generally. Classical computers encode and manipulate data in units known as bits. Today, these traditional general-purpose machines use billions of semiconductor parts known as transistors to switch or amplify electrical signals. A classical bit, like the power switch on your favorite electronic device, can be in one of two states at any given time, either 0 or 1. This is why classical information processing is said to be binary.

How Quantum Computing Works

Quantum computers process information by exploiting the behavior of subatomic particles, for example, electrons, ions, or photons. Quantum computers store information in quantum registers, which are in turn composed of quantum bits, or qubits. These qubits are governed by three distinctive physical phenomena: superposition, entanglement, and interference. Superposition is a nonintuitive property of the subatomic world that permits qubits to exist in multiple states until some external measurement is taken. In the world of the quantum, for example, the state of an electron may be the superposition of the properties “spin up” and “spin down.” A common analogy is Schrödinger's cat, which is both dead and alive until an observer peers inside the box. Qubits in superposition may be in a 0 state or a 1 state, but they may also point in any other direction, which might be thought of in the quantum sense as some complex linear combination of 0 and 1. When the qubit is measured, the in-between “hidden” information collapses and the result is binary. The outcome is probabilistic: the squared magnitude of each amplitude gives the probability of the corresponding outcome, so a state closer to 0 is more likely, though not certain, to resolve to 0, and a state whose amplitudes are exactly balanced resolves to either outcome with equal probability. Upon measurement, the qubit becomes a classical bit.
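To make the amplitude picture concrete, here is a minimal sketch in Python (using only NumPy, with amplitude values invented for the example) of a single qubit and its measurement statistics:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1>, where alpha and beta are
# complex amplitudes satisfying |alpha|^2 + |beta|^2 = 1.
alpha, beta = 0.6, 0.8j            # example amplitudes: 0.36 + 0.64 = 1
state = np.array([alpha, beta])

# The squared magnitude of each amplitude gives that outcome's probability.
p0, p1 = np.abs(state) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")    # P(0) = 0.36, P(1) = 0.64

# Simulated measurement: the superposition collapses to a classical bit.
print("measured:", np.random.choice([0, 1], p=[p0, p1]))
```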

Entanglement is another property of quantum physics, involving particles paired and connected in such a way that they cannot be described independently, sometimes even across great physical distances. Albert Einstein described entanglement as “spooky action at a distance.” Bits in classical computers are independent of one another; a single bit does not exert any influence over any other. This is not true of quantum computers. In quantum computing, qubits can become entangled in such a way that they fall into a shared quantum state. Entangled qubits are no longer independent; manipulating one qubit can affect the probability distribution of the whole system. The number of states also becomes larger. One qubit is capable of holding two states at the same time (0 and 1). Two qubits can hold four states, three qubits give you eight states, four qubits sixteen states, and so forth. Sixty-four qubits yield 18,446,744,073,709,551,616 states, which a personal computer operating at a normal speed would need about 400 years to cycle through. Each time a qubit is added, the number of simultaneous states in a quantum computer doubles, a huge advantage over a classical computer, which can be in only one state at a time. Theoretically, a quantum computer unaffected by decoherence and noise (described in a moment) possesses truly massive processing power; 300 qubits could examine more possibilities than there are atoms in the observable universe.
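The doubling is easy to check directly; the short Python loop below merely restates the arithmetic in the paragraph above:

```python
# An n-qubit register spans 2**n simultaneous basis states.
for n in (1, 2, 3, 4, 64, 300):
    print(f"{n:>3} qubits -> {2**n:,} states")

# 64 qubits yield 18,446,744,073,709,551,616 states, as noted above,
# and 2**300 is roughly 2 x 10**90, more than the estimated 10**80
# atoms in the observable universe.
```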

The final property of quantum mechanics that affects the operation of a quantum computer is interference. Qubits are described mathematically by the wave function, a quantity that captures the state of an isolated quantum system of entangled qubits. When the wave functions of all of the entangled qubits are added together, we have both a description of the state of the quantum computer and an account of interference. A common analogy here is the pattern of ripples on a body of water: sometimes the ripples join to make a bigger wave, and sometimes they come together and cancel into stillness. Constructive interference increases the probability that the quantum computer's answer to a problem will be correct; destructive interference decreases that probability. Quantum algorithms are designed to choreograph this constructive and destructive interference and increase the probability that a qubit system collapses into useful measurement states. When contributions to the amplitude of entangled qubits reinforce one another, the probability of the right solution being recognized when the quantum computer's operator takes a measurement is greatly increased. One of the tricks that can be carried out on a quantum computer is called inversion about the mean. A single pass through the quantum circuit is unlikely to meaningfully increase the wave function value for the right answer, yet too many iterations through the circuit can rotate the state past the winner and actually decrease the probability of finding it. In other words, there is an optimal number of iterations, a mathematical floor and ceiling, that maximizes the probability of identifying the correct item when the measurement is taken.
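Inversion about the mean is simple enough to demonstrate numerically. This sketch (plain NumPy, with a four-state register and a "winning" state chosen arbitrarily) performs a single Grover-style iteration and shows the winner's amplitude being amplified:

```python
import numpy as np

# Four basis states in uniform superposition; index 2 is the "winner".
amps = np.full(4, 0.5)        # each amplitude is 1/sqrt(4) = 0.5

# Oracle step: flip the sign of the winning state's amplitude.
amps[2] *= -1                 # [0.5, 0.5, -0.5, 0.5]

# Diffusion step: invert every amplitude about the mean.
mean = amps.mean()            # 0.25
amps = 2 * mean - amps        # [0.0, 0.0, 1.0, 0.0]

print("amplitudes:", amps, "probabilities:", amps**2)
# With only four states, one iteration drives the winner's probability
# to 1; for larger search spaces the amplification is gradual, and
# iterating past the optimum rotates the state away from the winner again.
```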

Origins of Quantum Computing

The original idea for a quantum computer is ascribed to Soviet mathematician Yuri Manin, who suggested the possibility in the introduction to his 1980 book Computable and Uncomputable. That same year, American physicist Paul Benioff, working at the French Centre de Physique Théorique, produced a paper in which he described a quantum mechanical model of a Turing machine. The very next year, Benioff and American theoretical physicist Richard Feynman delivered separate talks on quantum computing at the first Conference on the Physics of Computation at MIT. In his lecture “Simulating Physics with Computers,” Feynman famously interjected a comment about how simulating a quantum system necessitates the construction of a quantum computer: “Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy.”

Benioff's and Feynman's papers fired the imaginations of scientists in the final decades of the 20th century. British theoretical physicist David Deutsch hoped that such a computer would make it possible to test the “many-worlds interpretation” of quantum physics, in which multiple universes are said to exist across space and time in parallel with our own. In a 1985 paper published in the Proceedings of the Royal Society, Deutsch advanced the idea of a quantum Turing machine (QTM), the first general and fully quantum model of computation. By 1992, Deutsch and Australian mathematician Richard Jozsa had found a computational problem that could be efficiently solved on a universal quantum computer with their Deutsch-Jozsa algorithm but that cannot, it is thought, be solved efficiently on a classical computer. For this work, Deutsch is called the “father of quantum computing.”

Examples of Quantum Speedup: Shor's and Grover's Algorithms

Other researchers began developing their own quantum computer models and algorithms. One of the most famous is Shor's algorithm. In 1994, AT&T Bell Labs applied mathematician Peter Shor unveiled a method for factoring large integers in polynomial time. For our purposes, polynomial-time algorithms can be thought of as “efficiently solvable” or “tractable.” They are, as the NIST Dictionary of Algorithms and Data Structures defines them, “reasonable to compute.”[1] Factorization consists of breaking down a number into smaller numbers that, when multiplied together, return the original number. It is trivially easy to multiply the small numbers (factors) together to produce the original number, and the traditional algorithm for doing so is fast, efficient, and known to every schoolchild. However, finding the factors of a number, and in particular of a very large number, is much more difficult, because the search space of possible factors is so large.
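A few lines of Python illustrate the asymmetry. Multiplying the factors is a single fast operation, while recovering them by the obvious classical method, trial division, means a search that balloons with the size of the number (the primes below are arbitrary examples):

```python
def trial_division(n):
    """Brute-force classical factoring: try each candidate divisor
    up to sqrt(n). Quick for small n, hopeless for the
    hundreds-of-digits numbers used in cryptography."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 15485863, 32452843    # two primes (example values)
n = p * q                    # multiplying them is instantaneous
print(trial_division(n))     # recovering them takes ~15 million trials
```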

Factoring and prime numbers are routinely used to secure communications on classical computers. Prime factorization—breaking down a number into the set of prime numbers that result in the original number when multiplied together—takes a very long time: the running time of the best known classical algorithms grows faster than any polynomial in the size of the original number. In so-called public key cryptography, one person possesses a “public key,” which is the product of two large primes. The public key is used to encrypt the message, and the private key, derived from the two primes, is used to decrypt it. No published classical algorithm can find the prime factors of a public key in polynomial time; polynomial-time algorithms are informally described as “fast.” Factoring a good public key is thus impractical: it can be done, but it takes far too long on a classical computer searching for the original primes.
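A toy version of the RSA scheme (textbook numbers, absurdly small primes, purely for illustration) makes the division of labor visible: anyone can encrypt with the public product, but decryption requires quantities derived from the secret primes:

```python
# Toy RSA with tiny primes -- illustrative only and trivially breakable.
p, q = 61, 53                    # the two secret primes
n = p * q                        # 3233: the public modulus
phi = (p - 1) * (q - 1)          # 3120: computable only if p and q are known
e = 17                           # public exponent, coprime with phi
d = pow(e, -1, phi)              # 2753: private exponent, derived from p and q

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)  # decrypt with the private exponent d
print(ciphertext, recovered)       # 2790 65
```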

Theoretically speaking, Shor's algorithm can break existing encryption systems because the prime factorization of large integers can be achieved in polynomial time, a prospect that excites tremendous interest among cryptographic specialists and anyone who wants to keep a vitally important system like email, an online bank account, or a nuclear weapons facility secure. Peter Shor's “fast” quantum algorithm attacks an exponential-time problem in mathematics, but it is too early to worry about a working quantum computer breaking current advanced encryption schemes. The key size of public-key encryption systems like RSA continues to grow, which means that factoring time also increases. And while it might be possible to break encryption with something on the order of many thousands of ideal qubits, noise and error-correcting codes mean that significantly more physical qubits would be needed. The current generation of universal quantum computers has no more than approximately 100 qubits; an encryption-breaking quantum computer might require a million. Google Quantum AI recently announced at its annual developer conference that it intends to build a million-qubit machine by 2030, so perhaps we are living on borrowed time.

In 1996, another Bell Labs researcher, Lov Grover, presented the path-breaking paper “A Fast Quantum Mechanical Algorithm for Database Search” at the ACM Symposium on the Theory of Computing. This was followed by a more accessible piece in Physical Review Letters called “Quantum Mechanics Helps in Searching for a Needle in a Haystack.” The advantage of Grover's quantum database search algorithm is that it provides a quadratic speedup* for one-way function problems usually tackled by random or brute-force search; such problems could involve searching for an item in an unsorted or unstructured list, optimizing a bus route, or solving a classic Sudoku puzzle. In other words, Grover's algorithm may be applied when a function is true for one input in the entire potential solution space and false for all of the others. Rather than guessing one item at a time, which yields little information about what the right answer might be, Grover's algorithm leverages qubit superposition and interference to adjust the phases of various operations, increase the amplitude of the right item, and iteratively suppress the amplitudes of states that are not solutions. A measurement of the final state of the quantum computation returns the right item with high probability. Grover's algorithm works powerfully on computational problems where it is difficult to find a solution but relatively trivial to verify one.
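The full algorithm simply repeats the oracle-plus-diffusion iteration sketched earlier. A small statevector simulation (plain NumPy; the search-space size and marked item are arbitrary choices) shows the success probability peaking near (π/4)√N iterations and then falling off:

```python
import numpy as np

N, winner = 256, 42                    # search space and marked item (arbitrary)
amps = np.full(N, 1 / np.sqrt(N))      # uniform superposition over N states

optimal = round(np.pi / 4 * np.sqrt(N))    # ~13 iterations for N = 256
for it in range(1, 2 * optimal + 1):
    amps[winner] *= -1                 # oracle: flip the winner's sign
    amps = 2 * amps.mean() - amps      # diffusion: invert about the mean
    if it in (1, optimal, 2 * optimal):
        print(f"iteration {it:2}: P(winner) = {amps[winner]**2:.3f}")
# P(winner) is ~0.03 after one iteration, ~0.99 at the optimum,
# and falls back toward zero if the loop runs well past it.
```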

Policymaking and Partnerships

Excitement over these new discoveries and their potential for revolutionizing information processing gave rise to plans for enhanced information sharing and policymaking and ultimately the prioritization and sequencing of national and international research efforts. The National Institute of Standards and Technology (NIST) and the Department of Defense (DoD) hosted the first U.S. government workshops on quantum computing in the mid-1990s. In 2000, theoretical physicist David DiVincenzo outlined the requirements necessary for constructing a quantum computer. These requirements, known as the DiVincenzo criteria, include well-defined qubits, initialization to a pure state (complete knowledge of the system as opposed to indeterminacy or uncertainty), a universal set of quantum gates, qubit-specific measurement, and long coherence times. In 2002, an expert panel convened by Los Alamos National Laboratory released a Quantum Information Science and Technology Roadmap to capture the challenges involved in quantum computing, provide direction on technical goals, and characterize progress toward those goals across a variety of technologies and approaches. The panel adopted the DiVincenzo criteria to evaluate the viability of various quantum computing approaches.

Evaluations of quantum computing models and approaches soon gave way to physical hardware and useful algorithms. In 1995, Christopher Monroe and David Wineland demonstrated the first quantum logic gate with trapped ions (the controlled-NOT)—an indispensable component for constructing gate-based quantum computers—publishing their results in Physical Review Letters. In 2005, researchers at the University of Michigan created a scalable and mass-producible semiconductor chip ion trap as a potential pathway to scalable quantum computing. In 2009, researchers at Yale University made the first solid-state gate-based quantum processor. Two years later, D-Wave Systems of Burnaby, British Columbia, became the first company to market a commercial quantum computer. D-Wave's machine is not a universal quantum computer; it takes a unique analog approach known as quantum annealing. Annealing processors are special-purpose technology, deployed against problems where the search space is discrete, with many local minima or plateaus, such as combinatorial optimization problems.

Yet, with the introduction of the original D-Wave, it became clear that fundamental advances in quantum hardware and software might yield extraordinary economic rewards and national security dividends. The research involved would be expensive and risky. A number of partnerships were forged in the early 2000s between private-sector companies and government agencies. Early buyers of D-Wave quantum computers included Google in alliance with NASA, Lockheed Martin Corporation in cooperation with the University of Southern California, and the U.S. Department of Energy's Los Alamos National Laboratory.

Google Research, NASA, and the Universities Space Research Association soon agreed that the value of quantum computers in solving intractable problems in computer science, and especially machine learning, was so great that they formally established a Quantum Artificial Intelligence Lab (QuAIL) at NASA's Ames Research Center in Silicon Valley. NASA is interested in using hybrid quantum-classical technologies to attack some of the most difficult machine learning problems, such as generative unsupervised learning. IBM, Intel, and Rigetti are also chasing goals that would demonstrate quantum computational speedups over classical computers and algorithms in a variety of areas (sometimes termed quantum supremacy or quantum advantage). In 2017, University of Toronto assistant professor Peter Wittek founded the Quantum Stream at the Creative Destruction Lab (CDL). Despite Wittek's untimely death in a Himalayan avalanche, Quantum Stream continues to encourage scientists, entrepreneurs, and investors to pursue commercial opportunities in quantum computing and machine learning. Quantum Stream's technology partners include D-Wave Systems, IBM Q, Rigetti Computing, Xanadu, and Zapata Computing. Dozens of other startups and well-established companies are sprinting forward to create their own quantum computing technologies and applications, including the first quantum computing software company, 1QB Information Technologies (1QBit). In November 2021, IBM Quantum announced Eagle, a 127-qubit quantum processor. It is possible, however, that the leader in quantum computing is now the University of Science and Technology of China, which in November 2021 also announced a 66-qubit superconducting quantum processor called Zuchongzhi and an even more powerful photonic quantum computer called Jiuzhang 2.0.

It is hard to know who has achieved primacy, both because verification and benchmarking of quantum computers remain murky processes and because of the inherent diversity in current approaches and models of quantum computers. There is excitement surrounding a variety of models for manipulating a collection of qubits: gate model quantum computing, quantum annealing, adiabatic quantum computing (AQC), and topological quantum computing among them. There is also great diversity in methods for building physical implementations of quantum systems. Companies and research labs internationally are pursuing superconducting quantum computers, linear optical quantum computers, nitrogen-vacancy quantum computers, quantum computing with neutral atoms trapped in optical lattices, and a variety of other designs. More methods, approaches, and implementations may yet be discovered.

The physical implementation is important because quantum computers and qubits are devilishly difficult to control. Information stored in qubits can escape when the qubits become accidentally entangled with the outside environment, the measurement device and controls, or the material of the quantum computer itself. This seepage of quantum information is called decoherence. Qubits also need to be shielded physically from every kind of noise: changing magnetic and electrical fields, radiation from other electronic devices, cosmic rays from space, radiation from warm objects, and other rogue particles and waves. Making and manipulating high-quality qubits in quantum computers will require reducing decoherence and noise, and perhaps also fault-tolerant designs of the sort found in traditional computers. Quantum error correction is a multiply redundant scheme that spreads the information of one qubit across the highly entangled state of several physical qubits. It is not known how many physical qubits will be needed to realize a single logical qubit accessed by a quantum algorithm, but estimates run from 100 to 10,000. Entangling, controlling, and measuring qubits also face a major impediment familiar to generations of designers of classical computers: scalability.
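The flavor of this redundancy can be seen in the classical ancestor of the simplest quantum code, the three-bit repetition code; the quantum bit-flip code adds entangled encoding and syndrome measurements, but the majority-vote logic is the same. A minimal sketch, with an arbitrary physical error rate:

```python
import random

def logical_error_rate(p, trials=100_000):
    """Encode one logical bit as three physical copies, flip each copy
    independently with probability p, and decode by majority vote.
    Returns the estimated rate at which the logical bit is corrupted."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:               # two or more flips defeat the vote
            errors += 1
    return errors / trials

p = 0.05                             # arbitrary per-bit error rate
print(f"physical: {p}, logical: {logical_error_rate(p):.4f}")
# Theory predicts 3p^2 - 2p^3 = 0.00725: redundancy suppresses the error,
# the same principle quantum error-correcting codes exploit at scale.
```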

In 2018, President Donald Trump signed the National Quantum Initiative Act into law. The act is designed to plan, coordinate, and accelerate quantum research and development for economic and national security over a 10-year period. Funded under the National Quantum Initiative Act is the Quantum Economic Development Consortium™ (QED-C™), with NIST and SRI International as lead managers. Fundamental to the passage of the law is a shared recognition that quantum computing promises to contribute solutions to humanity's greatest and most difficult challenges in the areas of agriculture, biology, chemistry, climate and environment, communications, energy, healthcare, and materials science.

Quantum computer science is supported by a number of important online resources. The Quantum Algorithm Zoo, a comprehensive catalog of quantum algorithms, is managed by Stephen Jordan in Microsoft Research's Quantum Systems group. IBM hosts the Quantum Experience, an online interface to the company's superconducting quantum systems and a repository of quantum information processing protocols. Qiskit is an open-source software development kit (SDK) for anyone interested in working with OpenQASM (a programming language for describing universal physical quantum circuits) and IBM Q quantum processors. In 2020, Google AI, in collaboration with the University of Waterloo, the “moonshot factory” X, and Volkswagen, announced TensorFlow Quantum (TFQ), a Python-based open source library and framework for hands-on quantum machine learning.
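As a taste of what these toolkits look like, here is a minimal Qiskit sketch that prepares and measures an entangled Bell pair (Qiskit's API evolves quickly, so the method names reflect common usage and may differ across versions):

```python
from qiskit import QuantumCircuit

# Hadamard followed by CNOT creates the Bell state (|00> + |11>)/sqrt(2).
qc = QuantumCircuit(2, 2)
qc.h(0)                   # put qubit 0 into an equal superposition
qc.cx(0, 1)               # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

print(qc.draw())          # ASCII diagram of the circuit
# Executed on a simulator or on IBM hardware, the measurements return
# only '00' or '11' (up to noise), never '01' or '10'.
```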

Quantum AI/ML

Quantum computing applications have already made headway in machine learning and AI, genomics and drug discovery, the chemical industry, molecular biology, cryptography, transportation and warehouse logistics, Internet communications, and simulation of quantum systems. Quantum simulation in particular could facilitate rapid prototyping of materials and designs, long before construction of parts or assemblies through CNC machining, injection molding, rapid tooling, or 3D printing. Currently, the top quantum computers are capable of simulating only a handful of particles and their interactions. But tantalizing clues are being found that may unravel the low-temperature behavior of exotic materials and superconductivity, help us understand the chemistry and production of environmentally friendly carbon-neutral fertilizers and cements, facilitate the design of next-gen EV batteries and solar panels, and model the complexities of flight mechanics, aerodynamics, and fluid dynamics in the aerospace industry.

Here are some of the more mind-blowing developments: Edward Snowden's 2014 leak of National Security Agency files confirmed the existence of the SIGINT initiatives “Penetrating Hard Targets” and “Owning the Net” to break any form of strong encryption, gain access to high-value secure digital communications networks, and design and attack Quantum Key Distribution (QKD) protocols. For these purposes, the agency planned to develop an $80 million quantum “god machine.” In 2015, Unai Alvarez-Rodriguez of the University of the Basque Country in Spain shared research called “Artificial Life in Quantum Technologies,” which he believes “paves the way for the realization of artificial life and embodied evolution with quantum technologies.” In 2019, researchers at Ulm University in Germany observed evidence of quantum Darwinism in a test of synthetic diamond at room temperature. Quantum Darwinism is a theory that explains how our world of objective, classical physics emerges from the vagaries of the quantum world. Quantum Darwinism asserts that the “quantum-classical transition” is similar to the process of evolutionary natural selection. Physical properties selected from a bouillabaisse of possibilities become concrete because they are the “fittest” survivors. This is why, for instance, separate individuals can measure a quantum system and ultimately reach agreement on their findings. In just the last year, scientists have (a) announced a proof of concept for remote-sensing quantum radar, (b) created an unhackable integrated quantum communication network linking nodes over a total distance of 2,850 miles, and (c) developed a proposal to target and test potential quantum communications sent by extraterrestrials using existing telescope and receiver equipment.

The convergence of quantum computing and artificial intelligence (called quantum AI/ML, or QAI) will dramatically alter information science and technology, economic activity and social paradigms, regulatory frameworks, and political and security arrangements. The Fourth Industrial Revolution of GRIN technologies—genetics, robotics, information, and nanotechnologies—promises to soon give way to what the Japanese call “Society 5.0” and the Dutch term “Smart Humanity.” The next revolutionary shift could produce a post-scarcity golden age where quantum AI for good holds sway and where advances in quantum technologies permit the universal democratization of access to limitless computational possibility. Or it could produce a postapocalyptic hellscape. Perhaps we are already living in that technological dystopia and should attempt to bolt for freedom.

Johannes Otterbach of the quantum computing company Rigetti has remarked that quantum computing and machine learning are inherently probabilistic and thus natural bedfellows. Quantum computers could dramatically increase the speed of training in machine learning. Quantum machine learning will advance all three of the primary subcategories of ML: supervised learning, unsupervised learning, and reinforcement learning. Researchers are searching for quantum machine learning algorithms that demonstrate substantial speedups over classical algorithms and that overcome intractable exponential-time obstacles to problem solving and decision-making in sampling, search, optimization, pattern recognition, predictive and risk analytics, and simulation.

Quantum AI/ML Applications

One powerful example of the intersection of quantum computing and AI is the development of working quantum algorithms for route and traffic optimization. Essentially, these quantum applications compute the quickest route for each individual vehicle in a fleet and optimize it in real time. Toyota Tsusho Corp—working in partnership with Microsoft and the quantum computing firm Jij—has demonstrated the potential of quantum routing algorithms to reduce wait time at red lights by 20 percent. Volkswagen has also successfully tested quantum-computer-enhanced navigation and route optimization applications on the CARRIS public bus fleet in Lisbon, Portugal, using a D-Wave machine. The goal of this test was to reduce traffic congestion and travel times during the Web Summit technology conference. Volkswagen also used D-Wave technology to develop a quantum simulation of the optimal routing of 10,000 taxis moving between the Beijing airport and the central business district 20 miles away. Such real-time quantum applications could also become useful in cross-functional supply-chain management and transportation logistics and in AI-powered autonomous cars and trucks.
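Annealers like D-Wave's require such problems to be cast as a QUBO (quadratic unconstrained binary optimization). The toy sketch below, solved by brute force in plain Python with invented costs, assigns each of two vehicles one of two routes while penalizing congestion; a real deployment would hand the same quadratic form to an annealer's API instead:

```python
import itertools

# x[i] = 1 if vehicle i takes route A, 0 if it takes route B.
# Energy = individual route costs + a congestion penalty when both
# vehicles pick route A. All numbers are invented for illustration.
cost_A, cost_B, penalty = 1.0, 1.5, 2.0

def energy(x):
    linear = sum(cost_A if xi else cost_B for xi in x)
    congestion = penalty * x[0] * x[1]   # quadratic coupling term
    return linear + congestion

best = min(itertools.product([0, 1], repeat=2), key=energy)
print(best, energy(best))   # (0, 1): the vehicles split across the routes
```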

Predictive and risk analytic QAI technology will aid in the forecasting, management, and disruption of hazards such as adverse geopolitical events or terror attacks, stock market crashes and financial panics, utility grid overloads, anthropogenic threats (climate change, habitat destruction, overexploitation of natural resources), social unrest, and future pandemics. Legal studies scholars are already examining the implications of a new field of “quantum jurisprudence” or “quantum AI law.” Some scholars are even pondering the paradoxical implications and casuistry of criminality, where disputes, violations, and breaches of contract simply evaporate in the very process of asking questions about them. Total information awareness and quantum legal simulations and decision-making will make predictive policing and pre-delinquency screening more muscular, and precrime fighting (in the academic rather than the science-fictional sense) more likely. On the other hand, quantum AI could also make destabilizing Cambridge Analytica–style political manipulations or Equifax-like data breaches more quotidian occurrences.

Artificial intelligence algorithms are also helping to decipher the physics of quantum systems. For example, leading-edge quantum sensing technology is used to detect extremely small gravitational variations using solid-state or photonic quantum systems. The technology is expected to advance the state of the art in seismology, geological prospecting, electromagnetic field sensing, global positioning systems, measurement, microscopy, advanced radar, atomic clocks, magnetometers, and ultra-sensitive gravimeters. Quantum sensors could provide precise warnings of seismic events like earthquakes and volcanic eruptions, tsunamis, and silent-running enemy naval submarines.

Medical imaging technologies already involve serious use of computerized expert systems and complex pattern recognition software. There is a classic (manually applied) heuristic approach to melanoma diagnosis called ABCDE (asymmetry, border irregularity, color variation, large diameter, evolving mole). Convolutional neural networks trained on millions of images now apply ABCDE to identify skin lesions, melanomas, rashes, and other abnormalities. Another supervised learning algorithm, CheXNet, outperforms expert radiologists at pneumonia screening and diagnosis. QAI in imaging, or quantum radiomics, promises to take these interpretive efforts to the next level. Quantum artificial neural networks may not sound the death knell of radiology—a demise long predicted but never arrived—but they could make the specialty more cost-effective and efficient by reading in microseconds the exponentially growing numbers of medical scans taken around the world: effectively pre-analyzing images, flagging ambiguous features, and helping humans avoid common errors attributed to boredom, inattention, and fatigue. Quantum radiomics could also attack the complex, real-time optimization problem of weighing and assessing the thousands of variables that contribute to flexible and effective radiotherapy treatment plans for cancer patients.

Quantum Ultra-intelligence

The promise and perils of quantum artificial intelligence are anticipated in an emerging literary subgenre called quantum fiction. Books that feature realistic or fanciful forms of QAI include Factoring Humanity (1998) and Quantum Night (2016) by Robert J. Sawyer, Ghostwritten (1999) by David Mitchell, 2312 (2012) by Kim Stanley Robinson, and Antediluvian (2019) by Wil McCarthy. The Japanese cyberpunk manga series Battle Angel Alita: Last Order (2000–2014) and Kiddy Grade (2002) are populated by several mysterious quantum AIs. Films include Transformers (2007), in which a robot's “signal pattern is learning” using “quantum mechanics,” and Transcendence (2014) with Johnny Depp. The Hulu TV miniseries Devs (2020) depicts a fictional quantum computing company and comments on the many-worlds interpretation of quantum mechanics and the effects of quantum technologies on determinism and free will. In the HBO television series Westworld (2016–present), a quantum artificial intelligence system named Rehoboam engineers and directs real-world society using its copious database. And the universe of the acclaimed military sf video game franchise Mass Effect (2007–present) is populated by many fictional quantum computers that have attained consciousness, in particular the laboring synthetic “geth,” who are in conflict with their extraterrestrial humanoid masters, the “quarians.”

While these examples are all fiction, back here in reality some computer scientists have given their lives and careers over to engineering an artificial general intelligence (AGI) that possesses self-awareness, even to the point where it bootstraps itself to ultra-intelligence and unlocks the Technological Singularity. An emerging ultra-intelligence may enjoy a running start, as it will have instant access to the Penrose-Hameroff theory of quantum consciousness and neuro-inspired computer chips to create the blueprints for its self-designed quantum neural networks. It is unclear whether we will be able to maintain “human-in-the-loop” control over a self-aware QAI or cajole it into a superhero partnership with humanity (so-called collaborative QAI). Several experts responding to a 2021 Pew Research question on ethical AI doubted that a QAI would participate in curbing its own limits through something like a Quantum AI Constitutional Convention or Magna Carta for the Quantum Age.

Human beings are the only creatures on Earth with an almost unlimited capacity to learn, improve, and invent. Humans are very good at adding new complexity to their habitats. Indeed, the objective of so many who work in robotics, automation, and artificial intelligence today is not to restore human habitat to a blissful natural state but rather to create more and more of the fabricated world that we seemingly cannot do without. This impulse shapes the mutual goals of quantum computing and artificial intelligence and will have life-altering consequences. As the MIT physicist and ML specialist Max Tegmark has said, “Everything we love about civilization is a product of intelligence, so amplifying our human intelligence with artificial intelligence has the potential of helping civilization flourish like never before—as long as we manage to keep the technology beneficial.”

Note

  * Quadratic speedups are so-called second-degree polynomial-time speedups: they involve the square of a variable or unknown quantity and no higher power. An algorithm with a quadratic speedup needs only about the square root of the number of steps its classical counterpart requires; searching an unstructured list of a million items, for example, takes on the order of a thousand quantum queries rather than a million classical ones.