Glossary

Artificial intelligence (AI): 
See Appendix B for definition and discussion.
Bias: 
A problem that occurs when an algorithm produces prejudiced results because of faulty assumptions in the machine learning process.
Complex systems: 
A complex system is a system composed of many interacting components and subcomponents whose behavior is intrinsically difficult to model because of the dependencies, competition, relationships, or sheer number of interactions among its parts or between the system and its surroundings. Examples of complex systems include the human brain, biological organisms, the global climate, infrastructure such as the power grid and transportation or communication networks, complex software and electronic systems, and social and economic organizations.
Complexity theory: 
Complexity theory is the study of complex systems. While it is a relatively new field of study, it covers a wide range of disciplines in the physical, biological, and social sciences.
Decoherence: 
Quantum decoherence is the loss of quantum coherence, typically through a system's interaction with its environment. As long as a definite phase relation exists between different states, the system is said to be coherent.
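As an illustration, here is a minimal numerical sketch, assuming a simple pure-dephasing model with a hypothetical time constant T2: the off-diagonal density-matrix entries, which encode the phase relation, decay toward zero while the state populations stay fixed.

```python
import numpy as np

# Density matrix of the superposition state |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

# Pure dephasing shrinks the off-diagonal ("coherence") terms by a
# factor exp(-t/T2) while leaving the diagonal populations untouched.
def dephase(rho, t, T2=1.0):
    decay = np.exp(-t / T2)
    out = rho.copy()
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

for t in [0.0, 1.0, 5.0]:
    print(f"t={t}: off-diagonal = {dephase(rho, t)[0, 1]:.3f}")  # 0.500 -> 0.184 -> 0.003
```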
Edge case: 
In software engineering, an edge case is a problem or situation triggered by a parameter at or beyond the bounds the system was designed to handle. In other words, edge cases occur only at extreme operating parameters.
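A small illustration, using a hypothetical averaging helper: the routine works for ordinary inputs but fails at the extreme lower bound of its input size.

```python
def average(values):
    """Mean of a list of numbers; works for typical inputs."""
    return sum(values) / len(values)

print(average([2, 4, 6]))  # 4.0 -- the common case works fine
# average([])  # edge case: an empty list divides by zero and raises
```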
Encryption: 
Encryption uses an algorithm to transform data so that the information is locked and unreadable. A password or “key” unlocks the data, reversing the transformation to make the original information readable again.
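As a sketch of the lock-and-unlock idea, here is a toy XOR cipher; it is for illustration only and is not a secure encryption scheme.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key; applying the same key
    # again reverses the transformation and recovers the plaintext.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = xor_cipher(b"meet at noon", b"k3y")
print(secret)                      # unreadable ciphertext
print(xor_cipher(secret, b"k3y"))  # b'meet at noon'
```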
Entanglement: 
This is a phenomenon that occurs when a group of particles are generated, interact, or share spatial proximity in a way such that the quantum state of each particle of the group cannot be described independently of the state of the others, even when the particles are separated by a large distance.
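A minimal numerical sketch: the Bell state (|00⟩ + |11⟩)/√2 written as a four-entry statevector. Its measurement outcomes are perfectly correlated, and no product of two single-qubit states reproduces it.

```python
import numpy as np

# Two-qubit Bell state (|00> + |11>)/sqrt(2) as a statevector over
# the basis ordered |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Measurement probabilities: only |00> and |11> ever occur, so the
# two qubits' outcomes are perfectly correlated.
print(np.abs(bell) ** 2)  # [0.5, 0, 0, 0.5]

# No product state (a|0>+b|1>) x (c|0>+d|1>) gives these amplitudes
# (a*c, a*d, b*c, b*d): a*d = 0 and b*c = 0 would force a*c or b*d
# to vanish as well, so the state cannot be described qubit by qubit.
```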
Error rate: 
Current quantum computers typically have error rates near one in a thousand (10⁻³), but many practical applications call for error rates as low as one in a quadrillion (10⁻¹⁵). See also fault tolerance.
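A back-of-the-envelope sketch of why such low error rates matter, assuming independent gate errors and a hypothetical million-gate algorithm:

```python
# If each gate fails independently with probability p, a circuit of
# n gates runs error-free with probability (1 - p) ** n.
n = 1_000_000  # a hypothetical million-gate algorithm
for p in (1e-3, 1e-15):
    print(f"p={p:g}: success probability ~ {(1 - p) ** n:.3f}")
# p=0.001: success probability ~ 0.000
# p=1e-15: success probability ~ 1.000
```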
Fault tolerance: 
The nature of quantum computers means that they will not be able to perform gate operations perfectly; some error is unavoidable. The fault tolerance of a quantum computer reflects its ability to protect quantum information from such errors (due to decoherence and other quantum noise). However, although Noisy Intermediate-Scale Quantum (NISQ) computers are realizable in the near term, fully fault-tolerant quantum computing is not likely to happen for some time because of the large number of physical qubits needed.
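As a classical analogue of how redundancy protects information, here is a sketch of a three-bit repetition code with majority-vote decoding. Real quantum error correction is far more involved, but the principle of suppressing errors through redundancy is the same.

```python
import random

def encode(bit):
    return [bit] * 3  # repetition code: store three copies

def noisy(bits, p=0.05):
    return [b ^ (random.random() < p) for b in bits]  # flip each copy w.p. p

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote corrects any single flip

random.seed(0)
trials = 100_000
errors = sum(decode(noisy(encode(0))) != 0 for _ in range(trials))
print(errors / trials)  # ~0.007, well below the raw 5% flip rate
```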
Game theory: 
This is the study of strategies: it examines which choice is best from among multiple (or even infinitely many) options, across one or more interactions involving one or more players.
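A minimal sketch with illustrative payoff numbers: computing a player's best response in a prisoner's-dilemma-style game.

```python
# Payoffs to the row player (illustrative numbers); the index 0 means
# cooperate and 1 means defect, for both players.
payoff = [[3, 0],
          [5, 1]]

def best_response(opponent_choice):
    # Pick the choice that maximizes our payoff against a fixed opponent.
    return max((0, 1), key=lambda me: payoff[me][opponent_choice])

print(best_response(0), best_response(1))  # 1 1 -> defecting dominates here
```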
Grover's algorithm: 
Grover's algorithm is a quantum algorithm used for searching an unsorted database. It was invented by Lov Grover in 1996.
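Because it needs only about (π/4)·√N oracle queries, the algorithm offers a quadratic speedup over the roughly N/2 lookups a classical search needs on average. A small statevector simulation, assuming a toy database of N = 16 items with one marked entry:

```python
import numpy as np

# Statevector simulation of Grover search over N = 16 items, one of
# which (index 11) is marked. About pi/4 * sqrt(N) iterations suffice.
N, marked = 16, 11
state = np.ones(N) / np.sqrt(N)  # uniform superposition over all items

for _ in range(int(np.pi / 4 * np.sqrt(N))):  # 3 iterations
    state[marked] *= -1               # oracle: flip the marked amplitude
    state = 2 * state.mean() - state  # diffusion: reflect about the mean

print(np.argmax(np.abs(state) ** 2))  # 11
print(np.abs(state[marked]) ** 2)     # ~0.96 success probability
```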
Hilbert space: 
In mathematics, a Hilbert space generalizes the methods of linear algebra and calculus from two- and three-dimensional Euclidean space to spaces that may have infinitely many dimensions.
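A finite-dimensional sketch: Cⁿ with the complex inner product is itself a Hilbert space (the states of n qubits live in one of dimension 2ⁿ), carrying the same dot-product geometry that the definition generalizes.

```python
import numpy as np

# C^n with the inner product <u, v> = sum(conj(u_i) * v_i) is a
# finite-dimensional Hilbert space; lengths and angles work as in R^3.
u = np.array([1 + 1j, 0.5])
v = np.array([2, 1j])

inner = np.vdot(u, v)                # conjugates the first argument
norm = np.sqrt(np.vdot(u, u).real)   # length induced by the inner product
print(inner, norm)                   # (2-1.5j) 1.5
```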
Indeterminacy: 
A principle in quantum mechanics (also known as the Heisenberg uncertainty principle) stating that it is impossible to measure both the position and the momentum of very small particles exactly at the same time; the more precisely one is known, the less precisely the other can be determined.
Machine learning: 
See Appendix B for definition and discussion.
Monte Carlo simulation: 
Monte Carlo simulations are algorithms that use repeated random sampling to obtain numerical results. The main concept is to use randomness to solve problems that might be deterministic in principle.
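A classic sketch: estimating the (entirely deterministic) constant π by random sampling.

```python
import random

# Estimate pi by sampling random points in the unit square and counting
# the fraction that land inside the quarter circle of radius 1.
random.seed(42)
n = 1_000_000
inside = sum(random.random() ** 2 + random.random() ** 2 <= 1
             for _ in range(n))
print(4 * inside / n)  # ~3.14, converging to pi as n grows
```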
Moore's law: 
This is a technology trend first observed by Gordon Moore, who noticed that the number of transistors on an integrated circuit, and with it computing power, roughly doubles every two years.
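The arithmetic of the trend, as a quick sketch: doubling every two years means growth by a factor of 2^(y/2) after y years.

```python
# Doubling every two years compounds quickly.
for years in (2, 10, 20):
    print(years, "years ->", 2 ** (years / 2), "x")  # 2x, 32x, 1024x
```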
Neural network: 
A neural network (also known as an artificial neural network [ANN] or simulated neural network [SNN]) is a series of algorithms that endeavors to recognize underlying relationships in a dataset through a process loosely modeled on how the human brain functions.
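A minimal sketch of the mechanics, with random placeholder weights (training, not shown here, would adjust them so the outputs capture relationships in a dataset): a forward pass through one hidden layer.

```python
import numpy as np

# Tiny fully connected network: 3 inputs -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # hidden-layer weights
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)  # output-layer weights

def forward(x):
    h = np.maximum(0, W1 @ x + b1)  # ReLU activation, loosely "neuron firing"
    return W2 @ h + b2

print(forward(np.array([0.2, -0.1, 0.7])))
```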
Noise: 
Quantum noise refers to fluctuations in a signal that arise from quantum fluctuations rather than from classical sources.
Noisy intermediate-scale quantum (NISQ): 
A term coined by John Preskill in 2018 for quantum processors containing roughly 50 to a few hundred qubits. These processors are not sophisticated enough to achieve robust fault tolerance.
Quantum advantage: 
Quantum advantage refers to solving a real-world problem faster on a quantum computer than on a classical computer. This is sometimes also called quantum supremacy.
Quantum bit: 
A quantum bit is the basic unit of information in quantum computing, the quantum equivalent of a classical binary bit. Just like a classical bit, a quantum bit has two basis states: 0 and 1. Unlike a classical bit, a quantum bit can also exist in superposition states, be subjected to incompatible measurements, and even be entangled with other quantum bits. Being able to use superposition, quantum interference, and entanglement makes qubits very different from, and much more powerful than, classical bits. There are several kinds of qubits, including spin qubits, trapped atoms and ions, photons, and superconducting circuits. A machine's physical qubit count is the number of hardware qubits it contains; logical qubits are groups of physical qubits operated on together as a single, error-protected qubit in processor operations.
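A minimal sketch of the statevector picture: a qubit is a pair of complex amplitudes whose squared magnitudes sum to one and give the measurement probabilities.

```python
import numpy as np

# A qubit statevector: complex amplitudes for |0> and |1>; the squared
# magnitudes must sum to 1 and give the measurement probabilities.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)  # an equal superposition
qubit = np.array([alpha, beta])

probs = np.abs(qubit) ** 2
print(probs.round(3), probs.sum().round(3))  # [0.5 0.5] 1.0
```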
Quantum computer: 
See Appendix A for definition and discussion.
Quantum interference: 
Quantum interference is the phenomenon whereby a particle not only can be in more than one place at the same time (through superposition) but a single particle, such as a photon (a particle of light), can cross its own trajectory and interfere with the direction of its own path. In other words, the wave function interferes with itself.
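A minimal numerical sketch: applying the Hadamard gate to |0⟩ creates an equal superposition, and applying it again returns |0⟩, because the two paths into |1⟩ carry opposite amplitudes and cancel.

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard gate

zero = np.array([1, 0])  # |0>
once = H @ zero          # equal superposition: [0.707, 0.707]
twice = H @ once         # back to |0>: the |1> amplitudes cancel

print(once, twice)  # destructive interference removes the |1> component
```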
Quantum supremacy: 
See quantum advantage.
Qubit: 
See quantum bit.
Shor's algorithm: 
In 1994, Peter Shor proposed a polynomial-time quantum algorithm for integer factorization, a problem with important real-world applications, notably in cryptography. Shor's algorithm was the first nontrivial quantum algorithm showing a potential of “exponential” speedup over classical algorithms.
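A sketch of the classical outline of the reduction, for the toy case N = 15: factoring reduces to finding the period r of aˣ mod N, which is the step a quantum computer performs quickly and which is simply brute-forced here; greatest common divisors then reveal the factors.

```python
from math import gcd

# Classical outline of Shor's reduction for the toy case N = 15, a = 7:
# find the period r of f(x) = a**x mod N (brute force stands in for the
# quantum period-finding step).
N, a = 15, 7
r = next(x for x in range(1, N) if pow(a, x, N) == 1)  # r = 4

# With an even period, gcd(a**(r//2) +/- 1, N) yields nontrivial factors.
print(r, gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))  # 4 3 5
```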
Superposition: 
Superposition is the ability of a quantum system to be in several states at the same time until it is observed or measured.
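A minimal sketch of what measurement does to a superposition, simulated classically: each measurement yields a single definite outcome with the corresponding probability.

```python
import numpy as np

# Simulated measurements of (|0> + |1>)/sqrt(2): before measurement the
# qubit holds both amplitudes; each measurement collapses it to a single
# definite outcome, 0 or 1, with the stated probabilities.
rng = np.random.default_rng(7)
probs = np.abs(np.array([1, 1]) / np.sqrt(2)) ** 2
samples = rng.choice([0, 1], size=1000, p=probs)
print(np.bincount(samples))  # roughly [500 500]
```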
Turing test: 
The Turing test was originally conceived by Alan Turing in 1950. The test evaluates a machine's ability to exhibit intelligent behavior indistinguishable from that of a real person. If an evaluator cannot tell the difference between the machine and a real person, the machine is said to have passed the Turing test.