Quantum computing

Many leading cloud and technology vendors are researching and developing quantum computers, which operate by manipulating subatomic particles. These vendors are driven by the vision that quantum computing devices could be millions of times faster than today's computers. The first, far more limited versions of quantum computers were being advertised as this book was published.

Quantum computers are fundamentally different from the mainstream binary computers that have existed since the beginning of the electronic computer age. In binary computation, bits can be in one of two states, off or on (represented as 0s or 1s). This basic concept drove the inner workings of the first electronic computers, which relied on vacuum tubes; later computers, which relied on transistors; and, still later, computers relying on integrated circuits.

The theory of quantum computation was defined in the 1980s in works published by Richard Feynman, David Albert, and David Deutsch. The additional computational power stems from the premise that quantum bits (qubits) can exist in multiple states simultaneously through what is called superposition. Early efforts at creating superposition focused on manipulating photons or electrons using technologies such as ion traps, optical traps, quantum dots (a single electron trapped by multiple atoms), semiconductor impurities, and superconducting circuits.

Dealing with qubits in superposition is extremely tricky. If you try to measure a qubit in superposition, its state collapses, and its value will appear to be 0 or 1 (that is, it will appear to be binary). A technique called entanglement can be used to detect the spin of one atom indirectly by placing a second atom next to it; the second atom assumes the opposite spin of the first. Through superposition, two qubits can represent four values, and thus perform four calculations, at once.
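A rough sense of why two qubits cover four values at once can be conveyed by simulating a two-qubit register as a vector of four complex amplitudes. This is only an illustrative classical simulation (the Hadamard gate used here is a standard way to create an equal superposition), not how a physical quantum computer operates:

```python
import numpy as np

# A single qubit is a 2-element complex vector; |0> = [1, 0].
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Two qubits combine via the tensor (Kronecker) product into a
# 4-dimensional state: one amplitude each for 00, 01, 10, and 11.
state = np.kron(H @ zero, H @ zero)

# All four basis states carry equal probability 1/4, which is why two
# qubits in superposition span four values simultaneously.
probabilities = np.abs(state) ** 2
print(probabilities)  # [0.25 0.25 0.25 0.25]
```

Measuring the register would collapse it to just one of the four outcomes, each with probability 0.25.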

Much of the work over the past few years has been focused on increasing the number of qubits that can be manipulated, minimizing errors in detection, maintaining superposition, and moving qubits over small distances. Two early types of quantum computers emerged: adiabatic and quantum gate array computers.

Adiabatic quantum computers use a concept called annealing, usually associated with heating and then cooling metals in metalworking. Qubits can be stressed and then relaxed into patterns, making these computers useful in pattern-matching types of problems. A company named D-Wave produced early quantum computers that relied on QPUs (quantum processing units) built using niobium metal and superconductors that operated at 9.2 kelvin (-263.95 degrees Celsius).
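The stress-then-relax idea has a well-known classical cousin, simulated annealing, which can serve as a loose analogy (this toy sketch, with an arbitrary one-variable energy function, is not D-Wave's actual quantum process):

```python
import math
import random

# Classical simulated annealing on a toy energy landscape: high
# "temperature" lets the system jump freely between states, and gradual
# cooling relaxes it into a low-energy pattern.
def energy(x: int) -> int:
    return (x - 7) ** 2  # lowest-energy state is x = 7

random.seed(0)  # fixed seed so the run is repeatable
state, temp = 0, 10.0
while temp > 0.01:
    candidate = state + random.choice([-1, 1])
    delta = energy(candidate) - energy(state)
    # Always accept downhill moves; accept uphill moves with a
    # probability that shrinks as the system cools.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        state = candidate
    temp *= 0.99  # cool gradually

print(state)  # relaxes to (or very near) the minimum at x = 7
```

Quantum annealing pursues the same goal, finding a low-energy configuration, but uses quantum effects rather than thermal randomness to escape poor local patterns.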

Quantum gate array computers contain registers that hold numbers and support common mathematical functions. They differ from traditional computers in that a register of qubits can hold all possible numbers in superposition at once; when two registers are multiplied together, every possible product is held in the result register.
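The notion of a register holding every value at once can be sketched by classically simulating the state vector of a small qubit register. This is an illustrative simulation only, again using Hadamard gates to build the superposition:

```python
import numpy as np

n = 3  # number of qubits in the register

# Start the register in |000> and apply a Hadamard gate to each qubit in
# turn; the register then holds an equal superposition of all 2**n
# values 0 through 7 simultaneously.
state = np.array([1] + [0] * (2**n - 1), dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
for qubit in range(n):
    # Build the full 8x8 operator: H on this qubit, identity elsewhere.
    full = np.eye(1, dtype=complex)
    for q in range(n):
        full = np.kron(full, H if q == qubit else np.eye(2))
    state = full @ state

# Every value 0..7 is present with equal probability 1/8.
print(np.round(np.abs(state) ** 2, 3))
```

An arithmetic operation applied to such a register acts on all eight values in one step, which is the source of the "every possible product" behavior described above.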

In 2016 and 2017, apparent breakthroughs in the creation of photonic chips and quantum computing were publicized. In a photonic chip, a laser fires light pulses into a micro-ring resonator etched in silica. The resonator emits entangled pairs of photons called qudits (quantum digits that can take more than two states). Two qudits, each with 10 possible states, were entangled, allowing 100 different calculations at once. The qudits were then sent through optical fiber to prove that entanglement could be preserved over distance. So, not only was this development potentially able to produce computers much more powerful than previous quantum-computing efforts, but it also showed promise of being more commercially viable.

Quantum computers are not currently believed to solve all problems faster than conventional computers, given their unique way of operating. However, quantum algorithms do exist for factoring large numbers (Shor's algorithm) and searching unsorted databases (Grover's algorithm). It is believed that quantum computers will eventually break the public key / private key combinations used in today's cryptography in minutes or seconds. However, they will also be useful in generating next-generation quantum keys that cannot be copied without being destroyed. They should also provide much greater performance in machine learning, anomaly detection, and data-sampling problems.
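Grover's algorithm finds a marked item among N unsorted entries in roughly the square root of N steps, versus N checks classically. Its amplitude arithmetic can be simulated classically on a small example; the search-space size and marked index below are arbitrary choices for the sketch:

```python
import numpy as np

# Toy simulation of Grover's search over N = 16 items, with one
# (arbitrarily chosen) marked index.
N, marked = 16, 11
state = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all items

# Roughly (pi/4) * sqrt(N) iterations -- here 3, versus up to 16
# classical checks.
iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state[marked] *= -1               # oracle: flip the marked amplitude
    state = 2 * state.mean() - state  # diffusion: invert about the mean

# The marked item's measurement probability is now close to 1.
print(round(state[marked] ** 2, 3))  # 0.961
```

Each iteration concentrates more of the state's amplitude on the marked entry, which is why a measurement after only a few steps almost certainly returns the right answer.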

DNA logic for faster processing and dense storage

A branch of nanoscale research with the goals of providing faster computing and increased storage capacity involves the manipulation of synthetic DNA molecules. Nanoscale computational circuits can be created using synthetic DNA and organized using a technique called DNA origami. Synthetic DNA is also being tested for use in the encoding and decoding of stored data. A single gram of DNA can store almost a zettabyte (one trillion gigabytes) of data, and it is believed that data stored in this format can survive for thousands of years. Wide availability of DNA-based computing and storage devices is thought to be many years in the future.
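The storage density comes from encoding data in the four DNA bases (A, C, G, T), each of which can represent two bits. A minimal sketch of the idea, assuming a naive two-bits-per-base mapping (real storage schemes add error correction and avoid sequences that are hard to synthesize or read):

```python
# Hypothetical mapping of bit pairs to DNA bases; illustrative only.
ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {base: bits for bits, base in ENCODE.items()}

def to_dna(data: bytes) -> str:
    """Encode bytes as a DNA strand: each base carries two bits."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(strand: str) -> bytes:
    """Decode a DNA strand back into the original bytes."""
    bits = "".join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = to_dna(b"hi")
print(strand)            # CGGACGGC -- 4 bases per byte
print(from_dna(strand))  # b'hi'
```

At two bits per base, a byte needs only four bases, and the molecules themselves are vastly smaller and more durable than magnetic or flash cells, which is where the gram-per-zettabyte estimate comes from.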