If you think you understand quantum mechanics, you don’t understand quantum mechanics.

Richard P. Feynman, jointly awarded the Nobel Prize in Physics 1965 for “fundamental work in quantum electrodynamics, with deep-ploughing consequences for the physics of elementary particles”

Chapter 1
Hello Quantum

Quantum computers inhabit the microscopic world of neutrinos and mesons and muons and electrons buzzing about protons and neutrons—a smorgasbord of subatomic particles bringing with them a cornucopia of strange concepts. From the tiniest physical components of computing to the way we design algorithms, quantum computing introduces a new paradigm for programming computers for mainstream applications that demand heavy number crunching, such as:

  • Optimizing scanning of magnetic resonance images (MRI) in radiology.[3]
  • Understanding complex molecular structures for building life-saving drugs.[4]
  • Hyper-large-scale logistics and transportation-routing problems.[5]
  • Helping auto companies[6] build better batteries, route autonomous vehicles (self-driving cars), and optimize assembly lines.

The promise that quantum computing is blazing a new way to solve super-hard problems is palpable. In 2017 and 2018, at least $450 million[7] was poured into quantum technology companies, more than four times the amount of the previous two years. Google announced[8] the results of a quantum program that blasted through calculations to produce certifiable random numbers in three minutes and twenty seconds, a task which they estimate would have taken 100,000 classical computers running the fastest known algorithms roughly 10,000 years to complete. Google expects that this capability can be used in optimization, machine learning, and the design of new materials, among other applications, and it plans to demonstrate cryptographic protocols next.[9] Although the computation itself may have no immediate utility, the true significance of this milestone is that it proves quantum effects can be controlled and programmatically harnessed at sufficient scale to perform computations. (You’ll understand how Google made these computations in How Did Google Show Superiority of Quantum Computers over Classical Computers?.)

Governments around the world, too, are catching on to the power of quantum computers:

  • In December 2018, the United States Congress unanimously passed the National Quantum Initiative Act,[10] which has been signed into law. This law is a ten-year commitment by the United States to “accelerate the development of quantum information science and technology applications” through partnerships with universities, startups, and corporations. Further, the United States (and China) considers quantum computing a national security priority.

  • China is investing $400 million to build the world’s largest quantum research center, the National Laboratory for Quantum Information Science, which they claim will have the calculating power of “one million times all existing computers around the world combined.”[11]

  • India[12] is pouring $1.12 billion over five years into quantum technologies.

  • The European Union, Australia, Japan, Switzerland, and several others[13] are investing in quantum computing.

  • Russia[14] will plow $790 million over the next five years into “basic and applied quantum research.”

As this technology, which The New York Times calls one of the “jazziest and most mysterious concepts in modern science,”[15] ramps up, it’s a good time to clock in.

But quantum computing can be hard to learn. Even a program with a few lines can stymie experienced programmers. For example, look at these lines:

 u3(pi/3,pi/2,-pi/2) q[0];
 u3(pi/3,pi/4,-pi/2) q[0];

These statements look familiar, if somewhat puzzling: they resemble ordinary function calls, yet they trigger quantum effects in the program. The challenge with quantum programming stems not from mastering a new syntax but from grasping a set of concepts entirely different from those of classical computing.
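To get a feel for what those two lines actually do, here is a minimal sketch in Python that models the u3(θ, φ, λ) gate as a 2×2 unitary matrix (following the standard OpenQASM 2.0 convention) and applies the two gates from the listing to a qubit starting in state |0⟩. The function and variable names are mine, not the book's:

```python
import numpy as np

def u3(theta, phi, lam):
    """2x2 unitary matrix for the OpenQASM 2.0 u3(theta, phi, lambda) gate."""
    return np.array([
        [np.cos(theta / 2),                   -np.exp(1j * lam) * np.sin(theta / 2)],
        [np.exp(1j * phi) * np.sin(theta / 2), np.exp(1j * (phi + lam)) * np.cos(theta / 2)],
    ])

# Qubit q[0] starts in state |0>, represented as the vector (1, 0).
state = np.array([1, 0], dtype=complex)

# Apply the two gates from the listing, in program order.
state = u3(np.pi / 3, np.pi / 2, -np.pi / 2) @ state
state = u3(np.pi / 3, np.pi / 4, -np.pi / 2) @ state

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # approximately [0.25, 0.75]
```

Working through the arithmetic shows that these two innocuous-looking statements rotate the qubit into a superposition in which a measurement yields 0 about a quarter of the time and 1 about three-quarters of the time, a state no sequence of classical bit operations can express.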

So it’s fair to ask, Why are quantum phenomena important? How do they help solve hard computational problems? And, is it worth spending time and effort to learn this technology today?

In this book, I intend to show you how and why quantum mechanics offers a promising alternative to classical computing for hard problems and that it’s not as hard to work with as you may think. To convince you that the technology is already upon us, you’ll be learning quantum computing by running your programs on a real quantum computer.

But, to master quantum computing, you’ll need to come up to speed with new ways of thinking about traditional computational tasks. So, in this chapter, we’ll review a scheduling problem that’s typical of the kinds of problems where quantum computing can make a big difference. We’ll work with a variant whose solution is easy to verify yet allows you to see real quantum phenomena in action. You’ll also get a chance to try your hand at an actual quantum program.

In the rest of the chapters, I’ll introduce you to quantum phenomena useful for computing and show you how to trigger them in your programs. Each chapter builds on ideas described in the earlier ones. Learned in isolation, individual quantum effects can feel like piecemeal techniques that leave you wondering about their utility. So as you learn new quantum concepts, we’ll use the scheduling problem to tie these topics together in a meaningful way.
