CHAPTER 11

The Future

In a previous chapter, I recounted having attended an event in Seattle put on by the United States Chamber of Commerce. It was jointly presented by the Institute for Legal Reform and the Center for Emerging Technologies.

One of the speakers said something like “Big Data is the new oil.” I made a connection then with AI in the form of deep learning, and I’m repeating it here:

“If Big Data is the new oil, deep learning is the new refinery.”

Here’s my overview of the ingredients of the New Refinery: Central Processing Units (CPUs) + Graphics Processing Units (GPUs) + Deep Learning algorithms (DL) + Applications revealing useful insights (A); that is, CPUs + GPUs + DL + A.

I also see two other revolutions on the horizon and would like to give you an “early warning” about them.

On the hardware side, I think we must take note of the efforts to develop a fundamentally new kind of machine: the so-called Quantum Computer.

Quantum Computing (QC) involves a computer that operates in an entirely different way from the computers with which we’re familiar. In the world we experience as humans, the physics is a classical approximation of Nature, which is quantum mechanical.

The computers we use today are digital and binary, meaning that they operate in two specific states of either one (1) or zero (0), as if there were only two choices, e.g., “true” or “false” or “yes” or “no.” The data encoded in this way are called “bits.”

QC uses a quantum processing unit (QPU) and units of information called “quantum bits,” or “qubits” for short, which can exist in a “superposition” of states. I won’t delve into the development of the QPU except to alert you to the fact that the idea was first proposed in 1980 and has an associated timeline.1
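To make the distinction concrete, here is a minimal, purely illustrative sketch in Python; the use of numpy and the 50/50 superposition are my own assumptions for illustration, not anything drawn from the sources cited. A classical bit holds exactly one definite value, while a qubit is described by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

# A classical bit is always in exactly one of two definite states: 0 or 1.
classical_bit = 0

# A qubit's state is a 2-component complex vector: amplitudes for |0> and |1>.
# Here it is placed in an equal "superposition" of |0> and |1>.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement collapses the superposition; each outcome occurs with
# probability equal to the squared magnitude of its amplitude (here 50/50).
probabilities = np.abs(qubit) ** 2
outcome = np.random.choice([0, 1], p=probabilities)
print(probabilities, outcome)
```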

In 1981, the late Caltech professor Richard Feynman is reported to have said: “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”2

Quantum Computers have been under consideration for nearly 40 years, but the pace of innovation has recently accelerated. In March 2017, IBM announced a QC system with an open Application Programming Interface (API) called (not surprisingly) IBM Q. In less than two years, on January 8, 2019, IBM announced IBM Q System One as the first integrated quantum system for commercial use.3

In December 2017, Microsoft announced a preview version of a developer kit with a programming language called Q#.4 This language is for writing programs that run on an emulated quantum computer. At its Build conference in May 2019, Microsoft announced it would, during the summer of 2019, open-source parts of its Quantum Developer Kit on GitHub, including the Q# compiler and quantum simulators.5

Google’s in the hunt, too. In March 2018, Google’s Quantum AI Lab announced a 72-qubit processor called Bristlecone.6 And on July 19, 2018, Google announced an open-source framework called Cirq and plans for a Bristlecone cloud.7 More recently, on February 21, 2019, Google announced a cryogenic controller that uses only two milliwatts of power.8
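To give a flavor of what programming against such a framework looks like, here is a minimal sketch using Cirq’s built-in simulator; the two-qubit circuit, gate choices, and repetition count are my own illustrative assumptions, not an example taken from Google’s announcement.

```python
import cirq

# Two qubits on a line; put the first into superposition, then entangle them.
q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(q0),                     # Hadamard gate: superposition of 0 and 1
    cirq.CNOT(q0, q1),              # entangle the second qubit with the first
    cirq.measure(q0, q1, key="m"),  # read both qubits out as classical bits
)

# Run the circuit on Cirq's classical simulator and tally the sampled results.
result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key="m"))
```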

But with respect to hardware, has anyone seen the light? Yes. Three startups (Luminous, Lightelligence, and Lightmatter) are developing computing chips that, instead of using electrons, are powered by light.9 Such devices are known as optical processors.

Why are startups working on them? Simple: optical processors using lasers and “waveguides” may be a faster, better way for computers to carry out the ever-increasing number of mathematical calculations that some (but not all) AI applications demand.

And, speaking of demand, recent AI advances are requiring, and getting, compute power whose use has exhibited a doubling time of just under four months.10
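To see what a doubling time like that implies, here is a back-of-the-envelope calculation; the 3.5-month figure below is my own round number chosen only for illustration.

```python
# Compute demand growing with a fixed doubling period (illustrative numbers).
doubling_months = 3.5  # assumed: "just under four months"

growth_per_year = 2 ** (12 / doubling_months)         # about 11x per year
growth_per_five_years = 2 ** (60 / doubling_months)   # roughly 145,000x

print(f"~{growth_per_year:.0f}x more compute per year")
print(f"~{growth_per_five_years:,.0f}x over five years")
```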

So the next question is whether the software that would run on a quantum computer is being developed. And the answer is yes: it, too, is beginning to appear.

In January of 2018, a paper by Dernbach, Mohseni-Kabir, Towsley, and Pal, called Quantum Walk Inspired Neural Networks for Graph-Structured Data, was published. We now have yet another abbreviation: QWNNs.11

The first author is Stefan Dernbach. At that time, he was a PhD student in the Computer Networks Research Group at the University of Massachusetts College of Information and Computer Sciences. There are three co-authors: Arman Mohseni-Kabir, Don Towsley, and Siddarth Pal. Towsley was Dernbach’s PhD advisor. Mohseni-Kabir was a graduate student in the physics department at UMass Amherst. Pal was a scientist with Raytheon BBN Technologies.

The abstract reads in part:

We propose quantum walk neural networks (QWNN), a new graph neural network architecture based on quantum random walks, the quantum parallel to classical random walks. A QWNN learns a quantum walk on a graph to construct a diffusion operator which can be applied to a signal on a graph. We demonstrate the use of the network for prediction tasks for graph structured signals.

Note the phrase, “prediction tasks.” That’s what’s so promising. Like it or not, we want, and need, AI to help us do in the future what we humans cannot now do. I think we’ll be hearing a lot more about Quantum Computers and some variation of QWNN.
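The quantum walk itself is beyond a short snippet, but the classical idea it parallels, per the abstract above, can be sketched: build a random-walk transition matrix from a graph’s adjacency matrix and use it as a diffusion operator on a signal defined over the nodes. This is only the classical analogue, on a made-up four-node graph of my own, not the QWNN architecture from the paper.

```python
import numpy as np

# A small, made-up undirected graph with four nodes (adjacency matrix).
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Classical random-walk transition matrix: row i gives the probabilities of
# stepping from node i to each of its neighbors.
P = A / A.sum(axis=1, keepdims=True)

# A "signal on the graph": one value per node, initially concentrated on node 0.
x = np.array([1.0, 0.0, 0.0, 0.0])

# Applying the diffusion operator spreads the signal along the graph's edges;
# each application corresponds to one more step of the walk.
for step in range(3):
    x = P.T @ x
    print(f"after step {step + 1}: {np.round(x, 3)}")
```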

Notes

1 Wikipedia (2019).

2 Gil (2016).

3 System One IBM Q (2019).

4 Microsoft (2019).

5 Nguyen (2019).

6 Whitwam (2018).

7 Nott (2018).

8 Wiggers (2019).

9 Giles (2019).

10 Amodei and Hernandez (2018).

11 Dernbach, Mohseni-Kabir, Pal, Towsley, and Gepner (2018).
