THE INTERNET HAS developed through several periods: The first (early and mid-1990s) was the creation of the website, when the Internet was a platform for the presentation of content. As foreshadowed in The Digital Economy, by the late 1990s it had become a transactional platform for e-commerce. Then came the dot-com crash of 2000–2002 and “the dark period,” when business executives, investors, and the media (and readers of books) wanted nothing to do with the Internet. All this changed around 2005 with the rise of the mobile Internet, brought to us first by companies such as Qualcomm and BlackBerry and then by Apple and later Samsung, as hundreds of millions of new people joined the Web. Toward the end of the decade, social media took the world by storm, as warm-up acts like MySpace were eclipsed by Facebook, Twitter, and other platforms.
Today there are six big themes of immediate relevance to most organizations:
1. The Social Web. These are websites and software that enable people to connect and interact. Facebook alone is approaching 2 billion members. The flipside of exploding social media is the exploding loss of privacy, as people disclose intimate details of their lives for the entire world to see.
2. The Mobile Web. The time spent on mobile devices such as smartphones and tablets continues to grow. The number of Internet and smartphone users is growing less quickly than in previous years, but in 2014 mobile data traffic still exploded by 81%, with growth accelerating, driven in part by video. Laptop activities are moving to tablets, smartphones, and other mobile devices like watches.
3. Geospatiality. On the old Web we surfed websites. On the new Web, we surf physical reality, with sound, video, graphics, or GPS data overlaid on what we see through technology such as Google Glass. This augmented reality usually happens in real time, with information about an object or place appearing on the screen while the user is there.
4. The Internet of Things. Soon mobile computing devices, broadband access, wireless networks, and computing power embedded in everyday objects such as toasters, lightbulbs, bicycles, and factory tools will converge into a vast global network that will fuel exponential change in business model innovation.
5. The Cloud. The old Web was a platform for the presentation of content, with a focus on websites, eyeballs, clicks, stickiness, and the like. Think of today’s Internet as a computational platform, a global computer. Every time you do something—upload a photo, remix some music, or comment in a discussion—you are programming this computer. Human activity is building a machine and creating value on this massive platform of computing power and data storage. Companies can move their internal information technology (IT) capabilities onto the Cloud and, in doing so, move from fixed to variable (and much lower) costs, gain better integration and instant access to more capability and, best of all, make the world their software development department. Rather than selling software, IT companies are changing to offer Software as a Service (SaaS), so users can log into a service or program remotely without having to install an application on their own computer. Amazon, Google, and Oracle Cloud are examples.
6. Big Data and Analytics. The massive amounts of data that once overwhelmed traditional data-processing applications have become a new asset. This data is produced by almost every facet of society, including billions of devices and sensors, hugely popular social media, and business processes such as retail and wholesale sales. An example would be Walmart analyzing all items purchased during a month to determine the buying patterns of its shoppers. The real-time pricing of airplane tickets today (to achieve full flights) will become the real-time pricing of bananas tomorrow, as computers ensure that demand and supply are perfectly tuned. A company will blend its proprietary data with public data from the Internet and social media. Huge data sets are also common in fields such as meteorology and genomics. Big Data is so important, and changing so fast, that it’s worth a more in-depth review.
The practice of analytics is rapidly evolving. It used to be that select analysts manipulated opaque spreadsheets based on data whose accuracy and timeliness were suspect. Analytics has now become a broader ecosystem, in which all team members (including participants from beyond the organization’s walls) actively participate in collecting and acting upon information.
Today’s volatile business environment rewards those who can innovate most quickly, and who understand and respond to changing consumer sentiment. The staggering amount of data being created is a daunting challenge. Research suggests that 2.7 zettabytes (the amount of data in 50 million Libraries of Congress) of corporate information was created in 2012, an increase of more than 48% from 2011. One thousand zettabytes is a yottabyte, the estimated amount of data U.S. government agencies have gathered on people inside and outside the country. This is the equivalent of 250 trillion DVDs. Dramatic reductions in storage costs mean that, for the first time, deciding what data to keep and what to erase costs more than simply storing it all.
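These figures are easier to grasp with a quick back-of-the-envelope check. The sketch below assumes decimal units (1 ZB = 10^21 bytes) and a 4.7 GB single-layer DVD; it recomputes the 2011 baseline implied by 48% growth, and the DVDs-per-yottabyte figure:

```python
# Back-of-the-envelope check on the storage figures above.
# Assumptions: decimal units and a 4.7 GB single-layer DVD.
ZB = 10 ** 21            # zettabyte, in bytes
YB = 1000 * ZB           # yottabyte
DVD_BYTES = 4.7 * 10 ** 9

# 2.7 ZB in 2012 after 48% growth implies roughly 1.8 ZB in 2011.
corporate_2011 = 2.7 * ZB / 1.48

# A yottabyte holds on the order of the "250 trillion DVDs" cited.
dvds_per_yottabyte = YB / DVD_BYTES

print(f"{corporate_2011 / ZB:.1f} ZB")   # 1.8 ZB
print(f"{dvds_per_yottabyte:.2e} DVDs")  # 2.13e+14, roughly 213 trillion
```

The DVD figure lands a little below the 250 trillion cited, which is within the precision of such estimates.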
New analytics tools and capabilities are appearing in the marketplace. Leading companies and technology providers are rethinking the fundamental model of analytics, and the contours of a new paradigm are emerging. The new generation of analytics goes beyond Big Data and the traditional narrow approach, which was restricted to analyzing customer and financial data. Today companies are embracing the social revolution, using real-time technologies to unlock deep insights about customers and others gleaned from their interactions on social media, and to enable better-informed decisions and richer collaboration in real time.
To take advantage of this data, firms need to deploy collective intelligence and build on information from many sources, far broader than simply historical sales and financial information. There are five main characteristics of the new analytics:
1. Data is collected socially. Innovative companies collect, organize, and exploit structured and unstructured data from as many sources as possible from inside and outside the company. The most interesting data is sourced from myriad social interactions of customers and prospects.
2. Data is analyzed socially. Rather than specialists accessing information as individuals, analytical tools permeate the enterprise and data is analyzed collaboratively. As information becomes more trusted, transparent, and dispersed, it enables the true collective intelligence of enterprises and their business ecosystems. With more voices comes more creative and holistic feedback, building a broader picture that results in better-informed decisions.
3. Data and tools become mobile. The personal computer used to be the main device supporting analytics, but as technology improves and corporate information becomes more democratic, it is being replaced by powerful mobile devices such as smartphones and tablets equipped with intuitive apps and data-collecting capabilities, including video.
4. Data is more visual. Powerful new applications, as well as enterprise software suites, illustrate data through graphs, visuals, and interactive simulations. Such tools explain the data more fully, and enable better collaborative analysis and more innovative thinking, especially for “visual thinkers.”
5. Data is more current, broadly available, and actionable. Just as work is conducted in real time, data can be collected, analyzed, and distributed faster than ever before—in real time. The implication of this phenomenon is that data is more accurate and complete, and more useful to support innovative real-time decision making and predictive modeling. When your best customers enter your virtual or physical outlet, they will receive the red carpet treatment.
Used properly, the new analytics empower individual workers and other contributors, and drive value in all areas of the organization. To extract maximum benefit, companies need to plan carefully, invest in the right infrastructure, and constantly iterate and innovate in a collaborative manner. At the same time, wider access to data increases risk: risk management policy needs to be codified and enforced, and, as the financial crisis of 2008 demonstrated, the assumptions underlying analytical models must be better understood.
It wouldn’t be wise for me to predict the next 20 years because advances in technology will come at an ever-increasing pace. The digital revolution has moved on to “the second half of the chessboard,” a clever phrase coined by the American inventor and author Ray Kurzweil. He tells an old fable of a gullible king being so delighted with the game of chess that he offered the game’s inventor any reward he desired. The inventor asked for rice.
“I would like one grain of rice on the first square of the chessboard, two grains of rice on the second square, four grains of rice on the third square, and so on, all the way to the last square,” he said. Thinking this would add up to a couple of bags of rice, the king happily agreed.
He was misguided. While small at the outset, the amount of rice escalates to more than 2 billion grains on the thirty-second square alone, halfway through the chessboard. The final square would require 9 billion billion grains of rice—enough to cover all of Earth.
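The arithmetic behind the fable is easy to verify. A short Python sketch, where square n holds 2 to the power of (n − 1) grains:

```python
# Grains of rice on the chessboard: square n holds 2**(n - 1) grains.
def grains_on(n):
    return 2 ** (n - 1)

square_32 = grains_on(32)                             # the halfway square
first_half = sum(grains_on(n) for n in range(1, 33))  # squares 1 through 32
last_square = grains_on(64)

print(f"{square_32:,}")    # 2,147,483,648 -> the "more than 2 billion"
print(f"{first_half:,}")   # 4,294,967,295 over the whole first half
print(f"{last_square:,}")  # 9,223,372,036,854,775,808 -> "9 billion billion"
```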
After decades of doubling and redoubling, we’re achieving gargantuan leaps in all facets of information technology, such as processing power, storage capacity, and bandwidth. Examples are everywhere, from Intel’s computer chips to low-cost consumer electronics. The first microprocessor Intel built, in 1971, contained 2,300 transistors. About every 18 months since then, a new chip has been developed with double the number of transistors. Now chips have billions. Massive parallelism of cheap processors now drives everything from Google to IBM’s Watson.
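The doubling claim above can be sketched numerically: starting from the first chip’s 2,300 transistors, count the 18-month doublings needed to pass a billion (illustrative, not exact product transistor counts):

```python
# How many 18-month doublings take Intel's first chip (2,300
# transistors, 1971) past the billion mark?
count, doublings = 2300, 0
while count < 1_000_000_000:
    count *= 2
    doublings += 1

print(doublings)     # 19
print(f"{count:,}")  # 1,205,862,400
# 19 doublings at 18 months each is under three decades of growth,
# which is how billion-transistor chips became routine.
```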
Computer chips are poised to take another huge leap in power with quantum computing. In place of today’s transistors in silicon-based conventional computers, tomorrow’s computing devices will use quantum bits, or qubits. Instead of storing information as 0s or 1s, as today’s computers do, a quantum computer’s qubits can be either a 0 or a 1 or both at the same time. A quantum computer can process a vast number of calculations simultaneously. The result will be computing devices many millions of times more powerful than today’s supercomputers. With enough data, a quantum computer would have the processing muscle to accurately predict the weather around the world for a two-week period in real time.
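A toy sketch (emphatically not a real quantum simulator) can make the superposition idea concrete: represent a qubit as a pair of amplitudes whose squared magnitudes give the measurement probabilities. The scaling comment at the end is where the claimed power comes from.

```python
import math
import random

# Toy illustration only: a single qubit is a pair of amplitudes
# (a0, a1) with |a0|**2 + |a1|**2 == 1; measuring it yields 0 or 1
# with those probabilities.
qubit = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal mix of 0 and 1

def measure(state):
    p0 = abs(state[0]) ** 2
    return 0 if random.random() < p0 else 1

print(measure(qubit))  # 0 or 1, each with probability 1/2

# The power comes from scaling: describing n qubits classically takes
# 2**n amplitudes, so even 50 qubits encode over a quadrillion values.
print(2 ** 50)  # 1125899906842624
```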
Microsoft established a significant and well-funded quantum computing research effort in 2006, and since then other companies such as IBM and Northrop Grumman have begun their own programs. So has the U.S. government.
IBM is also exploring other technologies, such as a neurosynaptic computing chip designed to function like a human brain. It is also looking at silicon photonics, which would use particles of light to replace the electrons on a conventional chip.
Data storage is another area where costs continue to plunge at an ever-quickening rate. Look to a future where multi-terabyte USB flash drives will sell for a dollar or two, and the cost of storing data will be so low that we won’t bother to figure out what to keep and what to erase. We’ll just keep it all. And new storage devices using tungsten will be virtually indestructible.
This has big implications when everything you do can be constantly monitored. Targeted advertisements, based on all of the data collected about you, will move from being “dumb” to “smart,” and from ads to orders. Let’s say you surf the Internet looking for a new stereo and you order one online. Today’s dumb technology will continue to inundate you with stereo advertisements for weeks. Close, but missing the mark. In the future, it won’t. For example, say you are driving on Friday afternoon toward your cabin. The local general store will confirm by text that your standing order of milk, bacon, eggs, hamburger, and chicken will be ready for pickup. The size of the order will be increased as a result of the weight (of the passengers) in the back seat. Further, the Internet will know that you have forgotten to bring your bathing suit, that it is not in the vehicle, and that the forecast is for hot weather, so it will suggest that the general store add a bathing suit to your purchase. Knowing (from their smartphones) who is in the car may change the order because a passenger is glucose-intolerant. This, of course, is all happening as the Internet turns off your air conditioning at home and sets your alarm system.
When you walk down a store aisle, the company’s inventory, your tastes, and your psychological profile (including impulsiveness) may reduce the price of items that you have previously glanced at five times (cameras and computers will track your eye movements) but have been ambivalent about purchasing. The instant price reduction convinces you that it is time to put the item in your cart. Trying to outfox the store’s computer and get discounts will be part of the shopping experience.
Not sure whether the store promised you an extended warranty for your dishwasher at no extra charge? Just call up the conversation you had with the salesperson when you bought the dishwasher. You know you recorded the whole transaction with your Google Glass, because you record everything you do throughout the day. However, revisiting your history over a disagreement with your spouse may be a high-risk undertaking.
This nonstop monitoring won’t just happen in stores. When you go out in public, what you are watching, the mood you are in, the speed at which you are walking, and many other variables will be tracked and available for analysis. The early stages of such a world are already in place. In New York City, the police department has installed a new Domain Awareness System (DAS) developed by Microsoft. It links 3,500 cameras, 2,600 radiation detectors, and dozens of license plate readers in fixed locations and mounted on cars. When a 911 call comes in about a suspicious package, the police can immediately review past footage from cameras in the area to see who put the package there. If the police enter a car’s license plate number, the system is able to tell police everywhere the car has been during the last four weeks.
The Brookings Institution released a paper titled “Recording Everything: Digital Storage as an Enabler of Authoritarian Governments.” Its conclusion: it is now feasible to record everything that is said and done by a country’s citizens, since it costs pennies to store the audio from a year’s worth of a person’s phone calls.
Many amazing “advances” are becoming technologically possible. Some of these advances may delight us, and some we may dread.
There is lots of controversy, but I’ve concluded that few new digital technologies are as important as digital currencies such as Bitcoin. They have quickly gone from fringe curiosity to front-page news. Investor and Web browser pioneer Marc Andreessen told the Washington Post that “when we’re sitting here in 20 years, we’ll be talking about Bitcoin the way we talk about the Internet today. We just need time for it to play out.”
Digital currency advocates point to the rapid adoption of Bitcoin, now accepted by thousands of businesses including Virgin Galactic, Tesla, WordPress, and Overstock, as clear evidence of a changing tide. Critics paint them as a speculative toy with little or no intrinsic value. To them, digital currencies are a tool used by cyber-criminals to launder money and avoid taxes. Lack of oversight and regulation—and the prevalence of fraud and hacking—make digital currencies volatile, opaque, and untrustworthy.
But digital currencies offer tremendous ease of use—no permanent address, bank account, or identification is needed. Simply owning a cell phone empowers billions of people living in poverty to take control of their limited finances and enter into transactions without a bank account or credit card.
Digital currencies have low to no transaction costs. As an example, Bitcoin has no enforced fees, unlike traditional payment systems such as credit cards, debit cards, and PayPal. This has caught the attention of financial companies, such as banks and credit card companies. They wonder if service fees will evaporate as quickly as long distance revenues did for the telcos when Skype appeared.
Yes, digital currencies involve risks. Lack of trust is at the heart of the risk to users of digital currencies and to the currencies’ long-term viability. They can also be volatile. But I believe they can also serve as the foundation for an explosion of innovation and commerce. The current Internet is good for social collaboration and access to information. But it lacks many of the key capabilities needed for rich, trusted commerce. Imagine a new global platform where identity and trust are assured, where fraud is virtually impossible, and where appropriate payments are always made. To me this is the hidden promise of digital currencies.
The technology underlying Bitcoin offers more promise still. Andreas Antonopoulos is chief security officer at UK-based Blockchain.info, the world’s largest Bitcoin wallet provider with more than 1 million registered users. He told the UK’s The Telegraph that Bitcoin’s digital foundation is one of the most important inventions of the twenty-first century.
People think Bitcoin is just a better way to do PayPal, and it’s not. Just like the Internet, it’s a platform, and on that platform you can now build an incredible variety of things. We can’t even imagine what things people are going to build. But just in the last year, from watching the start-ups in the space, I’ve been amazed at the range of innovation that occurs when you combine Internet, the sharing economy, and crypto-currencies.1
We could see totally open services beyond the control of any organization. Already a peer-to-peer messaging service aims to offer the same functionality as Twitter but without a central point of control. Bitmessage wants to do the same with e-mail. Law enforcement agencies’ subpoenas would be rendered impotent, since the agencies wouldn’t know on whom the subpoenas should be served.
Collecting personal information about your whereabouts, spending habits, tastes, and finances will become more invasive and more valuable. This information will help companies to more effectively target you and price products such as life and car insurance tailored to your circumstances and risk. Protecting what’s left of your privacy will be a constant cat-and-mouse game.
It’s hard to project future developments in digital technologies because increasingly they overlap with developments in other areas of science and human development. None is more significant than biotechnology (biotech): the modification of living organisms to serve human purposes, a practice that goes back to the domestication of animals and the cultivation of plants. The new biotechnology draws on multidisciplinary areas such as the environment, chemistry, microbiology, and genetics, which enable genetic engineering as well as cell and tissue culture technologies.
A lot of biotechnology research is focused on boosting the world’s food supply. Since 1800, the global human population has grown sevenfold, from 1 billion to 7 billion. That’s a lot of mouths to feed, and there will be many more in the years to come. Many food companies believe that the safest and most cost-effective way to expand the food supply is through genetically modified organisms, or GMOs: plants or animals created by the gene-splicing techniques of biotechnology.
But the technology is controversial. Widely adopted in the United States, Brazil, and Argentina for the production of corn, soybeans, and cotton, GMOs are essentially banned in Europe and tightly regulated elsewhere. Opposition to GMOs can make some farmers seethe, since they believe using technology to boost crop yields is the only way to economically produce enough nutritious food to feed the planet.
Henry I. Miller, a physician, molecular biologist, and founding director of the U.S. Food and Drug Administration’s (FDA’s) Office of Biotechnology, wrote recently in the Wall Street Journal: “Perhaps the most illogical and least sustainable aspect of organic farming is the exclusion of ‘genetically modified organisms,’ but only those that were modified with the most precise and predictable techniques such as gene splicing.”
Miller noted that “virtually all the fruits, vegetables, and grains in our diets have been genetically improved in one way or another, often through wide crosses, which moves genes from one species or genus to another in ways that do not occur in nature.” To exclude genetic engineering “makes no sense. It also denies the consumers of organic goods nutritionally improved foods such as oils with enhanced levels of omega–3 fatty acids.”2
Biotech can meet other needs. Promising T-cell therapy protects patients from infections when they’ve undergone a bone marrow transplant. Gene research with trees offers variations on existing trees that would make them better equipped for climate change. New forms of algae or other plants may be used as carbon dioxide sponges, to remove centuries’ worth of the toxic by-product of combustion.
Once the genome is better understood, technology may give you options as to which sperm combined with which egg will produce a child free from recessive genetic anomalies. This technology may also be applied to height, weight, hair color, and intelligence.
Researchers are already exploring ways in which knowledge that has been trained into one rat is then communicated via computer to the brain of another untrained rat. The second rat learns the training faster than the original rat. If the trained rat is rewarded every time the second rat passes a test, the transfer of knowledge happens even faster. It is stunning to imagine a world in which the knowledge of one human can be passed on to another by connecting the two brains. What are the implications for education and society when you absorb the knowledge of college graduates or accomplished artists?
Traditionally associated with factories and assembly-line work, robots will soon walk out of the factory to perform myriad functions, including advances in areas such as health care and education.
In health care, new prosthetics can take the place of lost limbs, offering flexibility and precision similar to those of the actual limb. This can dramatically improve the quality of life for lower-limb amputees, giving them the unprecedented ability to walk and climb stairs. In addition to sophisticated physical technology, researchers have developed a chip that connects to the central nervous system so that mechanical and electrical devices respond to the thoughts of the person using them.
In state-of-the-art operating rooms, minimally invasive surgery uses computers, electronic equipment, and robotic instruments to perform “keyhole” surgery instead of traditional open-incision surgery. Instruments are inserted and manipulated through small incisions using remote optical or video guidance, greatly reducing patient trauma and recovery times. It will become standard practice for a physician to perform surgery on patients many miles away.
In the classroom, robotic devices are already helping teachers demonstrate the core concepts of math and science. Without any prior experience in robotics or computer science, teachers can demonstrate abstract concepts like slope, sine, cosine, and vectors. Students find robots fun and engaging, and as visual tools they can improve information retention by up to 400% compared with traditional methods. Robots can also help educators meet the learning needs of children with learning disabilities, autism, and other pervasive developmental disorders. Robots can draw on infinite patience.
With congestion and gridlock a problem in the cities of virtually every developed country, the impact of intelligent transportation systems will be enormous. Soon there will be autonomous vehicles moving around the streets and highways, guided by electronics and not the person behind the wheel. Google’s autonomous vehicles have completed over 500,000 kilometers of road tests in the United States. The only accident was when a Google car was hit from behind at a red light.
Further, according to a University of California study, every car-sharing vehicle made available by companies such as Zipcar replaces 9 to 13 cars. So combine autonomous vehicles with new incentives for ride-sharing to exploit excess capacity in cars, along with low-emission vehicles, and we could have a “virtual” public transportation system for entire cities with almost no cost to government.
Siri, the personal assistant built into iPhones, and Watson, the IBM supercomputer that won a Jeopardy game against the show’s most successful players, are often thought to be examples of artificial intelligence (AI), but technically they are not. Watson essentially uses brute computing force to answer questions. It doesn’t “understand” a question the way a human does. Watson applies a keyword search against an enormous database, much as Google does. But the distinction between artificial intelligence and Siri-like programs will grow less and less consequential as both computing power and the efficiency of data storage soar.
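The distinction can be illustrated with a toy keyword matcher in the spirit of the description above: candidate answers are ranked purely by shared words, with no understanding involved. The sentences are invented for illustration; real systems like Watson are vastly more sophisticated.

```python
# Toy keyword matching: score a candidate answer by how many words it
# shares with the question. No semantics, just set intersection.
def keyword_score(question, candidate):
    q_words = set(question.lower().split())
    c_words = set(candidate.lower().split())
    return len(q_words & c_words)

candidates = [
    "Toronto is the largest city in Canada",
    "Chicago is a large city in the United States",
]
question = "What is the largest city in Canada"

best = max(candidates, key=lambda c: keyword_score(question, c))
print(best)  # the Toronto sentence wins on shared keywords alone
```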
Decades ago, a wave of computer-generated automation wiped out blue-collar jobs. This was followed by a wave of outsourcing that affected some white-collar workers. Now a new wave of very clever machines will be displacing knowledge workers. What happens when a computer can analyze thousands of x-rays with greater accuracy than a physician?
Soon we will be wearing much of the technology that we use daily. This will include fitness and physical activity trackers, mHealth devices, smartwatches, e-textiles and clothing, and virtual reality glasses. Sony has even filed a patent application for a “SmartWig.” It can be worn “in addition to natural hair,” and will process data and communicate wirelessly with other external devices. Stay tuned for a “SmartBeard” to monitor your caloric intake.
The wearable technology that will likely have the biggest impact is the virtual reality headset. These headsets are usually thought of in the context of gaming, giving the game player the sensation of being immersed in a situation, such as driving a car or fighting a dragon. Facebook made headlines earlier this year when it bought a headset start-up, Oculus VR, for $2 billion.
“After games, we’re going to make Oculus a platform for many other experiences,” Facebook CEO Mark Zuckerberg wrote in a blog post. “Imagine enjoying a courtside seat at a game, studying in a classroom of students and teachers all over the world, or consulting with a doctor face-to-face—just by putting on goggles in your home.”
Now that the smartphone and tablet market growth is showing signs of slowing, many companies view wearables as the next major wave of consumer electronics.
There is already a Wearable Technologies Innovation World Cup in which innovators from around the globe are invited to submit their contributions in wearable technologies in the categories Sports & Fitness, Health Care & Wellness, Gaming & Lifestyle, and Safety & Security.
Manufacturers are teaming up with fashion designers to make their devices chic and trendy. Manufacturers know that aesthetic appeal will be a large factor in driving consumer acceptance. Years ago, MedicAlert understood that its users would be more likely to wear the bracelet if it looked somewhat stylish. The same is now true for emergency devices such as those that can alert a caregiver if a disabled or elderly person falls and can’t get up. Rather than look like clunky and utilitarian medical devices, companies are hooking up with designers to make emergency bracelets look chic and those worn around the neck look like jewelry.
In a few years, perhaps those looking for romance at a nightclub will wear a necklace or sweater that glows to indicate interest in a potential partner. When budding romantics make eye contact, their Google Glass could set off digital fireworks.
Your smart toilet, doing chemical analysis, combined with your smartwatch that monitors your temperature, heart, and blood pressure, will make most regular checkups with your doctor obsolete. Even then, when abnormalities appear, a conversation with Watson or equivalent may be the next logical step in diagnosis.
Scientists are now able to work with technology at an unimaginably small scale. A nanometer is one-billionth of a meter. At this size, even the most powerful light microscope doesn’t work. Nanotechnology is making advances in a wide array of fields, including electronics, construction, and medicine. In health care, nanotechnology opens the door to revolutionary new tools for patient diagnosis, gene therapy, and drug delivery. Many illnesses operate at the cellular level, and nanotechnology can attack them at that level.
Researchers at Harvard Medical School have crafted a nanorobot from DNA that can carry a small number of molecules. It can precisely deliver beneficial molecules that cause leukemia and lymphoma cells to essentially commit suicide.
In 20 years, most video entertainment will be on demand. Traditional television will no longer exist, and programming will become just another application on the Web. Just as today’s consumers turn to the Web to book a hotel or download a song, soon we’ll all look to the Web for our video content. The notion of gathering with family or friends to watch a certain program will apply only to live sporting events or shows such as the Academy Awards. And even these events have become interactive, as many viewers add commentary through Twitter. All content will be delivered by the Internet to computer screens, iPads, smartphone screens and, of course, the large OLED 3D TV screen in the living room.
It’s not just the delivery system that will change but also the content itself. The 2034 version of Monday Night Football is an app, usable from any device. Chances are it would also be a 3D hologram playing out on your floor or tabletop. You could do your own instant replays from multiple cameras, get stats in real time throughout the game, or search for flashbacks from previous games. Stop the action and zoom in on the running shoes of your favorite player to find out about their brand, features, and price—and then execute a purchase.
You’ll probably have input into the coaching. Others will have access to your views. If you consistently have good commentary, you’ll be chosen by many as a broadcaster. If your audience is big enough, you’ll generate your own advertising revenue for your punditry. Coaches and players will each have their own communities that you can join.
Enormous effort will continue to be expended in the quest for clean energy. One project is called “Solar Roadways.” It seeks to develop modular solar panels so cheap and durable that they could be used to transform every roadway, parking lot, landing strip, bike path, driveway, and playground into a giant solar array. Such a grid would generate clean power and, among other things, cut carbon emissions by 75%. Sure, we may not pave America with solar panels anytime soon, but the 2014 Indiegogo crowdfunding campaign for Solar Roadways shows that a large number of people are enthusiastic about innovative approaches to energy.
There are those today who wish to live off the energy that the sun supplies to the panels on their roofs. Self-sustainability, smart grids, and public networks may soon make that dream a reality.
Technology has brought amazing benefits, and most people believe this will continue to be the case. In a study by the Pew Research Center, Americans were asked their general views on the long-term impact of technology. Technological optimists outnumbered pessimists two to one: six in ten Americans (59%) feel that technological advancements will lead to a future in which people’s lives will be mostly better, while 30% believe that life will be mostly worse. The optimists tended to have more education and higher incomes.
While the general public is favorably disposed toward technology’s advances, “experts” aren’t so sure. Another Pew Research Center project surveyed 1,400 high-profile technology thinkers about the future, and the results are sobering. They saw the potential for governments to curtail Internet freedoms and monitor the activity of citizens 24/7. They also worried about the commercialization of everything online and much of the control of the Internet being usurped by corporations. Lee Rainie, director of the Pew Research Center’s Internet Project, told the New York Times that those surveyed had a “palpable sense of dread” about what may happen to life online. Rainie did note that the research was done when the news was dominated by the National Security Agency’s (NSA’s) spying on ordinary citizens.
As a society, we have only scratched the surface of the digital revolution. But clearly as the pace of change accelerates and explosive new technologies surpass our ability as a society to comprehend them, we must be vigilant that technology serves us and doesn’t become a tool for our enslavement. We must constantly remind ourselves that the future is what we make it. More on this later.
Notes
1. Matthew Sparkes, “The Coming Digital Anarchy,” The Telegraph Online, June 9, 2014.
2. Henry I. Miller, “Organic Farming Is Not Sustainable,” Wall Street Journal Online, May 15, 2014.