HOUR 24
Where Are Networks Going from Here?

What You’ll Learn in This Hour:

The growth of Linux

The likelihood of universal wireless “hot spots”

The computing “cloud”

The convergence of advanced technologies

We’ve covered a lot of territory these past 23 hours. I hope you’ve found this time well spent. For the final hour, we’ll look into the crystal ball in an attempt to foresee what’s on the horizon for computer networks.

Continued Growth of Linux

Both private companies and government have increasingly looked to Linux as an alternative to the vendor-specific products that have been a mainstay of the industry. Linux provides an open standard for the development and interoperability of computer operating systems. The pluses of using open source software platforms have been mentioned several times in this book, and they are the same reasons organizations adopt Linux: It's an open, nonproprietary platform that doesn't require proprietary hardware (although some flavors of Linux run only on certain vendors' hardware products).

Linux is making inroads in commercial and governmental installations. According to figures cited on Wikipedia, Linux's market is growing rapidly, and the revenue from servers, desktops, and packaged software running Linux is expected to exceed $35.7 billion in the near future.

Smaller companies often look for platforms that are easy to install, configure, and support. There’s sometimes an attitude of “if it isn’t broken, don’t fix it.” This means that companies may still be using the same network and service platforms they’ve used since their businesses were first computerized. For example, many NetWare installations around the world are using older software.

Some large institutions haven’t made a move to the PC platform and are still maintaining cumbersome and costly legacy mainframe systems. Again, some institutions just haven’t seen a compelling reason to turn their operations upside down to embrace a new technology.

Nonetheless, the trend is for an increased use of Linux. It currently holds around 13% of the server market. In addition, the XO laptop project of One Laptop Per Child is creating a huge market for Linux. The project’s goal is to provide a Linux-based laptop to several hundred million school children in developing countries.

Nationwide Variation of Wi-Fi: WiMax

Several communications companies have announced a plan to create a nationwide wireless network called WiMax. The goal is to extend the range of current 802.11 Wi-Fi technology to cover more than a square mile around each tower. WiMax will provide fast connections to the Internet, operating at around 15 megabits per second (Mbps).
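To put that 15Mbps figure in perspective, here is a back-of-the-envelope calculation in Python. The link rate comes from the text; the 700MB file size is an assumption chosen purely for illustration:

# Rough transfer-time estimate for a WiMax-class link.
link_mbps = 15                 # advertised WiMax throughput, in megabits/second
file_mb = 700                  # assumed file size, in megabytes
file_megabits = file_mb * 8    # 1 byte = 8 bits
seconds = file_megabits / link_mbps
print(f"~{seconds / 60:.1f} minutes to move {file_mb}MB at {link_mbps}Mbps")
# Prints: ~6.2 minutes to move 700MB at 15Mbps

In other words, such a connection could move a CD's worth of data in a little over six minutes.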

Headed by Sprint, through the company Clearwire, WiMax is intended to saturate the United States with cell towers to, as the marketing ads claim, “...turn the core of North America into one, big hot spot.” By 2010, the venture plans to have the service available to about half the U.S. population.

This plan is a logical and perhaps inevitable evolution of mobile, wireless networks. Its realization will spur growth and the proliferation of mobile wireless interfaces in more end user devices, such as PDAs, gaming machines, digital cameras, smart meters, perhaps dipsticks in gasoline tanks, and yes, perhaps that wireless toaster we joked about earlier.

This network system will change the playing field in the wireless industry. AT&T and Verizon will be affected, as will Comcast and Time Warner. The latter CATV companies have no comparable wireless network strategy. In addition, several major enterprises are investing in this plan, including Google, Intel, and interestingly, Comcast and Time Warner.

We’re at the tip of the iceberg with Wi-Fi and user machines. Worldwide shipments of electronic products with wireless capability totaled about 1.7 billion in 2008. By 2012, shipments are expected to total 2.3 billion, and this figure is a conservative estimate.

Computing with Clouds[1]

[1] Some of the facts and figures cited in this part of the hour are sourced from “A Special Report on Corporate IT,” The Economist, October 25, 2008.

One of the current buzzwords being bandied about is “cloud.” We’ve used an icon of a cloud in several figures in this book, partially to make the point that users don’t care about—nor should they be concerned with—the operations inside a network cloud.

Today, the term cloud still conveys this idea, but it goes further. The cloud will eventually be viewed as an immense set of services that will be offered to users. The services will go beyond conventional email and web servers; they’ll expand beyond YouTube, Facebook, and blogs.

The basic idea of the cloud is simple: Offload work from users’ devices onto thousands of servers that will operate in the clouds. This work includes not only software applications, such as spreadsheets, word processing, and file management, but also services such as simulation models, and packages for taking care of users’ databases, photographs, and movie libraries. In essence, and ideally, the cloud will become a technologically advanced, yet benevolent Big Brother.
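To make the offloading idea concrete, the following minimal sketch (in Python, using only the standard library) runs a stand-in “cloud” service on the local machine and has a client hand it a computation instead of performing the work itself. The /sum endpoint and the payload format are invented for illustration; a real cloud service would sit in a remote data center behind far more machinery.

import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class CloudHandler(BaseHTTPRequestHandler):
    # The stand-in "cloud": it accepts a JSON list of numbers and
    # returns their sum, doing the computation on the server's CPU.
    def do_POST(self):
        numbers = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        reply = json.dumps({"sum": sum(numbers)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("localhost", 0), CloudHandler)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "user's device" offloads the work rather than computing locally.
url = f"http://localhost:{server.server_address[1]}/sum"
request = Request(url, data=json.dumps([1, 2, 3, 4]).encode(),
                  headers={"Content-Type": "application/json"})
with urlopen(request) as response:
    print(json.loads(response.read()))  # {'sum': 10}
server.shutdown()

The user’s machine here does nothing but package the request and display the answer; every spreadsheet, simulation, or database chore in the paragraph above follows the same division of labor, just at vastly greater scale.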

Rationale for the Clouds

Presently, many organizations don’t use their computing facilities to anywhere near their capacity. It’s estimated that about 7,000 large corporate data centers exist in the United States and that, on average, only 6% of their server capacity is used. Some centers no longer use some of their servers; others don’t even know what’s running on them! The cloud takes over these responsibilities.
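A quick calculation shows why that 6% figure matters. The sketch below assumes a hypothetical fleet of 1,000 in-house servers and a 60% target utilization for a well-run shared cloud; both of those numbers are illustrative assumptions, not figures from the text:

# Consolidation arithmetic: same useful work, far fewer machines.
in_house_utilization = 0.06   # average utilization cited in the text
cloud_utilization = 0.60      # assumed utilization of a shared cloud
servers_in_house = 1000       # hypothetical fleet size

useful_work = servers_in_house * in_house_utilization
cloud_servers_needed = useful_work / cloud_utilization
print(f"{servers_in_house} in-house servers -> "
      f"{cloud_servers_needed:.0f} cloud servers for the same work")
# Prints: 1000 in-house servers -> 100 cloud servers for the same work

A tenfold reduction in hardware is the kind of economics that makes the cloud hard to ignore.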

In addition, many companies—especially small businesses—don’t have the size or economies of scale to warrant a data center. Once again, the cloud becomes a virtual corporate data center for these companies.

Examples of Emerging Clouds

In 2006, Amazon, the online retailer, started Amazon Web Services (AWS), which allows customers to “rent” computer time and software, as well as storage space. AWS is a good example of the features of a cloud. First, because of its huge server farm, a customer is provided scalable capacity, with AWS supplying extra computing power during spikes in usage. Second, the fees are based on usage. Third, the cloud servers provide extensive redundancy and backup features. Finally, AWS and similar clouds provide a range of security services.
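As a flavor of what “renting” storage looks like in code, here is a minimal sketch using boto3, a present-day Python library for AWS. The bucket name is hypothetical, and the sketch assumes valid AWS credentials are already configured on the machine:

import boto3

# S3 is AWS's pay-per-use storage service; you pay for what you
# store and transfer, with no servers of your own to maintain.
s3 = boto3.client("s3")

# "Rent" storage: push a local file into the cloud...
s3.upload_file("report.xls", "example-bucket-name", "backups/report.xls")

# ...then pull it back later, from any machine with valid credentials.
s3.download_file("example-bucket-name", "backups/report.xls", "report-copy.xls")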

Google is also moving to cloud services. Although the company wouldn’t confirm The Economist’s estimates, Google is said to be operating more than three dozen data centers containing more than two million servers. Microsoft, too, is investing billions of dollars in server clouds and is adding up to 35,000 servers per month to its systems.

Potential Problems and Opportunities with Clouds

If the ideas behind clouds sound too good to be true, that’s because they may be. The main problem with a company relying on clouds for its IT is that it must rely on clouds for its IT. I don’t mean to be flippant, but it’s a sobering notion to realize our sacrosanct data—the essence of our professional existence—is beyond our control. A full commitment to the cloud requires a huge leap in trust and faith.

Yet, are you and I any better prepared or more competent to manage and back up our company’s data than, say, Google or Amazon? Are we better attuned to protect the privacy of our organization’s information? Only you and I (and our company’s officers) can answer these questions.

Nonetheless, I recommend that you and your staff delve into the “cloud.” It’s becoming a force in the industry, and you may find the economics of a virtual IT shop just too compelling to pass up. If so, you can put this book into the trash can (please don’t!), because most of your networking tasks will have moved to that nebulous cloud. But it still behooves a company to understand the inner workings of the cloud, even if the cloud providers would prefer to keep it opaque to customers.

Computers, Bioengineering, and the Clouds

Beyond the immediate horizon of three or four years, if we peer deeper into our crystal ball, some fantastic scenarios come forth—perhaps apparitions! One example is the emerging technology of protein-based computers.

Efforts began in the 1990s to create libraries of programmable DNA parts, a discipline called synthetic biology. The goal is to create general components that can be used in more than one DNA application.[2] These plug-and-play parts could be like a generic capacitor used in electrical hardware circuits, or a generic perpetual-calendar routine used in accounting software packages. Researchers have dubbed this technology “BioBricks.”[3]

[2] W. Wayt Gibbs, “Synthetic Life,” Scientific American, May 2002, pp. 75–81; and “Life 2.0,” The Economist, September 2, 2006, pp. 67–70.

[3] Ibid., “Life 2.0,” p. 67.

The first efforts have focused on assembling Boolean logic gates in certain microbes, ones that can digest dangerous chemicals such as TNT or carcinogens. The microbe not only locates the substance but glows as it does so, signaling that it has found a land mine or a dangerous section of a landfill. These systems can operate inside living cells, deriving their energy from their hosts. Some of them can even move and reproduce.

In addition to the TNT and carcinogen “sniffers,” other early synthetic biology devices are made of artificial amino acids and remove heavy metals from wastewater. Later systems remove plaque from arterial walls. And some perform binary logic! This application is of great interest to scientists because of its relationship to computer architectures. Let’s examine this part of the technology in more detail.

At the simplest level, a protein (coded by another gene) is input into a distinct section of DNA to produce, say, a Boolean NOT operation. An inverter gene produces a protein output if it receives no input and, conversely, produces no output if it receives input—just like a Boolean NOT gate in a computer. It’s that simple and that extraordinary.
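To see why the inverter is such a powerful building block, here is a minimal sketch in Python that models the inverter gene’s behavior as a Boolean function and then wires copies of it together. It illustrates only the logic, not the biochemistry; the composition into NOR (two repressor proteins sharing one promoter) is a simplifying assumption made for the sake of the example:

# Model an inverter gene: input protein present -> no output protein;
# no input protein -> output protein. Logically, a NOT gate.
def inverter(input_protein: bool) -> bool:
    return not input_protein

# Two repressors feeding one promoter that fires only when neither
# protein is present behave like a NOR gate...
def nor_gate(a: bool, b: bool) -> bool:
    return inverter(a or b)

# ...and NOR is universal: any Boolean circuit can be built from it.
def or_gate(a: bool, b: bool) -> bool:
    return inverter(nor_gate(a, b))

def and_gate(a: bool, b: bool) -> bool:
    return nor_gate(inverter(a), inverter(b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", and_gate(a, b), "OR:", or_gate(a, b))

Because NOR is a universal gate, a reliable biological inverter would, in principle, be enough to assemble any digital circuit, which is exactly why researchers care so much about it.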

It’s also complex. Researchers believe it will take many years to refine this process to allow the creation of building blocks of biologic binary circuits, the vital combinations of Boolean gates that make up the architecture of computers. The task is daunting. For example, the engineers must cope with the fact that their living machines will likely mutate as they reproduce.

Nonetheless, extensive research is being conducted on building computers made of “natural” parts. Already, brain surgeons have placed conventional computers inside humans’ brains to fix problems and abet other operations. Some scientists are banking that implants of wireless devices into the brain are only a matter of time—giving a new meaning to “head set.”

Fantastic? Certainly. Unrealistic? Not necessarily. Viewing the progress we humans have made with computers and networks in the past three decades, we should not close our minds to these possibilities. It may well be that the convergence of WiMax, computing clouds, DNA tailoring, and brain bioengineering will propel us into yet more fanciful networks.

As an anonymous source once said, “In the past, the present was judged to be an improbable future.” That’s a fine thought to keep in mind about the future of computers and networks.
