7

_______________

The Wireless Revolution


The Origins of Wireless

Less than twenty-eight years after Samuel F.B. Morse invented a practical telegraph in 1837 and thirteen years before Alexander Graham Bell patented the telephone in 1876, James Clerk Maxwell, the greatest nineteenth-century physicist, presented a paper that would revolutionize the world. Addressing the Royal Society, Maxwell set out a theory showing that electromagnetic fields could be propagated through space as well as through conductors. Moreover, they could travel as waves with the speed of light (186,282 miles per second). In his view, light itself was electromagnetic radiation within a certain range of wavelengths. It followed that it was theoretically possible to propagate light as well as electromagnetic waves of other wavelengths. It was not until the experiments of Heinrich Hertz in 1887–88 that electromagnetic waves were generated, detected, and measured. Hertz, however, was concerned only with scientific discovery, eschewing any interest in the economic implications of his research. Building on Hertz’s research, the English physicist Oliver Lodge undertook the next step, demonstrating in 1894 an effective and practical system of wireless transmission and reception. But Lodge, too, failed to see the commercial possibilities—although he later did participate in such ventures. The early technology of wireless was then in place.1 Physicists would continue to investigate the theoretical basis of radiation; but the theory of the electromagnetic spectrum was now ready, and the world awaited the insights and business acumen that would develop the practical application of radio waves.

Guglielmo Marconi was the visionary who saw the commercial opportunities of radio. In 1896, at the age of twenty-two, he was granted the first radio patent and soon demonstrated that it had a practical application for ship navigation. But the crucial step occurred in December 1901, when Marconi transmitted the letter s in Morse code from Cornwall, England, across the Atlantic Ocean to Newfoundland. Confounding many critics, Marconi’s effort demonstrated, first, that radio waves, transmitted without the aid of wires and produced and detected by electrical circuits, could travel great distances. Second, he showed that such waves could be translated into intelligible information.2 Marconi’s success led to the postulation of the ionosphere (or Kennelly-Heaviside layer), atmospheric layers of ions that refract waves back to earth. Marconi’s success and the commercial possibilities it opened induced many other inventors and entrepreneurs to focus their efforts on radio. Among the most important were the discoveries associated with inventor Lee de Forest, notably vacuum tubes that could be used in voice-frequency amplifiers, among other purposes.

It is a long way from transmitting the letter s in Morse code to transmitting music and intelligible speech. AT&T was to play a major role in this progress because amplification, modulation, and other technical problems were conceived as common to wireless and long-distance wire communication, to which AT&T had become committed. In fact, when Alexander Graham Bell, seated in New York, replicated his famous statement to Thomas Watson, seated in San Francisco, “Mr. Watson, come here, I want you,” a modified version of de Forest’s vacuum tube was used to fulfill modulation and amplification functions. Nevertheless, early on AT&T did not expect wireless to become a serious competitor to wire communication for point-to-point conversation because of the ease with which third parties could eavesdrop on private conversations. Accordingly, wireless was conceived as a supplement to wire in point-to-point communication. Additionally, transmitting long-distance radio entailed enormous power requirements compared with wire communication, imposing major economic as well as technological difficulties.3

The Rise of Radio and Television

Nevertheless, broadcasting and point-to-point communications across areas that wires could not practically traverse were potentially large enough markets to encourage considerable research in radio. There was still a pot of gold to be had. Pittsburgh, Pennsylvania, was the home of two major developments that led to broadcasting becoming a major new industry in the 1920s. In 1906 Reginald Fessenden of the University of Pittsburgh developed a continuous-wave generator in which voice currents were superimposed on a carrier wave. (Continuous waves are unbroken waves of constant frequency, such as those generated by pure tones.) Demodulation would, therefore, allow the sending voice to be received and understood. On Christmas Eve, 1906, Fessenden broadcast a music and speech program that was picked up by several ships in the North Atlantic. Capitalizing on Fessenden’s work, Frank Conrad, the engineer in charge of Westinghouse’s wireless research during World War I, innocently began broadcasting phonograph records in the Pittsburgh area during spring 1920. Soon, a growing number of listeners began making musical requests. Next, Conrad devised a regular schedule for broadcasting records. Exhausting his personal collection, Conrad borrowed additional titles from a local record store, in exchange for which the store’s name and location were mentioned on the air.
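The principle behind Fessenden’s generator—superimposing a voice signal on a continuous carrier wave and recovering it by demodulation—can be sketched numerically. The fragment below is purely illustrative; the frequencies and the simple envelope detector are assumptions for demonstration, not details from the text:

```python
import numpy as np

fs = 100_000                      # sampling rate in Hz (illustrative)
t = np.arange(0, 0.05, 1 / fs)    # 50 ms of signal

carrier_hz = 10_000               # the continuous carrier wave
voice_hz = 440                    # a pure tone standing in for a voice

voice = np.sin(2 * np.pi * voice_hz * t)
# Amplitude modulation: the voice current varies the carrier's amplitude.
am = (1 + 0.5 * voice) * np.sin(2 * np.pi * carrier_hz * t)

# Demodulation by envelope detection: rectify, then smooth with a
# moving average wide enough to average out the carrier oscillations.
rectified = np.abs(am)
window = 2 * int(fs / carrier_hz)
envelope = np.convolve(rectified, np.ones(window) / window, mode="same")
recovered = envelope - envelope.mean()

# How closely the recovered waveform tracks the original tone
# (edges trimmed to avoid moving-average boundary effects).
corr = np.corrcoef(recovered[window:-window], voice[window:-window])[0, 1]
```

With these parameters the correlation between the recovered waveform and the original tone comes out high, which is the sense in which demodulation allows the sending voice to be received and understood.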

The commercial possibilities became clear when the record store discovered that its sales of recordings played on the air far exceeded sales of those that were not. Part of the explanation for this development was that a local department store advertised amateur radio sets for ten dollars in Pittsburgh newspapers. Home’s Department Store had set up a demonstration receiver in the store, which led to great sales success. Some of the lessons of these events were not lost on Westinghouse, Conrad’s employer, which had been looking for a way to enter the wireless business. Accordingly, Westinghouse began operating a station in East Pittsburgh on November 2, 1920. Westinghouse’s objective at that time was to sell nationally the radio receivers that it would manufacture. KDKA, the station, was seen as an adjunct to the manufacturing enterprise. Even though from a contemporary perspective the economic potential of radio advertising was another lesson that could have been drawn from Conrad’s experience, the culture of early 1920s America would have viewed such an effort as an affront to privacy.4 Nevertheless, KDKA ushered in the 1920s radio boom.

Westinghouse was not to play the dominant role in shaping the structure of American broadcasting, the nature of government supervision, or broadcasting’s gradual and complete divorce from wire communications into a discrete market. Not until the end of World War II would wireless technology become important again in point-to-point communication. And while AT&T attained the dominant position in wire communication, three networks presiding over a large number of affiliated radio stations would dominate radio, although not as fully as AT&T controlled wire communications. Entertainment, broadly defined, would become the principal purpose of radio. Stations would be privately owned, in contrast to the European system of public ownership. The most important stations would be tied together into nationwide networks. Radio advertising would become the stations’ and networks’ principal source of revenue. While rate regulation became the most important government activity in the realm of wire communication, there would be no rate regulation in the radio market. Rather, licensing would become the crucial activity in that market. Radio and television broadcasting would grow into enormous businesses with vast audiences. All of these developments occurred within the short period between KDKA coming on the air and Franklin D. Roosevelt’s assumption of the presidency in 1933. And while many individuals played roles in shaping the exploding radio business, two were paramount: David Sarnoff and Herbert Hoover.

Sarnoff’s life is the quintessential rags-to-riches story. Born in a dismal Jewish ghetto in Russia in February 1891, he emigrated to the United States in 1900. Almost immediately Sarnoff began work selling Yiddish-language newspapers, studying a variety of topics and mastering English in his remaining time. In 1906 his father’s imminent death ended his education at the eighth-grade level and forced him to become the chief breadwinner for his family. An error Sarnoff made in applying for a job changed his life and the history of American wireless communication as well. Instead of applying for an office-boy job at a newspaper as he intended, Sarnoff mistakenly applied for similar employment at the Commercial Cable Company located in the same building. Offered a job, Sarnoff soon rose to become a Morse-code operator. After moving to the American Marconi Company as an office boy, Sarnoff remained with the company and its successor, the Radio Corporation of America, for more than sixty years.5

Sarnoff rose rapidly in American Marconi. His energy, his vision, and not least his friendship with Guglielmo Marconi assured this. As early as 1915, when he was assistant traffic manager, he wrote, “I have in mind a plan of development which would make radio a ‘household utility’ in the same sense as the piano or phonograph…. The receiver can be designed in the form of a simple ‘radio music box’ and arranged for several different wave lengths.”6 The Woodrow Wilson administration, which had seized control of radio during World War I, was anxious that an American firm, not a British subsidiary, should control radio in the United States. Hence it arranged for General Electric and other wireless patent holders to create RCA in 1919, which in turn would purchase American Marconi from its British parent. When the broadcasting boom began it took the various RCA shareholder firms, including AT&T, by surprise. When AT&T disposed of its RCA stock and established several radio stations in competition with the RCA group, a complex battle between the two groups ensued. The end result was that AT&T bowed out of radio in 1926, selling its stations to RCA.7 At that point RCA was well on its way to constructing a network. The National Broadcasting Company (NBC) was established in 1926 and reorganized into two semiautonomous networks (Red and Blue) in 1927. By 1930 Westinghouse and General Electric had relinquished control of NBC and in 1932, pursuant to an antitrust decree, it became a wholly owned subsidiary of RCA. The Blue network was sold in 1943 under pressure from the Federal Communications Commission and was renamed the American Broadcasting Company (ABC) in 1945.

The final step in setting up the basic structure of the American broadcasting industry was the creation in 1927 of the Columbia Broadcasting System (CBS). Several stations excluded from the NBC networks, seeking syndicated programming that could fill their time, established the United Independent Broadcasters, which was then transformed into CBS. Just as Sarnoff dominated the affairs of NBC, William S. Paley dominated CBS from the time he bought controlling interest in September 1928. Not until 1977 did Paley step down as chief executive, having transformed CBS and its member stations into an enterprise incomparably more valuable than what he had paid for the network and its directly owned stations. While Sarnoff had a larger vision of broadcasting and far more technical knowledge than Paley, the latter’s marketing strategies were extraordinarily innovative. Although AT&T had devised radio advertising in 1922, when Paley took over CBS the network greatly expanded its advertising sales and techniques, so that advertising gradually became the principal source of network revenue.8

The most important focus of wireless communications by 1930 became AM broadcasting, and the American pattern differed sharply from that of other developed countries. The private ownership of competing networks and stations financed primarily by commercial advertising contrasted with the advertising-free publicly owned stations in other nations, where government subsidy and fees imposed on owners of radio sets (and later television sets) were the principal sources of revenue. Second, in contrast to the few stations in other nations, the United States developed a large number of independent stations, some of which became linked together in networks smaller than ABC, CBS, and NBC. Third, although wireless was employed in safety, dispatch, and a few other specialized services, the dominant use was in commercial broadcasting. Fourth, the star system that had been developed in the movies was adopted in AM radio with great success; millions of listeners week after week heard Jack Benny, Amos ’n’ Andy, and other stars’ shows. Fifth, as a result of telephone hookups and national advertising, the networks were able to successfully syndicate programs throughout the country.

But one of the most interesting characteristics of American broadcasting occurred because the AT&T model was rejected when that company withdrew from broadcasting.9 Perhaps the best way to understand AT&T’s concept of toll broadcasting is to compare wire communication and network broadcasting. In the former case, telephone companies distribute information (including switched traffic) between communicating parties. The telephone company cannot censor messages or determine who can use the system, subject, of course, to the payment of fees and other moderate regulations. In contrast to telephone, radio transmission equipment was very expensive and could not be supplied to subscribers on the same basis. Thus, unlike two-way telephone usage, radio was one-way and effectively barred would-be users from transmitting what they wanted to communicate. They could receive messages because receiving equipment, even in the primitive stage of radio, was very cheap relative to transmitting equipment. Radio, from the perspective of one seeking to communicate with others, had only half the capability of wire telephony. Instead of intercommunication, broadcasting consisted of active transmission and dumb reception. Moreover, persons controlling transmission could effectively exclude anyone they wanted from providing messages or could censor what was transmitted. Toll broadcasting was intended to partly redress the differences between wire and wireless. Under toll broadcasting the station sold blocks of time to purchasers, who could use it in any way they saw fit, subject, of course, to prevailing canons of decency. Nothing prevented the further subdivision of time, or reselling it for that matter. AT&T envisioned a network of local stations connected by long-distance wires so that the same program could be heard simultaneously in many places. But, unlike the present permanent system of affiliates, such networks could be constructed on an ad hoc or temporary basis. Finally, allocations of time would be reserved for public service and cultural programming.

Toll broadcasting ended with AT&T’s withdrawal from the industry. Instead we have had a system in which stations and, most importantly, a few networks control not only the distribution of information but its content as well. Much of the current resentment against the media elite and their control of information might have been averted if toll broadcasting had taken hold. The network system that developed in AM radio continued when the VHF monochrome television boom began its takeoff after World War II. The three networks came to dominate the new medium, with a scattering of local stations offering less popular programming in many markets. And the same pattern persisted in color television after the F.C.C. ultimately resolved, in RCA’s favor, a bitter dispute between CBS and RCA, known as the color wars, over which system would be employed. Commercial color television broadcasting was authorized to begin in January 1954. Sarnoff’s faith in color television was not, however, justified until 1964, when at last color television sets began to sell in great numbers. From that time until the cable television explosion, the commercial system that Sarnoff envisioned from the advent of primitive commercial radio prevailed, just as broadcasting, not point-to-point transmission, was the dominant use of wireless.10 Ironically, as cellular telephony and cable television grew dramatically, wireless became more widely used in point-to-point communication, while cable began to break the long-standing network dominance.

Before turning to the point-to-point wireless market, two additional markets should be briefly examined. FM radio and UHF television, different as they are, were both expected to challenge network dominance and reduce concentration in the broadcast industry. The story of FM is a very complex and detailed one. Major Edwin H. Armstrong, a brilliant engineer and holder of many telecommunications patents, more than any other person was responsible for FM’s development. Originally close friends, Armstrong and Sarnoff became bitter enemies as Sarnoff came to perceive FM as a threat to the network system that had evolved.11 In 1945, reversing a prewar decision on where to locate the FM broadcasting band, the F.C.C. effectively wiped out the market for FM sets already made; the FM industry had to start from scratch. Coupled with network focus on AM and television and a lack of programs, FM stagnated until 1957, when at last the industry gradually took off. Not only do the three major networks provide service, but other successful networks have thrived as well. UHF television’s story has not been as fortunate. Plagued by generally poorer reception than the VHF band, UHF was unable to make major inroads in the market notwithstanding a 1963 law requiring all television receivers made thereafter to be able to receive both VHF and UHF channels. Because of UHF’s poorer reception, it had generally not been able to obtain popular programming. The rise of cable systems, which created a more level playing field in reception, has aided UHF’s prosperity.12

The Rise of Spectrum Regulation

As the foregoing section shows, entertainment became the dominant use of wireless transmission after the radio boom began in the 1920s. But it was not the only use. Ship-to-ship and ship-to-shore use for safety and navigational purposes preceded the employment of radio frequencies for entertainment. World War I triggered military applications, most importantly the United States Navy’s deployment of radiotelephonic communications for quick and flexible strategic and tactical decisions. That war was the first in which aviation played a major role. At the request of the U.S. Army Signal Corps, the Western Electric division of AT&T began work on the aviation radio problem in 1916. In July 1917 Western Electric achieved the first air-to-ground and ground-to-air transmissions (over a range of about two miles), and in August it successfully tested two-way communication between planes in flight. These wartime developments would have obvious peacetime applications, especially in commercial aviation and ocean shipping and travel. The next important application occurred in 1920 when AT&T developed a radiotelephone link between Catalina Island and the California mainland at Long Beach. Under normal circumstances, AT&T would have laid cable over this relatively short distance, but wartime conditions created a severe shortage of the requisite materials. Thus, for the first time wireless became a direct rival to wire. Notably, AT&T by that time had made substantial progress on the security issue to assure the privacy of point-to-point communication.13

The 1920s radio explosion coupled with the other uses found for wireless transmission and the likelihood that yet more would be found placed the issue of allocating the scarce resource of spectrum high on the agenda. Allocation through regulation was the method chosen as early as the 1912 Radio Act, when broadcasting was unknown and safety at sea was Congress’s primary focus. Under that statute the secretary of commerce and labor (later secretary of commerce) was given authority to issue licenses. After the broadcasting boom began, Secretary Herbert Hoover frantically tried to deal with the rampant interference problem, but was essentially prevented from doing so by a United States Court of Appeals for the District of Columbia Circuit decision holding that under the 1912 act, the secretary lacked power to refuse a license to an otherwise qualified applicant, regardless of the interference created. Fruitless attempts at voluntary allocation and other Commerce Department court losses wreaked havoc in the system as more and more stations went on the air, using any frequencies and power they chose, regardless of interference.14

Hoover’s efforts bore fruit in the form of the Radio Act of 1927, which created the five-member Federal Radio Commission (FRC). The act provided that the agency could grant federal licenses to individuals or business organizations for limited periods of time. The statute’s test for a license award or renewal was whether the applicant’s service (in both the technical and nontechnical senses) would serve “the public interest, convenience and necessity.” Thus, the federal government surreptitiously got into the business of assessing content and effectively censoring speech—dangerous precedents in a free society and in complete contrast to the greater freedom of the print media. Wireless, in contrast to wire communication, was not treated as a public utility. Therefore rates were not regulated. The agency was also granted the powers to classify stations and prescribe the services to be provided by each class, and to assign frequencies, power, operating hours, and location. The licensing system created in the Radio Act carried over to the Communications Act of 1934, which absorbed the telephone functions of the Interstate Commerce Commission and the FRC’s wireless regulation, thus establishing a unified agency with regulatory powers over all means of communications.15

Before leaving the subject of radio regulation and the establishment of the FRC and the F.C.C., it is useful to compare alternative paths that might have been taken to the one constructed in 1927 (and even earlier, if one considers Secretary Hoover’s previous attempts). Continuing government licensing is based on the spectrum-scarcity principle; only government can be the umpire. But a landmark article in 1959 by R.H. Coase clearly shows that even at the time of the 1927 Radio Act there was an alternative.16 Coase argued that, prior to the act, the courts were in the process of creating a system of property rights for spectra in the same way they had created property rights for tangible property (such as land) and representational property (such as stock certificates and negotiable instruments). Had the government followed that route instead of licensing coupled with regulation, it would have auctioned off spectra in such a way that interference would have largely been avoided. A market would then have been created in spectra in the same way that land or stock certificates are subject to market operations. Coase belittled the viewpoint that spectra are physically limited in a way that land is not. After all, he asserted, the number of original Rembrandt paintings is even more limited, yet the market satisfactorily deals with their sale and use without a need for government regulation. He similarly rejected the interference argument, since one person’s use of land would interfere with another’s proposed use of that same land if a property system did not exist. Indeed, one of the major functions of the private legal system is to devise rules regarding interference, and one of the most important conceptions it has devised is property rights protecting exclusive use. Precisely the same principle could have been employed in the novel case of spectra during the 1920s.

Coase and others who accept this argument have concluded that a system of auctioning off spectra would have been far more efficient than a government regulatory apparatus. First, of course, there is the deadweight loss attributable to regulation and the apparatus designed to enforce it. Second, the danger of government censorship and arbitrariness is always present when an agency has broad discretionary authority. More important, a market system with property principles would lead to more efficient use of spectra, according to this view. But that was not the path taken.

The F.C.C., carrying on the work of the FRC, began early to allocate the radio spectrum. The process is a complex one. First, engineering issues must be addressed. Standards are devised for different frequency bands, the number of channels each will contain, their bandwidths, geographical locations, and power. The political part of the process consists of assigning channels in specific locations to users for designated purposes. In this part of the process, the F.C.C. traditionally used a comparative hearing in which applicants provided evidence that reflected favorably on them, pursuant to the broad “public interest” standard. They could also provide information about an opponent’s adverse factors.17 As the 1930s proceeded the F.C.C. allocated frequencies for the so-called fixed services, including those concerned with safety (aviation, fire, police, forestry conservation, and so on), land transportation (automobile emergency, highway truck, railroad, taxi-cab, and so on), and industrial services (forestry, petroleum, power, motion picture, and so on). Finally, the F.C.C. allocated frequencies for amateurs’ and citizens’ uses. By 1954 the agency could justifiably claim that “usage now extends from the cradle to the grave. There are radio facilities for calling doctors and ambulances to the homes of expectant mothers…. At the omega of life, radio is utilized to dispatch vehicles in connection with death and burial.”18

Microwave Transmission

World War II and the period immediately preceding saw research that greatly enlarged the usable spectra. As we saw in Chapter 3, the use of microwave transmission (frequencies from 1 GHz to 30 GHz) greatly expanded the available spectra and undermined AT&T’s position. Such frequencies increase the communications capacity (bandwidth, or the rate at which information can be exchanged between two points in an analog network). As we saw in the Above 890 Mc decisions, the F.C.C. adopted a newly liberalized licensing policy for private microwave systems, rejecting the AT&T argument that such licenses should be granted only in exceptional cases. Virtually every big-business interest was, for the first time, arrayed against AT&T and the other common carriers, demanding the “free choice” to operate their private microwave systems. For example, the Automobile Manufacturers Association urgently claimed that “the highly competitive nature of the automobile industry makes it mandatory that privately operated systems be considered in comparison with common carrier charges.”19 Such arguments prevailed and the opportunities of private carriers for using the microwave portion of the spectrum were gradually expanded. In turn, as we have seen, this led to the applications of MCI and others to provide microwave service to customers, gradually leading to the offering of full-scale alternative long-distance services. Wireless transmission for point-to-point communication was on the road to becoming as important as it was in broadcasting.
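The relationship the passage invokes between bandwidth and communications capacity is captured by the Shannon-Hartley theorem, C = B log2(1 + S/N). The comparison below is an illustrative sketch, not figures from the text: a 4 kHz voice-grade channel versus a hypothetical 20 MHz microwave channel at the same assumed signal-to-noise ratio.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (30 / 10)  # assumed 30 dB signal-to-noise ratio, i.e. 1000:1

voice_channel_bps = shannon_capacity_bps(4_000, snr)           # ~40 kbit/s
microwave_channel_bps = shannon_capacity_bps(20_000_000, snr)  # ~200 Mbit/s
```

At equal signal quality, capacity scales directly with bandwidth, which is why opening the wide channels available in the 1 GHz to 30 GHz bands so greatly expanded usable capacity.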

The assumption that microwave would dominate long distance and find increasing application in local loops underlay policymaking in the AT&T Modification of Final Judgment and many F.C.C. decisions. In the case of local-loop communications, there were two reasons. First, microwave required no rights-of-way, a major problem in urban areas. Second, microwave was the most flexible way to add new communications points in the least time. More generally, digital technology was being applied to microwave; indeed, one study indicated that digital microwave transmission had “exploded” in the six years prior to 1983.20 Digital transmission essentially freed microwave of the major problems with which analog transmission was associated, including cross talk and noise from inconsistent signal levels. Moreover, going digital reduced short-haul (under 250 miles) costs compared with analog because terminal equipment was cheaper. For these reasons, the F.C.C. commented in a 1973 proceeding that “digital modulated systems should be the rule rather than the exception.”21

Accordingly, the ascendancy of digital microwave was widely conceived as inevitable. From digital microwave’s introduction in the mid-1970s until the AT&T breakup, it triggered rapid deployment of microwave. For example, in the five years preceding 1980, use of terrestrial microwave in the United States tripled from approximately 165,000 to 500,000 installed miles, with other industrialized nations experiencing similar growth.22 Two important policy implications were presumed to flow from microwave’s ascendance. First, as economist Leonard Waverman argued in 1975, the advent of microwave as a long-distance transmission medium dramatically altered the industry’s economics, so that the natural-monopoly justification for AT&T’s long-distance monopoly no longer existed as it had when open wire pairs were the principal medium for intercity traffic, which was the case until the late 1940s. Accordingly, he urged in an influential paper, increased competition was in the public interest.23 Breaking the AT&T long-distance monopoly was the inevitable consequence of this view. Second, at the time of the breakup the fear that microwave would allow large business subscribers to bypass the local loop was widespread. A 1984 Fortune article succinctly conveys what was then the conventional wisdom: “Bypassing, which promises to grow fast and large, will mean reduced growth and diminished profitability for the local phone companies. Much of their revenue comes from a few large clients…. If several large customers set up bypass networks, profits are penalized because phone companies usually cannot cut costs as quickly as bypassing siphons off revenues.”24 Lower local rates for large customers and higher access charges from long-distance companies to local telephone companies were conceived as the most effective response to the threat, which would inexorably take its toll on the local operating companies.

In order to evaluate the impact of the alleged threat, it is initially important to distinguish between two kinds of bypass. The first is facility bypass, in which users build a microwave link or lease a line from a bypass vendor that connects the user directly to a long-distance carrier’s point of presence. Service bypass involves users purchasing private lines from the local operating company. In facility bypass the customer is completely lost to the local carrier, but in service bypass—which has been used far more than facility bypass—the operating company still obtains considerable revenue from the customer. Operating companies have compensated for the loss by raising network access fees to residential subscribers.25 Aiding local operating companies, access charges that long-distance carriers must pay them amounted to about 40 percent of long-distance costs in 1991. By 1995 access charges accounted for almost $30 billion of local carriers’ $115 billion revenues. Further, service bypass and special rates for dedicated lines have enabled local operating companies to counter—or at least blunt—the bypass threat.26

Nevertheless, the bypass threat is a real one and likely to get worse as personal communications services, at which we look later, and other technologies come on line later in the 1990s. To date, however, the threat has not been as great as the doomsayers and prognosticators were suggesting in the immediate post-AT&T-breakup era. The past is, however, not necessarily a guide to the future, for two reasons. First, bypass has, indeed, been increasing. For example, even though AT&T’s long-distance revenues decreased between 1984 and 1990, from $34.9 billion to $33.9 billion, the access charges it paid declined at a much greater rate, from $20.6 billion to $12.2 billion. The erosion continues as long-distance carriers encourage the growth of local bypass carriers.27 More importantly, the sound quality and reliability of wireless technologies have improved, and digital technologies will accelerate further improvements. At the same time, consumers have become more comfortable using wireless technologies. As is typical in telecommunications, then, there are significant threats to wire communications, but the expectation that microwave would constitute the major one turned out to be wrong.

What then is the future of microwave transmission? First, some entrepreneurs have rediscovered microwave and are devising new technologies to deploy portions of the microwave spectrum competitively. For example, Associated Communications, a small firm, developed microwave transmitters that use the 18 GHz to 19 GHz band to transmit video, Internet, and data traffic. While its transmitter-receivers are fixed on buildings and must be in sight of each other, the equipment is relatively inexpensive to install and provides excellent data-transmission capability. Second, several telephone companies, instead of building expensive cable television systems to compete with cable providers, have acquired so-called wireless cable companies that use microwave frequencies to broadcast a limited number of television channels over the air. Third, cellular radio companies often use microwave radios in short-haul applications. Fourth, because of microwave’s reliability in disaster situations, it is commonly used as a backup system when the wire system goes down. Fifth, in situations where fiber-optic cable cannot be laid down or is extremely expensive, microwave is a typical substitute. Finally, short-haul digital microwave can, under some circumstances, provide large users with transmission quality and reliability superior to T1 transmission lines (lines that carry digital signals at a rate of 1.544 megabits per second).28 Nevertheless, the bright future for microwave on which the Western Electric Modification of Final Judgment was based has not come to fruition. By late 1994, 95 percent of AT&T’s traffic was carried by fiber-optic cable. From the time of the consent decree forward, the wireless growth sector was cellular telephones.
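The parenthetical T1 figure follows from standard DS1 framing, which the text does not spell out: each frame carries 24 voice channels of 8 bits apiece plus one framing bit, and frames repeat at the 8,000-per-second voice sampling rate. A quick arithmetic check (the framing details are standard telephony engineering, not taken from the text):

```python
# DS1/T1 line-rate arithmetic; an illustration using standard framing,
# not a detail given in the source.
CHANNELS = 24             # DS0 voice channels per frame
BITS_PER_CHANNEL = 8      # one 8-bit PCM sample per channel per frame
FRAMING_BITS = 1          # one framing bit per frame
FRAMES_PER_SECOND = 8000  # voice sampling rate

frame_bits = CHANNELS * BITS_PER_CHANNEL + FRAMING_BITS  # 193 bits per frame
t1_rate = frame_bits * FRAMES_PER_SECOND                 # bits per second

print(t1_rate)  # 1544000, i.e., the 1.544 megabits per second cited
```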

The Cellular Revolution

World War II accelerated the development of wireless transmission. Motorola developed two wireless transmission devices—the Handie-Talkie and the Walkie-Talkie—that facilitated military communication, most importantly on the battlefield. Millions of veterans saw firsthand the effectiveness of wireless communications and many would-be entrepreneurs envisioned civilian uses. AT&T received regulatory approval to operate a commercial radiotelephone service in 1946. The service was fully subscribed quickly, and within one year similar systems were available in twenty-five other cities. But the central problem with land mobile systems was that far too little spectrum was available for the large number of potential users. In 1984 less than one-tenth of one percent of all motor vehicles was equipped with mobile telephone service, and the quality of that service was very poor. In New York City, only twelve vehicles could then engage in mobile telephone communications simultaneously.29 Like citizens band (CB) radio, the insurmountable problem with traditional mobile telephone was that a large number of users had to share a small number of frequencies.

The solution to the problem, like so many in the telecommunications field, was developed at Bell Labs. What triggered the development was the observation that television stations often used the same frequencies but were situated far enough apart to avoid or greatly reduce signal interference. The same principle might be used in much smaller areas for mobile telephony. In 1947 work began on cellular radio. In 1960 H.J. Schulte and W.A. Cornell, two Bell Labs scientists, proposed the first cellular concept, which notwithstanding increasing technological complexity still provides the basic cellular model. In their concept a metropolitan area is covered by adjacent hexagonal cells so that the entire region is included. The available mobile telephone channels are divided into groups, with each group assigned to specific cells. The same group of channels can be used in different cells if the cells are separated by a sufficient distance (the reuse distance) that cochannel interference in each cell is tolerable. Further, more channels can be assigned to higher-traffic cells.30
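The hexagonal-cell geometry just described has a tidy mathematical structure, which standard cellular-engineering texts (not this source) make explicit: channel groups tile the grid only for cluster sizes N of the form i² + ij + j², and cells sharing a channel group sit a reuse distance D = R·√(3N) apart, where R is the cell radius. A minimal sketch under those standard assumptions:

```python
import math

# Hexagonal frequency-reuse geometry; standard cellular engineering,
# sketched here for illustration rather than drawn from the text.

def valid_cluster_sizes(limit):
    """Cluster sizes N <= limit that can tile a hexagonal cell grid:
    N = i^2 + i*j + j^2 for nonnegative integers i, j."""
    sizes = {i * i + i * j + j * j
             for i in range(10) for j in range(10)} - {0}
    return sorted(n for n in sizes if n <= limit)

def reuse_distance(cell_radius, cluster_size):
    """Distance between nearest cells that reuse the same channel group."""
    return cell_radius * math.sqrt(3 * cluster_size)

print(valid_cluster_sizes(12))   # [1, 3, 4, 7, 9, 12]
print(reuse_distance(1.0, 7))    # ~4.58 cell radii for the classic 7-cell cluster
```

The trade-off the text alludes to falls out directly: a larger cluster pushes cochannel cells farther apart (less interference) but gives each cell fewer channels, which is why heavy-traffic areas are instead served by splitting cells.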

The breakthrough can best be understood by contrasting older mobile radio systems to the newer cellular radio (also called cellular telephone) system. As we saw, mobile radio was first employed in ship-to-shore and ship-to-ship transmission. By 1920 mobile radio was adapted for use in land vehicles and was used in police and public safety work. Spectrum shortage and the sheer bulkiness of the equipment largely precluded its use for other purposes. The World War II technological breakthroughs led AT&T to propose the commercial mobile radio system in 1946. The initial system was deployed in St. Louis, Missouri, but available technology limited the experiment to three one-way channels. The units employed a push-to-talk switch with all calls routed through a mobile operator. Because only three channels were available, subscribers had to have the patience of Job to get on line. Nevertheless, waiting lists to use the service were very long.31

Improvements occurred in the service to meet the enormous demand, while the F.C.C. established a competitive regime in 1949 by authorizing the new class of radio common carriers (RCCs) to compete with the established telephone companies on a different set of frequencies. By 1956 demand for the service led the F.C.C. to authorize seventeen more channels for the mobile telephone service. Technological developments in the 1960s allowed a customer to use any available frequency that might be unoccupied, thereby greatly reducing waiting time. In the same period the new “Improved Mobile Telephone Service” (IMTS) allowed automatic dialing without operator assistance and conventional two-way telephone conversation without the need to push a button. Increasing demand compelled the F.C.C. to allocate considerably more bandwidth to the mobile service in the mid-1970s. Nevertheless, as conventional mobile radio’s day was coming to a close, the consensual view was that it had provided an inferior service. “At the end of 1983 there were only 12 channels in New York City to be shared by 730 radio-phone subscribers, with a waiting list of at least 2,000. Even the lucky 730 often had long waits for a free circuit during rush hours. And quality was poor; callers sometimes sounded as if they were at the bottom of a well.”32

The traditional mobile phone service was based on a system in which the signals transmitted were strong enough to interfere with others. Therefore, just as in conventional radio, channels assigned to one area could not be used in nearby ones, sharply limiting the number of channels and, therefore, subscribers. Under the traditional system, all signals in a metropolitan area are transmitted to a central antenna. In contrast, under the cellular concept, a metropolitan area is divided into small areas—called cells—each with antennae and switching centers. Although any one frequency within a cell could handle only one call, the same frequency could be used for a different call in another cell because there is no interference between calls in different cells using the same frequency. Transmitters are very low powered and sufficiently distant from each other. Further, cells can be divided when call volume at any site becomes very heavy. A highly sophisticated switching system prevents conversations from being interrupted as the caller moves from cell to cell, handing off each call from one cell to the next. As a caller moves along roads, the cellular system continuously monitors signal strength and direction. This allows a computer at the central switching office, with which the equipment at each cell is connected, to switch the call to another channel or cell or both. The cellular system allows the transmission of data as well, but in that case an add-on device is necessary to prevent loss of data during a handoff.33
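The handoff decision the switching office makes can be sketched in a few lines. The sketch below is a toy model, with hypothetical names and an assumed hysteresis margin (real systems tune this): the office tracks per-cell signal strength for a call and moves it only when a neighboring cell is stronger by a clear margin, so a caller hovering at a cell boundary is not bounced back and forth.

```python
# Toy handoff decision; illustrative only, not the actual Bell system logic.
HYSTERESIS_DB = 3.0  # assumed margin before a handoff is triggered

def next_cell(current_cell, signal_db):
    """Pick the serving cell given measured signal strength per cell (dB)."""
    best = max(signal_db, key=signal_db.get)
    # Hand off only if the best candidate beats the current cell by the margin.
    if best != current_cell and \
       signal_db[best] >= signal_db[current_cell] + HYSTERESIS_DB:
        return best
    return current_cell

# A caller drives from cell "A" toward cell "B":
print(next_cell("A", {"A": -70.0, "B": -69.0}))  # "A": B not yet enough stronger
print(next_cell("A", {"A": -80.0, "B": -72.0}))  # "B": handoff occurs
```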

Both businesses and individuals grasped cellular’s possibilities immediately. For businesses it transformed travel time into work time; cellular telephony, in short, became a major productivity tool. A widely publicized AT&T study on a matched sample of salespeople, half with cellular phones and half without, showed that the group with phones averaged $11,000 more in sales per year than the group without them. Ameritech, through its mobile phone subsidiary, began commercial cellular service in Illinois on October 13, 1983. In 1984 there were approximately 100,000 cellular subscribers. Five years later the number had skyrocketed to 3.5 million. But even that spectacular growth paled before the increases from 5.3 million in 1990 to 23.2 million in 1994, with the expectation that the number would reach 46.9 million in 2000.34 The only dark spot was that in the 1990s subscribers were on average making fewer calls. Finally, cellular’s spectacular record was not just an American phenomenon. Cellular use was rapidly expanding everywhere, most dramatically in Latin America and Southeast Asia.

As one would expect, the battle to enter this lucrative arena, whose growth was foreseen at the outset, was a particularly intensive one. On the demand side, prices of equipment and service initially precluded widespread use. But as early as 1988 prices had dropped to levels that most analysts had not expected to occur until much later. As Business Week summarized, “Cellular phones already are becoming a shelf item for mass retailers.”35 Economies of scale and widespread foreign-equipment competition contributed to the decline. At the same time newer entrepreneurial companies, like McCaw Cellular, developed novel marketing strategies, such as paying a bounty to retailers who signed up customers for cellular service. Universally held early expectations of a booming industry had not only been met; they had been surpassed.

The F.C.C.’s early responses to the booming industry were criticized in some quarters, but an examination of how the agency shaped the structure of the industry indicates that the actions taken were at least plausible, and arguably better (or not worse) than alternatives not taken. Under the 1934 Communications Act, the agency was empowered to make spectrum allocations for a new technology and to construct a licensing system to determine which applicants would be awarded franchises. The F.C.C. was also empowered to impose technical standards for the equipment employed and how such transmitters and receivers are used. While the United States Supreme Court had admonished the F.C.C. that competition is the general preference in the use of wireless frequencies (in contrast to the preference then, but not requirement, for monopoly in traditional telephone service), the shape and nature of that competition was in the hands of the agency.36 Moreover, the agency’s structural options were complicated during the period that cellular policy was being considered by three other factors. The first was the advent of cable television and the beginning of the breakdown of the traditional division between wire and wireless service, with the former governed by public utility principles. The F.C.C., as we shall see in Chapter 9, was getting mixed signals from courts on how to treat cable. Second, during much of the period in which cellular’s fate was being decided, the more important issue of the ultimate structure of the Bell system was pending in federal district court. That the F.C.C. took a long time to make a final determination and that it changed course should not come as a surprise when we take into account the complexity and the stakes involved. Third, add to this the fact that every state public utility commission could play a role in the eventual outcomes. Because many states had regulated traditional mobile telephone service, the reasonable assumption was that they would assert jurisdiction over cellular.

As we saw, the cellular concept was first proposed in a 1947 internal Bell Labs memorandum. Formal publication of the basic ideas occurred in 1960. By 1970 two Bell Labs scientists had developed computer models of dynamic channel assignment. In 1971 further refinements were made in the engineering of channel assignment. AT&T continued its research into cellular problems through the 1970s. AT&T was, of course, anxious to exploit commercially the new technology that its Bell Labs had developed as soon as it became feasible. Accordingly, as early as 1970 AT&T proposed building the first high-capacity cellular system, which it called Advanced Mobile Phone Service (AMPS). AT&T’s proposal was submitted to the F.C.C. in 1971. AT&T’s submission was, in part, a response to the F.C.C.’s 1968 invitation to develop an “efficient high capacity” mobile system that would alleviate the worsening congestion taking place under the older system.37 In 1970 the F.C.C. authorized AT&T to test the system in Newark and Philadelphia, two densely populated cities with considerable traffic. In 1971 Bell Labs reported that the new system worked very effectively. In 1974 the F.C.C. approved the cellular telephone concept and allocated 40 MHz to it, adding that it would require cellular systems to be compatible. In this way cars would be able to roam from city to city and customers would continue to use their sets regardless of location.38

In its 1974 order the F.C.C. limited the awards of cellular licenses solely to telephone companies on the grounds that cellular was an adjunct to regular telephone service and that AT&T had incurred virtually all of the development costs. Further, the agency concluded that only telephone companies had the technological prowess to resolve the complex installation and operation problems. A court of appeals in 1976 upheld the F.C.C.’s decision to limit the new technology to wireline carriers. Between 1974 and 1979 the F.C.C., adopting a cautious approach, granted construction permits in a number of cities. Meanwhile, developments in other countries were moving more rapidly. Japan instituted a cellular system in the Tokyo metropolitan region in December 1979, while L.M. Ericsson, a Swedish equipment manufacturer, deployed major cellular systems in the Scandinavian countries shortly afterward.39 The standards in these systems were incompatible with the U.S. system.

The F.C.C.’s long-awaited decision was issued in 1981 after examining an enormous volume of findings and considering the views of numerous participants and commentators. Taking into account the argument made against a wireline company monopoly in each market that such a structure would thwart cellular development and eventual competition with conventional telephone service, the agency decided to award two franchises in each market. As we noted, the F.C.C. allocated 40 MHz of spectrum to cellular radio. Because technology then precluded splitting the allocation into less than 20 MHz blocks, only two systems could effectively operate in each market. One company would be a separated subsidiary of a telephone company operating in the area, while the “nonwireline” franchise would be awarded to one among competing applicants. Finally, since the telephone company franchise would be able to operate before the nonwireline carrier, the F.C.C. indicated that it would consider measures to offset that head start.40 In order to move quickly, AT&T and GTE agreed not to compete with each other in the license awards, allocating markets between them. The F.C.C., similarly, encouraged the nonwireline companies to engage in such market division wherever feasible.

The next major event affecting the structure of the cellular market was the AT&T breakup on January 1, 1984, leading to the issue of whether AT&T or the new Bell operating companies should run wireline cellular service. Since a fundamental division in the settlement was between local service and all other services, cellular communication was obviously on the local side. Consequently, the seven RBOCs inherited the AT&T wireline cellular subsidiaries. Nevertheless, the overlap between the RBOC territories and the market areas in which the cellular carriers would operate was far from congruent. For example, northern New Jersey is part of the New York metropolitan area, but the area is divided between two different RBOCs. Accordingly, Judge Greene’s approval was required in such situations—a process that delayed operations for a considerable time after applications were made. But the new RBOCs went further. Since as we have seen the general expectation was that the cellular market would grow rapidly, and each RBOC was now independent of the others, the Bell companies began acquiring nonwireline cellular carriers outside their areas, which was permissible under the consent decree.41

The final issue at which we look is the method by which the F.C.C. would award the nonwireline franchises. The agency had traditionally used a comparative-hearing method in wireless situations—most importantly the award of local radio and television station licenses. Under this system the F.C.C. examines comparative financial data, experience, technological facility, and other factors in determining winners. This process has three major drawbacks. First, it delays awards that are contested by several applicants for a considerable time. Second, such proceedings are often so lengthy and complex that enormous resources are expended on lawyers, experts, and so on, rather than on the service itself. Third, the procedure discourages smaller start-up firms and those willing to take risks. Who, after all, is willing to enter a battle against a huge company with extensive technological experience and resources? But it was primarily the first drawback that led the F.C.C. to abandon the notion, envisioned in its 1981 decision, of comparative hearings. Instead, in an important 1984 ruling, the agency adopted a lottery method for nonwireline carriers.42 Since the competent and incompetent had equal chances to win the lottery in any market, the method led to an explosion in the number of applicants. Some successful applicants had no intention of entering the cellular business and promptly turned around and sold their franchises for enormous profits.

One successful applicant that retained its franchises would radically alter the conception of cellular telephony.

McCaw Cellular and the Growth of
Hypercommunications

In our look at the early development of the Bell system we saw the gradual shift of telecommunication from isolated local markets to transcontinental long distance. Theodore Vail’s vision and leadership triggered this dramatic change and the technological improvements that it demanded. In more recent times, the conception of telecommunications has become global. When cellular telephony began it was primarily conceived as local in nature. But just as Vail broadened horizons in wire communications, Craig McCaw, who led McCaw Cellular Communications from its infancy until its acquisition by AT&T, was one of the leaders in expanding the horizons of wireless communication. McCaw grew up in telecommunications. His father founded the first radio station in Centralia, Washington, in 1937 and began operating one of the nation’s first cable television systems in 1952. Until his death in 1969, John E. McCaw bought, sold, and operated numerous radio and television properties. In 1966 John McCaw began Craig’s communications career by selling Craig and his three brothers the Centralia cable franchise for no cash, taking back preferred stock. Craig was then sixteen years old. Young McCaw successfully operated the cable television system, but in 1981 the company’s revenues were about $5 million—a far cry from the $11.5 billion that AT&T paid to acquire McCaw Cellular in the deal that concluded in September 1994.

McCaw’s interest in cellular radio was piqued when he read in 1981 an AT&T filing with the F.C.C. that projected 900,000 cellular subscribers by 2000—a figure that, of course, turned out to be absurdly low.43 When the F.C.C. began its process of awarding cellular franchises, McCaw was awarded licenses in six of the top thirty markets in 1983. This was only the beginning. McCaw undertook a huge buying spree between 1983 and 1987, largely financed by borrowed money and the sale of the cable television properties. Debt in 1989 approached 87 percent of capital—an extraordinarily high figure. But McCaw Cellular continued buying and selling cellular franchises. In that year McCaw Cellular outbid BellSouth for a controlling interest in Lin Broadcasting, a major cellular operator with licenses in Dallas, Houston, Los Angeles, and New York. Behind the buying and selling, McCaw had a strategy to completely transform the cellular telephone industry from one with a local focus to a national network. Its operations were shaped in the form of market clusters that were intended to eventually form a national wireless network in the same way that AT&T had been gradually transformed from local systems into a national wireline network earlier in the century. Under the Cellular One system that McCaw devised, a person calling a San Francisco subscriber is automatically switched to that subscriber, who might be visiting in Miami. McCaw, in a word, was implementing a vision of primarily reaching people, not places. His grand plan, called the North American Cellular Network (NACN), would, like AT&T earlier in the century, invite companies that McCaw did not own or control to enter into the network that he hoped would embrace all of North America.

In order to implement the NACN vision, McCaw would have to install a more modern system than the analog one that characterized the first phase of cellular communication. At the same time the commitment to primarily reach people not places implied two other undertakings. First, McCaw would have to be able to reach people in places that cellular did not ordinarily reach, such as in airplanes. Second, McCaw would have to offer not just cellular voice communications but a full range of services that corporate customers would require, including paging, voice mail, and data communications. Corporate customers, McCaw realized, would be attracted to the idea of one-stop shopping for the full gamut of their communications needs. In order to carry forward the first leg of its strategy, McCaw announced in 1990 that it would replace existing switches (largely made by Motorola or AT&T) with new digital ones made by an Ericsson–General Electric joint venture. The uniform digital switching structure would allow information to move rapidly through the homogeneous system. At the same time the system would link with Rogers Cantel, the powerful Canadian cellular operator that already used such switches.44 Underlying this move was, of course, the idea that the company that controls an industry standard has a better—although far from certain—chance to dominate the industry than its competitors.

McCaw saw the need generally to move to digital transmission from AT&T’s AMPS concept that was devised in the 1970s. Only in this way would it be possible to achieve the goal of reaching a person anywhere in North America, wherever he or she might be temporarily located. During the 1980s, the AMPS networks were reaching the limits of their capacity. To provide extra capacity, D-AMPS (Digital AMPS) was developed and brought into service in 1992, initially in Hong Kong. Ericsson gained an early lead in D-AMPS, one advantage of which was that it could be integrated into an AMPS system rather than having to fully replace it. D-AMPS and AMPS could coexist because they used the same radio frequencies. But in contrast to AMPS, D-AMPS is transmitted as a series of digital pulses with technology that allows far more information to be squeezed into the same frequency. By 1993 McCaw was offering the digital service in south Florida with conversion in other clusters following apace. Digital cellular not only permits distortion-free calls and raises wireless signal quality up to wireline standards, it also allows the wireless service to offer a full range of value-added services, such as paging, through the same customer equipment. McCaw was also a leader in transmitting data employing the cellular digital packet data (CDPD) system. This system employs existing D-AMPS/AMPS networks, allowing wireless data and voice cellular service to be combined within the same frequency bands and channels. CDPD is able to detect and utilize the approximately 30 percent of the actual airtime when a voice channel is unused and send data in small packets and short bursts during these periods.45
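The CDPD idea just described—data packets opportunistically riding idle voice channels, always yielding to voice—can be sketched as a toy scheduler. Everything here is illustrative (the names and the one-packet-per-timeslot simplification are assumptions, not the actual CDPD protocol machinery):

```python
# Toy sketch of CDPD-style channel hopping: data waits whenever voice
# occupies every channel, and sends a short burst on any idle channel.
# Illustrative only; not the real CDPD protocol.

def send_packets(packets, idle_schedule):
    """idle_schedule: for each timeslot, the set of idle voice-channel ids.
    Returns the (slot, channel) used for each packet, one packet per slot."""
    sent, queue = [], list(packets)
    for slot, idle_channels in enumerate(idle_schedule):
        if not queue:
            break
        if idle_channels:            # hop onto any currently idle channel
            queue.pop(0)
            sent.append((slot, min(idle_channels)))
        # if no channel is idle, the packet waits; voice always has priority
    return sent

# Three packets over four timeslots; idle channels change as calls start/end:
schedule = [{1, 3}, set(), {2}, {1}]
print(send_packets(["p1", "p2", "p3"], schedule))
# [(0, 1), (2, 2), (3, 1)] -- slot 1 is skipped because voice fills all channels
```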

As cellular phones became truly mobile and could be used away from vehicles, even while walking or riding a bicycle, another link in McCaw’s dream of being able to telephone anywhere at any time was communication between an airplane passenger and a ground station. Air-to-ground communication was pioneered by In-Flight Phone Corporation. This innovative company offered not only air-to-ground service, but a viewing screen and keypad with the telephone, allowing fax transmission, the ability to listen to radio stations and play games, and other information services as well. GTE Airfone, a division of GTE, also entered the airphone market and was soon joined by Claircom Communications, at first a joint venture of McCaw and Hughes Electronics but later a subsidiary of McCaw alone. The digital service was made available not only for commercial airlines but for corporate jets as well. The three companies then undertook the next step of linking their systems with satellites to provide worldwide access and making their systems compatible with the European flight telephone system. While doubts remain about the commercial viability of airphone service, the significance lies in the promise of wireless communication that can reach anywhere.46 The development of higher-quality cordless phones in the 902 to 928 MHz range with greater range and clearer sound than 1980s models reinforced McCaw’s vision that wireless was not simply an addendum to wireline transmission but a full-scale complement and competitor.47

The inexorable next step in the drive toward market convergence was the marriage of a major wireless carrier to a major wireline one. Wireline had many more subscribers, but wireless had a far greater rate of growth. The first—and eventually rejected—engagement was Nextel Communications and MCI. Nextel was founded in 1987 by a former F.C.C. attorney and a CPA under the name Fleet Call. Their idea was to establish specialized mobile radio (SMR) services in major metropolitan markets. SMR’s initial use was for dispatch service with which dispatchers are able to maintain contact with drivers in vehicle fleets, such as trucks or taxicabs. Nextel, like McCaw, had plans to use its frequencies for much more than what the F.C.C. initially intended the assigned frequencies to do. After a purchasing spree of dispatch frequencies through the fall of 1993, Nextel was ready to make its biggest move. By acquiring approximately twenty-five hundred frequencies from Motorola in exchange for stock, Nextel had the potential to serve 180 million customers in twenty-one states, including forty-five of the fifty largest cities. At that point, Nextel had access to three times the customers that McCaw had.48

Nextel’s move was based not so much on enlarging its dispatch service, but much more importantly on becoming a full-scale competitor in all wireless services. Its ambition was premised on converting its service to a digital one using Motorola technology and equipment, thereby being able to transmit far more information over each frequency than its analog system could. The green light to fully use SMR frequencies for paging, cellular, and data uses came out of a 1992 General Accounting Office study that concluded the F.C.C.’s duopolistic cellular radio structure provided insufficient competition. Congress reinforced this finding in the Omnibus Budget Reconciliation Act of 1993, which authorized the F.C.C. for the first time to use competitive bidding in the form of an auction to award spectrum licenses. These steps compelled the F.C.C. to allow Nextel and other “wide area” SMR carriers to modify their networks in order to provide a full range of wireless services.49 The inevitable next step in market convergence stemmed from wireless’s increasing ability to compete effectively with wireline services. That was, of course, attempts to form either tight-knit or loose-knit combinations of wireless and wireline companies. In the case of Nextel and MCI, the engagement never turned into a marriage, but in the case of AT&T and McCaw Cellular the sacred rites were performed. Ironically, however, Craig McCaw, after becoming AT&T’s largest shareholder, agreed in 1995 to invest heavily in Nextel.

Why was one marriage consummated but not the other? When MCI announced in March 1994 that it would invest $1.3 billion in Nextel, most analysts viewed the alliance as a necessary defense to the pending AT&T—McCaw Cellular alliance. MCI’s expected acquisition of 17 percent of Nextel’s stock would provide Nextel with an enormous cash investment to fulfill its ambitious plans. MCI would be in a position to remain competitive with AT&T, which contemplated constructing a seamless wireless-wireline network that would take market convergence to a still higher level. Not only would every telecommunications service be provided, the service would be available everywhere. For example, the Nextel acquisition would allow MCI entry into local telephone markets. Comcast Corporation, which offered cable television and cellular service in the Northeast, then owned 17 percent of Nextel, offering the promise of providing entertainment, as well. At the time of the proposed acquisition Nextel already offered combined paging, cellular phone, and dispatch service in Los Angeles and planned to extend such service to other locales soon.50

MCI’s $1.3 billion investment was considered a bargain compared to AT&T’s more than $15 billion acquisition of McCaw Cellular and Lin Broadcasting. But alas, either because of technological difficulties with Nextel’s digital technology, Nextel’s stock price, or the advent of newer and better technologies that we will examine in the next section, MCI canceled its preliminary agreement in September 1994, later pursuing other wireless moves.51 In contrast, the AT&T acquisition of McCaw announced in August 1993 was consummated, notwithstanding difficult legal hurdles stemming from a bitter battle conducted against the merger by AT&T’s former children, the regional Bell operating companies. By late September 1994 the acquisition was complete, with McCaw Cellular becoming the AT&T Wireless Services Division. AT&T president Robert E. Allen emphasized that market convergence was the underlying reason for the acquisition: “By joining forces with McCaw, we are taking a giant step forward in ‘anytime, anywhere’ communications…. Together we will not only satisfy customers’ growing appetite for mobility, we will serve up that mobility with a range of wireless services that goes beyond anything most of us have ever imagined.”52 In October 1996 AT&T was on the way to fulfilling its expectations, unveiling a national digital wireless service combining phone, data, and paging functions in a single piece of equipment. AT&T called its new offering AT&T Digital PCS, but whether it, in fact, deserved to use the acronym PCS—the newest technology—was heatedly contested by its rivals. Enlarging its offerings further, in February 1997 AT&T announced a wireless local phone system using a radio transceiver box attached to internal phone wiring.53

Personal Communications Services (PCS)

Interest in personal communications services (PCS), a term that is sometimes used ambiguously to cover every future wireless technology, was triggered by a 1989 British report that called for the United Kingdom to become the world’s wireless leader. The British government responded to the report by licensing four companies to provide “telepoint services”—essentially cordless pay phones that allow subscribers to originate, but not receive, short-range telephone calls with base stations in areas such as shopping malls and airports. Telepoint service never took off, but its introduction stimulated the F.C.C. to issue experimental licenses based on its promise.54 The agency (as well as new and old industry players) began the PCS inquiry, not on the basis of a new and explicit technology as in the case of cellular radio, but rather on the basis of a set of expectations, or perhaps hopes. For this reason various PCS definitions have been ambiguous and sometimes inconsistent.

Because of this ambiguity, it is best to begin with the expectations to which all interested parties subscribe, rather than the definitions. No one has summed up the expectations better than attorney Thomas A. Monheim:

Imagine having one, permanent telephone number and a small, wireless telephone that you could carry and use everywhere—at home, at the office or in the car. Imagine having a laptop computer that had built-in radio functions and was connected to a wireless local network. Imagine having a sensor in your car that was part of an intelligent vehicle highway system that kept you apprised of traffic conditions and ideal commuting routes. These are just a few of the scenarios that may become everyday reality with the development of advanced personal communications services (PCS).55

One important goal of PCS, then, is to establish a system of person-to-person, in addition to station-to-station, communication—a system of location independence.56 Of course, cellular radio provides some location independence, but as the above quote indicates, considerably more is expected of PCS. Cellular systems, which are analog or mixed analog and digital, have a range of about twenty miles from transmitter stations and operate best in open spaces. In contrast, PCS systems will operate at higher frequencies; transmit all calls digitally; have (at present) a range of about one thousand feet from a transmitter station; operate equally well in homes, offices, and even tunnels; and offer voice quality consistent with wireline or better. PCS is also expected to solve two of the major interconnected problems that cellular transmission presented, by greatly enlarging the available message capacity and increasing the number of competitors in each market.

One should also note that the future holds in store the prospect of at least two satellite phone services using low-earth-orbit satellites (LEOs) that are now under construction and also promise person-to-person or point-to-point service anywhere on earth. These projects, whose estimated construction costs are about $6 billion each, are Iridium, a consortium of companies led by Motorola, and Globalstar, a consortium led by Qualcomm. At some stage it is anticipated that prices for satellite phone service will fall rapidly and the number of subscribers will commensurately increase.

With this background, we can now examine the definitions of PCS. We begin with the F.C.C.’s: “a family of mobile or portable radio communications services with the ability to provide services to individuals and businesses virtually anywhere and anytime.”57 Accordingly, the broad PCS concept embraces a new generation of cordless telephones, paging services, car telephones, portable facsimile devices, and so on that can send and receive voice, video, data, and any other conceivable kind of information. The F.C.C. began its policymaking in 1990 and by January 1992 instituted a proceeding that allocated 220 MHz of spectrum in the 1850 to 2200 MHz range. Additional frequency allocations occurred, including the 900 MHz band that, as we have seen, has been allocated to cordless phones that can be used for PCS, and the 2 GHz portion of the spectrum. In 1993 that year’s budget statute called for a competitive spectrum auction process, for which the commission had to provide rules in 1994.58 To suggest that the auction process has been a financial success would be an understatement. As the auctions proceeded, the United States government took in more than $1 billion by late September 1996. Notably, the successful bidders included both established companies, such as AT&T and Sprint, and relative newcomers.59

But will the PCS future be fulfilled economically or technologically? As we have seen in every chapter, the number of variables makes it impossible to confidently predict the future. We are always surprised by outcomes as well as winners and losers. We can, however, look at the major difficulties that will be faced. First, cost estimates to build the PCS infrastructure range as high as $25 billion, in no small part because the microcell structure and short transmission range require many more towers than cellular does. But not every technological advance, as we know, meets with consumer acceptance. Will PCS be viewed as just another competitor with cellular and, therefore, get into a price war? Or will consumers view it as something significantly different?60

Even the kind of digital technology that will provide the PCS standard has not yet been determined. In one corner are AT&T, Ericsson, and other telecommunications giants who are advancing time division multiple access (TDMA), in which signals from several sources share a single channel by each transmitting in discrete, assigned time slots, as we saw in our discussion of CDPD. In the other corner is code division multiple access (CDMA), a more revolutionary technology associated with Qualcomm, Inc., that purportedly offers much greater capacity and security than TDMA. CDMA breaks the data in a call into digital packets and assigns a computer code to each one. These packets are intermingled with packets from other calls and unscrambled at the receiving end. And there are yet other technologies that compete with TDMA and CDMA. The exceptional potential of CDMA to increase capacity and enhance security, as well as successful tests in Korea and Hong Kong, may eventually make it the dominant PCS technology.61 Whatever the outcome, the once sharp division between wire and wireless has now closed. The markets have converged.
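The intermingling and unscrambling that CDMA performs can be illustrated with a toy sketch. This is a simplified illustration only, with hypothetical user names and short orthogonal Walsh spreading codes; real CDMA systems use long pseudo-noise sequences and must handle timing, power control, and channel noise:

```python
# Toy illustration of CDMA spreading and despreading, assuming
# idealized, perfectly synchronized users and orthogonal codes.
# Each user's bit is multiplied by that user's code chips; the channel
# carries the sum of all users' signals; the receiver recovers each
# user's bit by correlating the combined signal with that user's code.

# Orthogonal spreading codes (rows of a 4x4 Walsh-Hadamard matrix).
CODES = {
    "alice": [1, 1, 1, 1],
    "bob":   [1, -1, 1, -1],
}

def spread(bit, code):
    """Map bit {0, 1} to symbol {-1, +1}, then multiply by each chip."""
    symbol = 1 if bit == 1 else -1
    return [symbol * chip for chip in code]

def despread(signal, code):
    """Correlate the combined signal with one user's code to recover the bit."""
    correlation = sum(s * c for s, c in zip(signal, code))
    return 1 if correlation > 0 else 0

# Two users transmit simultaneously over the same channel:
# their spread signals simply add, chip by chip.
channel = [a + b for a, b in zip(spread(1, CODES["alice"]),
                                 spread(0, CODES["bob"]))]

print(despread(channel, CODES["alice"]))  # recovers Alice's bit: 1
print(despread(channel, CODES["bob"]))    # recovers Bob's bit: 0
```

Because the codes are mutually orthogonal, each user's contribution cancels out when the channel is correlated with another user's code, which is how many calls can occupy the same frequency band at once.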
