Appendix A. Wireless Protocols

If you’re new to mobile development, the plethora of wireless telephony acronyms can be confusing at first. The good news is that, for the most part, you can ignore them because you don’t know exactly which environment your application will run in. The bad news is that your application should be prepared to run in all of the environments.

To help you follow the debates, standards, and discussions that inevitably arise when discussing cellular and wireless technologies, this appendix introduces the main protocols in historical order.

Prehistory

When mobile phones were first invented in the 1940s, they were just analog radios driven from a car battery. The system was aptly named Mobile Telephone Service (MTS), and it was woefully inadequate. In spite of the high cost of service, waiting lists to obtain it were long, because MTS offered only a few channels in any given area. An “improved” version called IMTS, introduced in the 1960s, helped some, but still fell far short of demand.

The first analog cellular radio mobile phone systems started to appear in 1969 and the early 1970s—with phones still the size of a briefcase. The various cellular technologies in North America converged around the Advanced Mobile Phone Service (AMPS) standard, still analog technology but now based on cellular radios that could reuse the frequency spectrum and were standardized across manufacturers. At this time Europe had no fewer than nine different analog mobile phone technology standards, one for each major region and country on the continent.

The Dawn of Second Generation (2G) Digital Cellular

Roaming in Europe was obviously impossible. Partly to alleviate this problem, the European operators decided to standardize the next generation of mobile phones, an effort later taken over by the European Telecommunications Standards Institute (ETSI). During the 1980s, this effort produced a digital mobile phone standard known as GSM (originally Groupe Spécial Mobile, later Global System for Mobile Communications). The GSM standard included something termed Short Message Service (SMS), which used spare bandwidth on the control channel to send and receive short messages of up to 160 characters.
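
The 160-character limit falls directly out of SMS's 140-octet payload: GSM encodes each character of its default alphabet in 7 bits and packs the septets tightly, so 160 × 7 = 1,120 bits fit exactly into 140 bytes. The sketch below illustrates the packing; it is a simplification that assumes each character's 7-bit code equals its ASCII value, which holds for most basic Latin characters but not for the full GSM alphabet.

```python
def pack_7bit(text: str) -> bytes:
    """Pack text into GSM-style 7-bit septets (simplified: assumes each
    character's 7-bit code equals its ASCII value)."""
    bits = 0        # bit accumulator
    nbits = 0       # number of bits currently queued in the accumulator
    out = bytearray()
    for ch in text:
        code = ord(ch)
        if code > 0x7F:
            raise ValueError("character not representable in 7 bits")
        bits |= code << nbits   # append this septet above the queued bits
        nbits += 7
        while nbits >= 8:       # emit completed octets
            out.append(bits & 0xFF)
            bits >>= 8
            nbits -= 8
    if nbits:                   # flush any leftover bits, zero-padded
        out.append(bits & 0xFF)
    return bytes(out)

print(len(pack_7bit("A" * 160)))  # 140 -- a full SMS fits in 140 octets
```

Note that 160 septets pack with no wasted bits at all, which is why the limit is exactly 160 rather than a rounder number.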

The GSM system and some other digital cellular standards (such as the digital successor to AMPS in North America, D-AMPS, or IS-54) multiplex different voice callers on a common radio frequency by using time division multiplexing (Time Division Multiple Access, or TDMA). Essentially, the signal from each user is rapidly sampled, and samples from different users are interleaved and broadcast in assigned time slots. The sampled speech is reassembled at the receiving end, and in this way multiple users can share a single radio channel.
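
The slot-interleaving idea can be sketched in a few lines. This is purely illustrative—real TDMA frames also carry guard periods, training sequences, and control bits—but it shows how round-robin time slots let several streams share one channel and still be separated cleanly at the receiver:

```python
# Toy time-division multiplexing: samples from several users are
# interleaved into fixed, repeating time slots on one shared channel,
# then reassembled per user at the receiving end.

def tdma_multiplex(streams):
    """Interleave equal-length per-user sample streams, one slot each."""
    return [sample for frame in zip(*streams) for sample in frame]

def tdma_demultiplex(channel, num_users):
    """Recover each user's samples from its assigned slot position."""
    return [channel[i::num_users] for i in range(num_users)]

users = [["a0", "a1", "a2"],   # user A's speech samples
         ["b0", "b1", "b2"],   # user B's
         ["c0", "c1", "c2"]]   # user C's
shared = tdma_multiplex(users)
print(shared)   # ['a0', 'b0', 'c0', 'a1', 'b1', 'c1', 'a2', 'b2', 'c2']
print(tdma_demultiplex(shared, 3) == users)   # True
```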

The cellular protocols are actually quite a bit more complex than this simple explanation implies. While the signal is being sampled and reassembled, it is also hopping through a preset sequence of frequencies, and samples are being reordered in time, all to reduce mobile radio impairments such as interference, jitter, dropouts, and multipath distortion.

In the very late 1980s, Qualcomm introduced a new digital system in the U.S. termed CDMA, for Code Division Multiple Access (later also called IS-95 and still later cdmaOne). Instead of dividing the channel into time slots, CDMA spread every user's signal across the same wide band of frequencies and transmitted them all at the same time.

But how to keep the signals from interfering with each other? In CDMA, each user's signal is multiplied by an orthogonal spreading “code”—a fast chip sequence unique to that user—before transmission. At the receiving end, the composite signal is correlated against the same code: the desired signal adds up coherently, while the other users' signals, which don't match the code, average out as low-level noise. CDMA proved to be much more efficient in its use of spectrum than TDMA, but GSM had already taken hold and remained the more popular standard worldwide.
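
The code-separation trick can be demonstrated with two orthogonal Walsh codes—the same family of codes IS-95 uses for channelization, though at a toy scale here. Both users transmit simultaneously, their signals simply add on the channel, and correlating against either code recovers that user's bits:

```python
# Toy direct-sequence CDMA: each user's bits are multiplied by an
# orthogonal chipping code; the users' signals add on the shared
# channel, and correlation against one code extracts that user's bits
# while the other user's contribution cancels out.

# Two orthogonal 4-chip Walsh codes (their dot product is zero).
CODE_A = [+1, +1, -1, -1]
CODE_B = [+1, -1, +1, -1]

def spread(bits, code):
    """Map each bit (0/1) to +/-1 and multiply it by the chip sequence."""
    return [(1 if b else -1) * chip for b in bits for chip in code]

def despread(signal, code):
    """Correlate each chip-length segment with the code; the sign of
    the correlation gives the transmitted bit."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else 0)
    return bits

bits_a, bits_b = [1, 0, 1], [0, 0, 1]
channel = [a + b for a, b in zip(spread(bits_a, CODE_A),
                                 spread(bits_b, CODE_B))]  # signals add
print(despread(channel, CODE_A))   # [1, 0, 1]
print(despread(channel, CODE_B))   # [0, 0, 1]
```

Because the codes are orthogonal, each correlation sees only its own user's energy; with non-orthogonal codes the cross-terms would instead appear as residual noise, which is the situation the averaging argument in the text describes.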

The 2G mobile protocols were mainly designed for voice, but also provided the first real channels for data. At first the data rates were slow, the coverage spotty, and the technology inefficient in its use of the available bandwidth because it was based on circuit switching. The optimistically named High Speed Circuit Switched Data (HSCSD) system used multiple GSM channels and was rated at 28.8 to 64 kilobits per second, though it rarely achieved even a fraction of that speed. In the 1990s, HSCSD was replaced with the General Packet Radio Service (GPRS) standard, the first packet-switched technology for GSM.

Improved Digital Cellular (2.5G)

In the late 1990s, operators could see that demand for voice phones was saturating. They could foresee the day when everybody who wanted a mobile phone would have one. At the same time, the Internet was becoming ubiquitous, and users were starting to demand better data access from their mobile phones. Operators looked for ways to expand the data capacity of their mobile networks while taking advantage of their existing infrastructure investments. GSM operators expanded their GSM/GPRS networks to a new standard called Enhanced Data rates for GSM Evolution (EDGE), which further improved available data rates and made efficient use of GSM equipment the operators already had installed. CDMA operators capitalized on similar improvements in that domain, with standards such as CDMA2000 1X. The theoretical data rates were now in the hundreds of kilobits per second, though the actual data rates were still much lower. Phones running Android can be expected to have at least 2.5G data connectivity.

A second wave of data access improvement (sometimes referred to as 2.75G) further improved data rates, implemented by High Speed Packet Access (HSPA) for GSM and EV-DO (EVolution Data Optimized, or sometimes translated as EVolution Data Only) for CDMA. Theoretical data rates were now in the multimegabit-per-second range, and most Android phones can be expected to have these technologies, if not 3G.

The Rise of 3G

Also in the 1990s, the European telecom community started defining the next generation of mobile technology, first through ETSI and then through a new organization called the 3rd Generation Partnership Project (3GPP). The standard developed by 3GPP is called the Universal Mobile Telecommunications System (UMTS), and though it is based fundamentally on Wideband CDMA (WCDMA) technology, it was carefully designed to allow both GSM and CDMA operators to evolve their networks efficiently from their installed infrastructure to the new standard. This would allow operators around the world to converge on a new common standard for 3G.[2]

In the early 2000s, operators spent huge sums of money to purchase spectrum for 3G wireless networks. 3G networks are now being deployed worldwide, and over the next few years, new smartphones (including Android-based phones) will all incorporate 3G technologies.

The Future: 4G

So what’s next? The standards bodies are back at work defining the fourth generation of wireless network protocols, sometimes termed LTE (for Long Term Evolution). The apparent winner is a family of techniques based on Orthogonal Frequency Division Multiplexing (OFDM), or sometimes OFDMA (the “A” is for Access), which divide the channel into many narrow, mutually orthogonal radio frequency subcarriers to further improve the data rates achievable for wireless devices. Similar techniques are used in the WiMAX standards (the higher bandwidth, longer-range follow-on to WiFi), but it is not clear how WiMAX and LTE will relate to one another.
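
The subcarrier idea is mathematically an inverse discrete Fourier transform: each data symbol modulates one subcarrier, the subcarriers are summed into a single time-domain signal, and a forward DFT at the receiver separates them again because the subcarriers are orthogonal over the symbol period. This toy sketch (omitting the cyclic prefixes, pilots, and equalization real OFDM systems need) shows the round trip:

```python
import cmath

N = 8  # number of subcarriers

def ofdm_modulate(symbols):
    """Inverse DFT: sum of N subcarriers, each weighted by one symbol."""
    return [sum(s * cmath.exp(2j * cmath.pi * k * n / N)
                for k, s in enumerate(symbols)) / N
            for n in range(N)]

def ofdm_demodulate(samples):
    """Forward DFT: correlate the time signal against each subcarrier;
    orthogonality means each correlation recovers exactly one symbol."""
    return [sum(x * cmath.exp(-2j * cmath.pi * k * n / N)
                for n, x in enumerate(samples))
            for k in range(N)]

tx = [1, -1, 1, 1, -1, -1, 1, -1]       # one BPSK symbol per subcarrier
rx = ofdm_demodulate(ofdm_modulate(tx))
print([round(s.real) for s in rx])      # [1, -1, 1, 1, -1, -1, 1, -1]
```

In practice the transforms are computed with the FFT, which is a large part of why OFDM became economical for consumer hardware.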

Just as with 3G, a round of spectrum auctions is starting to take place for 4G, and operators are already investing large sums of money in getting ready for 4G services. Suffice it to say that your applications built for Android will someday encounter phones running 4G protocols, and will be able to take advantage of the higher data rates and lower latencies that will come with those protocols.

To wrap up, Figure A-1 shows the evolution of the protocols discussed in this appendix in relation to the decade in which they were first deployed and the effective bandwidth they achieve.

Figure A-1. Mobile protocols, bandwidth, and dates of deployment


[2] Except for operators in the People’s Republic of China, where the government mandated its own version of UMTS, called Time Division-Synchronous Code Division Multiple Access (TD-SCDMA). TD-SCDMA uses TDMA as well as CDMA to provide some unique advantages for data traffic. It also avoids the need for PRC handset makers to pay royalties for most WCDMA intellectual property.
