CHAPTER 16

Mobile Networking

The CompTIA Network+ certification exam expects you to know how to

•   1.5 Compare and contrast the characteristics of network topologies, types and technologies

•   1.6 Given a scenario, implement the appropriate wireless technologies and configurations

•   3.5 Identify policies and best practices

•   4.3 Given a scenario, secure a basic wireless network

•   4.5 Given a scenario, implement network device hardening

To achieve these goals, you must be able to

•   Explain the capabilities of different mobile networking technologies

•   Describe common deployment schemes for mobile devices

•   Deal with sample security issues with mobile devices


Your author, being ancient in computer years, often stares in amazement at the networking technologies built into his mobile devices. Every smartphone has cellular; that’s how you make phone calls. The cheapest $35 tablet almost certainly has 802.11 Wi-Fi and Bluetooth, while even a mid-range smartphone contains as many as eight completely different network technologies for communication.

All these technologies require the attention of every security professional, particularly when mobile devices appear in an organization’s infrastructure. The various network technologies give attackers multiple opportunities to wreak havoc on the devices and maybe even use those devices to access the organization’s infrastructure. IT security requires a good understanding of these network technologies, plus clear methods of managing mobile devices in the organization.

This chapter begins with an overview of all the common network technologies found on mobile devices. From there, the chapter covers several methods of managing and deploying mobile devices in the infrastructure, and then closes with a few common security scenarios and how to deal with them.

Mobile Network Technologies

All of today’s mobile devices are certainly powerful computers, but what makes these devices truly mobile (as opposed to little more than an electronic notebook) is wireless communication. Without the capability to communicate wirelessly with other devices, a mobile device is much less useful. Every mobile device has at least one network technology installed, enabling these devices to connect to other devices, separate networks, or the entire Internet.

Mobile devices incorporate many network technologies, from the common and well known (such as 802.11 Wi-Fi) to the more rare and unique (such as Z-Wave). The proliferation of these network technologies has led to miniaturization and economies of scale, making it simple for even inexpensive devices to incorporate four or more network technologies.


Images

NOTE  The technology advancements used in mobile devices enable almost any type of device to connect to the Internet. Light bulbs, cameras, thermostats—devices that classically would be considered “dumb”—can possess many of the same technologies seen in “smart” devices. Embedded devices that connect to the Internet and enable remote monitoring and controlling are known as Internet of Things (IoT) devices. Given that these devices’ networking capabilities are all but identical to those of mobile devices, they’ll receive occasional mention in this chapter.

Test Specific

Cellular WAN

Anyone with a smartphone these days can enjoy the convenience of using wireless cellular technology on the road. Who doesn’t love firing up an Android phone or iPhone and cruising the Internet from anywhere? As cell-phone technology converges with Internet access technologies, competent techs need to understand what’s happening behind the scenes. That means tackling an alphabet soup of standards.

Regardless of the standard, the voice and data used on smartphones (unless you have 802.11 wireless turned on) move through a cellular wireless network of towers that cover the world (Figure 16-1). This is the WAN transmission medium.

Images

Figure 16-1 Cellular tower

Mobile data services started in the mid-1980s and, as you might imagine, have gone through a dizzying number of standards and protocols, all of which have been revised, improved, abandoned, and reworked. Instead of trying to advertise these fairly complex and intimidating technologies directly, the industry came up with the marketing term generations, abbreviated by a number followed by the letter G: 2G, 3G, and 4G.

Salespeople and TV commercials use these terms to push mobile cellular services. The generation terms aren’t generally used within the industry, and certainly not at a deeply technical level. As I go through the standards you’ll see on the exam and encounter in real life, I’ll mention both the technical name and the generation where applicable. I’ll cover five common terms here:

•  GSM and EDGE

•  CDMA

•  HSPA+

•  LTE

GSM and EDGE

The Global System for Mobile Communications (GSM), the first group of networking technologies widely applied to mobile devices, relied on a type of time-division multiplexing called time-division multiple access (TDMA). TDMA enabled multiple users to share the same channel more or less at the same time; the switching from one user to another happened so quickly that no one noticed.
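The time-slicing idea behind TDMA can be illustrated with a toy sketch (this is purely conceptual, not real GSM — the user names and slot count are invented for illustration):

```python
# Toy TDMA illustration: time is chopped into numbered slots on one
# shared channel, and each user transmits only during its assigned
# slot. The slots repeat so quickly that users appear simultaneous.

def tdma_schedule(users, num_slots):
    """Assign each user a repeating time slot on a single channel."""
    return {user: i % num_slots for i, user in enumerate(users)}

def who_transmits(schedule, slot_number, num_slots):
    """Return the users allowed to transmit during a given slot."""
    active = slot_number % num_slots
    return [u for u, s in schedule.items() if s == active]

schedule = tdma_schedule(["alice", "bob", "carol"], num_slots=3)
for slot in range(6):
    print(slot, who_transmits(schedule, slot, 3))
```

Each user gets the channel once per cycle — slot 0 and slot 3 both belong to the first user, and so on.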


Images

NOTE  There’s no “C” on the end of GSM because it originally came from a French term, Groupe Spécial Mobile.

GSM introduced the handy subscriber identity module (SIM) card that is now ubiquitous in smartphones (Figure 16-2). The SIM card identifies the phone, enabling access to the cellular networks, and stores some other information (contents differ according to many factors, none relevant for this discussion).

Images

Figure 16-2 SIM card in phone

The GSM standard was considered a 2G technology. The standard continued to improve over the years, getting new names and better data speeds. One of the last of these (and one you might see on the exam) was Enhanced Data rates for GSM Evolution (EDGE), which offered data speeds up to 384 Kbps.

CDMA

Code-division multiple access (CDMA) came out not long after GSM, but used a spread-spectrum form of transmission that was totally incompatible with GSM’s TDMA. Rather than enabling multiple users to share a single channel by splitting the channel into time slices, spread-spectrum transmission changed the frequencies used by each user.

CDMA was considered superior to GSM, and U.S. carriers adopted CDMA en masse, which created some problems later since the rest of the world went GSM. Plus, CDMA lacked some key features, such as SIM cards.


Images

NOTE  The original CDMA was considered a 2G technology.

HSPA+

In the late 1990s the International Telecommunication Union (ITU) forwarded a standard called International Mobile Telecommunications-2000 (IMT-2000) to address shortcomings in mobile technology. IMT-2000 defined higher speeds, support for full-time Internet connections, and other critical functions. The standard pushed support for multimedia messaging system (MMS) (so you can send cat pictures in your text messages) and IP-based telephony.

Both GSM and CDMA improved during the late 1990s to the mid-2000s to address IMT-2000. All these improvements were marketed under probably the most confusing term ever used: 3G. Ideally, 3G meant a technology that supported IMT-2000, although the industry was very lax in how companies used this term. (This time period is so confusing that many technologies in this period were given decimal generations to clarify the situation. One example is GSM EDGE being called 2.9G due to its lack of full IMT-2000 support.)

Evolved High-Speed Packet Access (HSPA+) was the final 3G data standard, providing theoretical speeds up to 168 Mbps, although most HSPA+ implementations rarely exceeded 10 Mbps. (Note that the CompTIA Network+ objectives list an earlier version, High-Speed Packet Access [HSPA]).

LTE

Devices and networks using Long Term Evolution (LTE) technology rolled out worldwide in the early 2010s and now dominate wireless services. Marketed as, and now generally accepted as, a true 4G technology, LTE networks feature speeds of up to 300 Mbps download and 75 Mbps upload. All LTE services use SIM cards such as the one shown in Figure 16-3. Note the SIM size in Figure 16-3 compared to the much older SIM in Figure 16-2: the much smaller SIM in Figure 16-3 is a nano-SIM, while the SIM in Figure 16-2 is an original, standard SIM.

Images

Figure 16-3 nano-SIM

Smartphones have LTE radios built in, but it’s easy to add LTE to almost any device. Need LTE on a laptop or a desktop? No problem, get an LTE NIC and just plug it into a convenient USB port (Figure 16-4).

Images

Figure 16-4 Cellular wireless NIC on USB stick

802.11

Chapter 14 covered the 802.11 wireless standard in exhaustive detail, so there’s no reason to repeat those details here. Be aware that any 802.11-capable device, even a simple IoT thermostat, suffers from all the same configuration issues discussed in Chapter 14.

There are a few aspects of how mobile devices, specifically smartphones, use 802.11 that can create issues in some situations. One issue is a wireless hotspot—a feature that enables a device to share its cellular WAN connection by setting up an SSID via its own 802.11 wireless. Hotspots can be dedicated devices, or simply a feature of a modern smartphone (Figure 16-5).

Images

Figure 16-5 Setting up a hotspot on an Android phone

If you’re one of those folks who uses a lot of data on your smartphone, it’s generally a good idea to use 802.11 when available to avoid cellular WAN data use charges, but you really must be careful. Let’s explore the benefits and pitfalls in more detail.

Every mobile OS maker provides features to help phones use 802.11 networks when available. These features vary dramatically between Android and iOS. Even on Android devices, the service provider might make changes to the interface as well, adding even more variance. Figure 16-6 shows some of these settings on an Android phone.

Images

Figure 16-6 Automatic 802.11 network connection options on Android

These settings, while convenient, create serious security concerns. Notice the option to connect to open (no password) networks. Open (also known as public since anyone may access them) wireless networks are very risky. Bad actors constantly monitor coffee shops, airports, and other public Wi-Fi networks, looking for ways to capture personal information. This is such a big problem today that many experts strongly recommend that you never use public Wi-Fi.

I disagree. I think you can use public Wi-Fi safely, but only if you’re a wise user. Make sure to run HTTPS on all your browsers and watch for screens that don’t make sense (like asking you to log in with some account that has nothing to do with what you’re doing at that moment online). Most mobile operating systems provide special functions, such as requiring HTTPS Web pages or VPNs for any public connection.

Saved wireless profiles are also an issue in some cases. For example, many ISPs provide a standard wireless name for public access of all their access points (Xfinity and AT&T Wi-Fi are two examples). These SSIDs aren’t really public, as they require logons, but attackers often create spoofed access points to get users to log in using their ISP credentials (Figure 16-7).

Images

Figure 16-7 Fake Xfinity access page

Bluetooth

Bluetooth, that wonderful technology that connects headsets to smartphones and keyboards to laptops, is a wireless networking protocol, similar to 802.11, developed to simplify connections between devices. You bring two Bluetooth-aware devices into close proximity, perform a quick pairing between the two, and they connect and work. Connected Bluetooth devices create a personal area network (PAN), and security concerns apply to all the devices and communication within that PAN.

Bluetooth has two important tools to make using it more secure. First, today’s Bluetooth devices are not visible unless they are manually set to discoverable mode. If you have a Bluetooth headset and you want to pair it to your smartphone, there is some place on the phone where you turn on this mode, as shown in Figure 16-8.

Images

Figure 16-8 Setting up discoverable mode

The beauty of discoverable mode is that it stays on for only a limited time, usually two minutes, before the device automatically turns it off and is once again invisible. Granted, the radio communication between two paired devices can be accessed by powerful, sophisticated Bluetooth sniffers, but these sniffers are expensive, difficult to purchase, and of interest only to law enforcement and very nerdy people.

The second tool is the requirement of a four-digit personal identification number (PIN) during the pairing process. When one device initiates pairing with a discoverable Bluetooth device, the other device generates a four-digit PIN. The first device must then enter that PIN for the pairing to take place.
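The PIN challenge just described can be sketched in a few lines (a simplified model only — real Bluetooth pairing derives encryption link keys from the PIN, which this sketch omits; the function names are invented):

```python
# Simplified model of the legacy Bluetooth PIN challenge: the
# discoverable device generates a four-digit PIN, and the initiating
# device must supply a matching PIN before pairing completes.
import secrets

def generate_pairing_pin():
    """Discoverable device generates a random four-digit PIN."""
    return f"{secrets.randbelow(10000):04d}"

def attempt_pairing(expected_pin, entered_pin):
    """Initiating device supplies a PIN; pairing succeeds on a match."""
    # compare_digest avoids leaking match position via timing
    return secrets.compare_digest(expected_pin, entered_pin)

pin = generate_pairing_pin()
print(attempt_pairing(pin, pin))      # matching PIN pairs
print(attempt_pairing(pin, "xxxx"))   # wrong PIN is rejected
```

A four-digit PIN offers only 10,000 possibilities, which is one reason later Bluetooth versions moved to stronger pairing methods.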

The problem with these two security tools is that they weren’t required on the early generations of Bluetooth. From the time Bluetooth devices started coming out in the late 1990s up until around 2005, a number of devices skipped the discovery mode—basically just always staying in discoverable mode—or skipped the use of PINs. This foolish choice resulted in several Bluetooth attacks, two of which might show up on the CompTIA Network+ exam: Bluejacking and Bluesnarfing.

Bluejacking was the process of sending unsolicited messages to another Bluetooth device; the messages would simply pop up on the victim’s screen. Bluejacking wasn’t considered anything more than irritating, but Bluesnarfing was another matter. Bluesnarfing used weaknesses in the Bluetooth standard to steal information from other Bluetooth devices. Simply adding proper use of discoverable mode as well as PINs initially stopped these attacks.

Over the years, bad actors have continued to come up with new Bluesnarfing attacks, in response to which the Bluetooth standard has been patched and updated several times. One great example of these updates was making all Bluetooth communication encrypted in Bluetooth version 4.0. This attack–patching–updating cycle continues to this day.

Assuming all your devices support Bluetooth 4.0, your Bluetooth devices are safe and robust out of the box against most attacks. But there’s no such thing as a guarantee in the world of IT security. Bluetooth is a simpler technology and it’s not designed to defend against aggressive, sophisticated attacks. Therefore, the only way to ensure Bluetooth security in a high-security environment is to turn it off (Figure 16-9).

Images

Figure 16-9 Turning off Bluetooth


Images

NOTE  A very notable exception to the safeness (is that a word?) of Bluetooth devices is a set of security vulnerabilities published in late 2017 called BlueBorne. The vulnerabilities were discovered by a security company called Armis; they potentially affect more than 8 billion devices, from smartphones to tablets to cars. You won’t see BlueBorne on the CompTIA Network+ exam, but you should pay attention to the tech news as this could seriously impact you and your clients in the future.

Less Common Mobile Network Technologies

Cellular WANs, 802.11, and Bluetooth are easily the dominant mobile network technologies, but there are a few alternatives that, while not as commonly used, are supported in many mobile devices. This section explores NFC, RFID, infrared, ANT+, Z-Wave, and Zigbee.

NFC

Near field communication (NFC) is a low-speed, short-range technology designed for (among other things) small-value monetary transactions. Most NFC connections consist of an NFC tag that stores private data (Figure 16-10) and some kind of NFC reader. The NFC tag is an unpowered, passive device. Bringing the NFC reader in very close proximity to the NFC tag creates an electromagnetic induction in the NFC tag, enabling the two devices to communicate using only power from the reader. This is very handy for NFC-enabled credit cards and other devices that would otherwise need batteries.

Images

Figure 16-10 NFC tag


Images

NOTE  The fact that NFC tags are passive has a downside. A bad actor with the right reader can read every NFC-enabled card in anyone’s wallet or purse, making a whole new market for metal-shielded card holders that can block a reader.

While NFC contains basic encryption, the technology relies on the short distances (<5 cm) and quick communications (hard to target with a man-in-the-middle attack) to provide security. These are acceptable when using NFC for simple data transfers, as shown in Figure 16-11, but are unacceptable for monetary transfers, because bad actors may gain access to personal information.

Images

Figure 16-11 Transferring a Web URL between two Android phones

All the popular “Tap to Pay” services such as Apple Pay and Android Pay use NFC, but add their own security protocols on top of NFC to ensure the privacy of the transaction.

RFID

Radio-frequency identification (RFID) uses the same concepts of tag and reader as NFC, but has several differences. First, RFID isn’t a single standard as much as it is many standards that use the technology.

Second, different RFID standards use diverse frequencies and unique protocols. While NFC uses the 13.56-MHz band, the different RFID standards span at least six different radio bands, from as low as 120 kHz to as high as 10 GHz. RFID needs these different bands to support its many different applications and environments … which brings up the next point.

Third, RFID has much broader applications, from labeling individual parts in manufacturing, to identifying the different parts of a deep undersea structure, to labeling your pet with an injectable RFID tag. Most RFID devices have some common features:

•  Close proximity   Devices generally communicate within 1 meter.

•  Security   Most RFID standards include weak or no encryption.

•  Labels   RFID tags most commonly appear inside labels, which are in turn placed on whatever component (box, part, device) needs to be queried.

RFID tags are much easier to read than NFC tags; a bad actor can simply walk near the RFID tag with the right reader (as opposed to getting within 5 cm). Therefore, RFID tags tend not to hold personally identifiable information (PII).

Infrared

Infrared (IR) communication consists of a device that emits coded infrared light and a second device that reads the code and acts upon it. The most common infrared devices are the remote controls that come with TVs, tuners, and other electronic devices.


Images

NOTE  Of all the technologies discussed here, only infrared doesn’t use radio waves.

Your author loves his Android phones if for no other reason than only Androids (though not all of them) come with an infrared transmitter known generically as an IR blaster. IR blasters can emulate any IR remote control, turning any Android into a universal remote. Since these devices only transmit simple remote-control commands, there is no risk to your mobile device. The risk is that any IR blaster can quickly take complete control of any device designed to receive those commands, making the Android the bad actor. The only real defense is to block the IR receivers.


Images

EXAM TIP  Wireless network technologies such as Bluetooth, NFC, and IR are designed to make a single point-to-point connection at very short ranges. Bluetooth, for example, may reach as far as 100 meters for the most powerful connections; NFC devices drop that to as little as 5 cm. Any connections used by Bluetooth, NFC, or infrared are called personal area networks (PANs)—as mentioned earlier—to reflect their shorter distances and point-to-point nature.

ANT+

Adaptive Network Technology (ANT+) is a low-speed, low-power networking technology that, similarly to NFC, consists of a passive ANT device and an ANT reader. ANT was developed to fix a problem with alternatives like 802.11 or Bluetooth. 802.11 and Bluetooth are powerful wireless tools, but both of these technologies require rather chatty, relatively power-hungry devices. Bluetooth isn’t nearly as power hungry or as fast as Wi-Fi, but when you have very small devices such as heart-rate monitors, even Bluetooth uses too much power.

Garmin introduced the proprietary ANT protocol around 2007 for low-power sensors that don’t send a lot of data and that often go for extended periods of time without transferring data (like a heart-rate monitor that’s used at most once a week). You’ll also see ANT-capable exercise bicycles and treadmills. (Yes, ANT is popular in health clubs!)

ANT isn’t as common as many other network technologies. Apple devices lack an ANT radio; roughly two-thirds of Android devices have them. If you’re using, for example, an ANT heart-rate monitor and you want to use it with your iPhone, you need an ANT USB dongle to talk to your ANT device.

The latest version of ANT is called ANT+. ANT/ANT+ is a superb protocol that’s incredibly low powered and has enough bandwidth for many devices. ANT is encrypted with AES, so hacking isn’t an issue (yet).

Z-Wave and Zigbee

Home automation—the process of controlling lights, thermostats, cameras, even a washer and dryer remotely—is an ever-growing industry. While wired home automation has been around for a while, connecting smart devices wirelessly is a much newer concept.

Using wireless technology for home automation has many challenges. First is the huge number of IoT devices that a modern home might potentially use, from thermostats to washing machines to power outlets and light bulbs. Second, homes, unlike offices, are filled with small rooms, narrow staircases, and other obstacles that make regular radio-based wireless difficult. Yet demand for home automation is strong, and two technologies, Z-Wave and Zigbee, are in direct, head-to-head competition in wireless home automation. Z-Wave is a proprietary standard (with an open API for programmers), while Zigbee is a completely open standard. Both use a mesh networking topology to facilitate communication in homes, yet both also have hubs that act as the network interconnect.

Deployment Models

Organizations that choose to use mobile devices must create some methods of getting those devices into their employees’ hands. This isn’t a matter of simply purchasing devices for each person. Users have strong opinions about their devices; the whole Android versus iOS debate alone is a huge issue. To deal with these challenges, there are specific deployment models to help define how organizations approach the use of mobile devices by employees. This section explores BYOD, COBO, COPE, and CYOD, then looks at general on-boarding and off-boarding procedures.


Images

EXAM TIP  The CompTIA Network+ objectives mention only the BYOD deployment model, not COBO, COPE, or CYOD, so expect a question or two on BYOD. I’ve included the other models here because they’re common in the real world.

BYOD

In a bring your own device (BYOD) deployment, employees use their own existing mobile devices for corporate work. The company installs its applications on employees’ mobile devices, and employees install their corporate e-mail accounts on their personal devices. BYOD requires tight controls and separation of employee personal data from corporate data.

COBO

In a corporate-owned business only (COBO) deployment model, the corporation owns all the devices. The corporation issues mobile devices to employees. The corporation is solely responsible for the maintenance of the devices, the applications, and the data.

COBO is very rigid—nothing but company-approved software is used on the issued mobile devices. This is often a challenge as it requires employees to carry both the corporate device and their own device.

COPE

Corporate-owned, personally-enabled (COPE) is almost identical to COBO in that the organization issues mobile devices. With COPE, however, employees are presented with a whitelist of preapproved applications that they may install.

CYOD

An organization offering choose your own device (CYOD) options provides employees free choice within a catalog of mobile devices. The organization retains complete control and ownership over the mobile devices, although the employees can install their own apps on the mobile devices.

On-Boarding and Off-Boarding

Regardless of the deployment model that an organization chooses, it needs some method of verifying that new mobile devices appearing in the organization’s infrastructure are secure and safe to use within the organization. This procedure is known as on-boarding. As mobile devices leave the control of the organization, those same devices must be confirmed to no longer store any proprietary applications or data, a process called off-boarding.

On-Boarding

On-boarding takes place in several ways, but one common practice is to require any previously unfamiliar mobile device accessing an internal 802.11 network to go through a series of checks and scans. While exactly which checks and scans take place varies depending on the deployment model, a typical BYOD on-boarding process may include the following:

•  A sign-on page requiring a user name and password

•  An authorization page describing all applicable policies to which the user must agree

•  A malware scan of the device

•  An application scan of the device
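The on-boarding checks listed above can be pictured as a simple gate the device must pass in full (a minimal sketch — the check functions and device fields are stand-ins for real captive portals and scanners, not any actual product):

```python
# Sketch of a BYOD on-boarding gate: a device joins the internal
# network only if it passes every check. Each check here is a
# placeholder for a real sign-on page, policy portal, or scanner.

def credential_check(device):   # sign-on page: user name + password
    return device.get("user_authenticated", False)

def policy_check(device):       # user agreed to applicable policies
    return device.get("policies_accepted", False)

def malware_scan(device):       # malware scan came back clean
    return not device.get("malware_found", True)

def app_scan(device):           # application scan found nothing banned
    return not device.get("forbidden_apps", True)

CHECKS = [credential_check, policy_check, malware_scan, app_scan]

def onboard(device):
    """Return True only if the device passes every on-boarding check."""
    return all(check(device) for check in CHECKS)

clean = {"user_authenticated": True, "policies_accepted": True,
         "malware_found": False, "forbidden_apps": False}
print(onboard(clean))  # True
```

Note the fail-closed defaults: a device missing any scan result is treated as failing that check.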

Off-Boarding

Mobile devices move out of organizations just as often as they move in, for various reasons. A person quits and intends to take their phone with them. Mobile devices get old and are due for replacement. An organization that uses tablets for controls switches from iPads to Android tablets. Whatever the case, it’s important that every outgoing device goes through an inspection (sometimes manual, sometimes automatic) that deletes proprietary applications as well as any data that the organization doesn’t want out in the world.

Scenarios

Mobile networking technologies present organizations with several opportunities to enhance IT security beyond what is possible with non-mobile networking technologies. Let’s look at three scenarios unique to mobile and IoT devices and see how organizations can handle them to enhance IT security:

•  Geofencing

•  Locating and disabling lost mobile devices

•  Hardening IoT devices

Geofencing

Geofencing is the process of using a mobile device’s built-in GPS capabilities and mobile networking capabilities to set geographical constraints on where the mobile device can be used. Geofencing works in several ways, and the CompTIA Network+ exam objectives use “given a scenario” language when mentioning it.

An example scenario is that an organization uses very high-security mobile devices that must never be taken outside the large campus. The organization installs a geofencing application on each device. If any device goes outside very specific GPS ranges, the device will notify the administrator via one or more methods.
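The scenario above boils down to a distance check against the campus center. Here is a hedged sketch of that check (the coordinates, radius, and function names are invented for illustration; a real geofencing app would also handle GPS inaccuracy and spoofing):

```python
# Geofence sketch: compute the great-circle (haversine) distance from
# a device's GPS fix to the campus center; flag the device if it has
# left the allowed radius.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

CAMPUS = (29.7604, -95.3698)   # hypothetical campus center
RADIUS_M = 1_000               # allowed radius: 1 km

def outside_geofence(lat, lon):
    """True if the device's fix lies outside the allowed radius."""
    return haversine_m(lat, lon, *CAMPUS) > RADIUS_M

print(outside_geofence(29.7604, -95.3698))  # False: at the center
print(outside_geofence(29.9000, -95.3698))  # True: kilometers away
```

In practice the device would run this check on each GPS fix and, on a True result, trigger the administrator notification described above.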


Images

NOTE  Geofencing is often used in anti-theft applications that will also silently turn on cameras and microphones as well as lock the system.

Locating and Disabling Lost Mobile Devices

Losing a mobile device by accident or theft is never fun, but losing a corporate device, potentially filled with private and proprietary information, is an event that requires quick action on the part of the organization. Generally, the organization should follow at least the following steps if a mobile device is lost or stolen:

1. Report the loss and possible data breach to the appropriate person or group in the organization.

2. Locate and attempt to recover the device if it has a functional tool enabled, such as Find My iPhone.

3. Remote wipe the device if there is a functional tool.

4. Revoke any encryption on the device so that old keys, signatures, or certificates are marked as lost/stolen.

5. Disable all accounts associated with the phone (carrier, apps, sign-in, etc.).

6. Reissue a new device with all apps installed and generate new keys for any encryption (for example, BitLocker).

Try This!

Location Services

It might come as a surprise, but many folks leave their mobile devices … somewhere and then frantically search for them an hour or two later. Typically, the device is found in a car or backpack, and the moment of panic is over. If you don’t discover the missing device during a quick search, however, you can turn to the location services most devices have to try to locate it.

That begs the question, how good are these location services today? Will Apple’s Find My iPhone tell you which room in your house you left the iPhone? How about Android or Windows Phone services? I don’t have the answers, because these services are improving all the time. So try this!

Hand your phone or other mobile device to a family member or very trusted friend. Try different locations, both near and far from the location of the computer you’ll use to log into the location services (such as iCloud for Apple iOS devices, or Google or Microsoft accounts for their respective devices). Do the services just (approximately) locate your device, or can you remotely lock it too?

Hardening IoT Devices

Hardening IoT devices decreases the danger of loss or downtime on the devices and increases the protection of personal information and company data. Generally, hardening means to keep the devices current (software and firmware), use physical security precautions, and apply internal security options.

Consider a scenario in which an organization uses many 802.11 PTZ (pan/tilt/zoom) cameras to monitor secure areas throughout three locations. These cameras are on the one and only SSID in each location. Each SSID uses WPA2 PSK encryption. Due to location, these cameras must use 802.11 for communication. All cameras must be accessible not only in each location but also in headquarters. Your job as a consultant is to provide a list of actions the organization should take to harden these cameras. Here’s a list of actions you should consider:

•  Place all cameras in their own SSID.

•  Put all the camera feeds on their own VLAN.

•  Use a very long PSK.

•  Set up routine queries for camera firmware updates from the manufacturer.

•  Use user name ACLs to determine who may access the cameras.
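The hardening actions above lend themselves to an automated audit. The sketch below checks one camera’s settings against each rule (every field name and rule threshold here is invented for illustration; real cameras expose different settings):

```python
# Sketch: audit a camera's Wi-Fi configuration against the hardening
# list. Each rule returns True when the camera satisfies it; the
# audit reports every rule the camera fails.

HARDENING_RULES = {
    "dedicated_ssid":   lambda c: c["ssid"] not in c["user_ssids"],
    "own_vlan":         lambda c: c["vlan"] != c["default_vlan"],
    "long_psk":         lambda c: len(c["psk"]) >= 20,
    "firmware_current": lambda c: c["firmware"] == c["latest_firmware"],
    "access_acl":       lambda c: len(c["allowed_users"]) > 0,
}

def audit_camera(cam):
    """Return the names of the hardening rules the camera fails."""
    return [name for name, rule in HARDENING_RULES.items()
            if not rule(cam)]

cam = {"ssid": "CAMERAS", "user_ssids": ["CORP"], "vlan": 42,
       "default_vlan": 1, "psk": "x" * 24, "firmware": "2.1",
       "latest_firmware": "2.1", "allowed_users": ["secops"]}
print(audit_camera(cam))  # [] — this camera passes every check
```

A camera on the users’ SSID or with a short PSK would show up in the returned list, giving the consultant a concrete punch list per device.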

Chapter Review

Questions

1. Which cellular WAN technology introduced the concept of the SIM card?

A. CDMA

B. GSM

C. EDGE

D. LTE

2. GSM, EDGE, and LTE all use which of the following? (Select two.)

A. SIM cards

B. CDMA

C. TDMA

D. NFC

3. A thermostat that you can control remotely with an app on your smartphone would certainly fit under which of the following?

A. TDMA

B. BYOD

C. Bluetooth

D. IoT

4. You can reduce the vulnerability of your cell phone when automatically connecting to open SSIDs by:

A. Requiring HTTPS

B. Disabling WPA

C. Forgetting all wireless networks

D. Requiring SSH

5. Which of the following does a classic hotspot require? (Select two.)

A. Bluetooth

B. 802.11

C. Cellular WAN

D. NFC

6. In order to pair to another Bluetooth device, it must be set into _______________ mode.

A. PIN

B. Discoverable

C. Master

D. Pair

7. A Bluetooth PIN code is at least _______________ digits long.

A. 3

B. 4

C. 6

D. 8

8. NFC tags are always:

A. Passive

B. Encoded

C. Spherical

D. Magnetic

9. All tap-to-pay services use which networking technology?

A. 802.11

B. Bluetooth

C. ANT+

D. NFC

10. A TV remote control most likely uses which of the following network technologies?

A. RFID

B. NFC

C. Infrared

D. Bluetooth

11. Which of the following probably uses the least power?

A. ANT+

B. RFID

C. Bluetooth

D. 802.11

12. In which deployment model does the company own all devices, issue whichever device it chooses to a given employee, and retain control of which apps are installed?

A. BYOD

B. COBO

C. COPE

D. CYOD

Answers

1. B. Global System for Mobile Communications (GSM) introduced the SIM card.

2. A, C. GSM, Enhanced Data rates for GSM Evolution (EDGE), and Long Term Evolution (LTE) all use SIM cards and time-division multiple access (TDMA).

3. D. Many devices in the smart home are part of the Internet of Things.

4. A. Requiring HTTPS can reduce the vulnerability of cell phones when connecting to open wireless networks.

5. B, C. Classic hotspots require both 802.11 and cellular WAN access.

6. B. Bluetooth devices need to be in discoverable mode to pair.

7. B. Bluetooth PINs are four digits long.

8. A. Near field communication (NFC) tags are always passive.

9. D. Tap-to-pay services use NFC.

10. C. TV remotes typically use infrared technology.

11. A. ANT+ is miserly with power use.

12. B. In the corporate-owned business-only (COBO) model, the company owns and controls all devices absolutely.
