Chapter 3

Power Management

Introduction

Bluetooth technology finally makes the mobile application a reality. Not only can users be mobile whilst connected but radio networks can also be used in places where fixed infrastructure is too expensive, dangerous, or difficult to deploy. This, however, leaves you with the difficulty that all these devices must be powered using batteries, which have to be frequently recharged or replaced. If the Bluetooth device uses too much power, this can become a real problem.

As an applications designer, you may think there is nothing you can do about the problem—after all, you have no control over the amount of power your hardware consumes. The good news is that Bluetooth application designers do have the ability to improve the power efficiency of their applications. The Bluetooth specification offers a range of power-saving features, tailored to suit the needs of different applications, which can give your applications a real edge.

The drawback (and there always is one) is that if you use these features badly, you will slow down the response time of your application, making it infuriating to use. This chapter will tell you how to get the best of both worlds: save power while still producing usable applications.

Using Power Management: When and Why Is It Necessary?

Before going further, it's worth spending a little time defining what a power-managed application actually is and exploring some of the reasons why such applications are necessary. A power-managed application is one that allows the device it is running on to go into sleep mode for significant portions of its duty cycle. Sleep mode need not involve powering down the whole device; in fact, this is highly unlikely, as certain functional blocks will always need to be powered. However, when a device is in sleep mode it should be consuming significantly less power than when it is fully “awake,” otherwise power management will be a waste of time.

A further characteristic of application level power management is that it should not adversely affect the performance of the application. In fact, the user should not be aware that your application is using power management and that the Bluetooth device is not constantly powered on. Powering down a device at the wrong time can not only result in almost no energy being saved, but it can also make an application virtually unusable by making it slow to respond. Let’s consider the example of a wireless headset and a mobile phone. If the headset is powered down at the wrong time, the phone will not be able to notify it of an incoming call. Even though the headset may be saving significant amounts of power, as far as the user is concerned, it is unusable, because it cannot receive calls in a timely manner.

So, if power management has the potential to make your application unusable or infuriatingly slow, why bother with it? Used in the correct way, the Bluetooth power management modes have the potential to extend the battery life of your device significantly, yet be completely transparent to the user. In general, users do not like having to lug about heavy batteries or recharge their devices frequently. A typical mobile phone has a small battery and yet can last several days without recharging. If adding Bluetooth functionality to such a phone reduces its average battery life significantly, it is unlikely to be popular with users. Power management at both the hardware and software levels of Bluetooth technology is therefore necessary in order to make these networks viable. A further benefit of application power management is that the energy savings are independent of the underlying technology. This means that if through power management you double the battery life of your device, this will hold true even if the power consumption of the underlying hardware were significantly improved.

A relatively minor, but nevertheless important, point to consider is who owns the devices that are being power managed. Often greater power savings can be achieved by one device at the expense of the energy resources of another. An obvious example would be where a device is powered down for the majority of its duty cycle while another device buffers packets destined for it and therefore must be constantly powered on. Periodically, the first device wakes up to pick up these packets, acts on them if necessary and then powers down again. Thus, the first device can achieve very high power savings at the expense of the buffering device. If the same user owns both devices (and especially if one of those devices can be mains powered, e.g., a PC) then this is a very good approach to achieving high power savings. However, if the devices belong to different users then there is an obvious conflict of interests as both users might be keen to prolong the battery life of their particular device rather than altruistically providing a service for others. In this case, a scheme where both devices achieve some, but not maximal, power savings may be a better compromise rather than having no power saving at all. The anticipated uses of a power-managed application can therefore be important in choosing the power management approach taken.

Having discussed how useful power-managed applications can be, it is worth looking at what types of applications are suitable for these techniques and which ones will have their performance adversely affected by power management. The first thing to remember is that in order to save power, the device must be put into sleep mode. Applications that require large amounts of data to be sent or received, or that need very fast response times, are not suitable for power management. On the other hand, applications requiring small amounts of data to be transmitted or where data transfers are infrequent are very well-suited to being powered down for the majority of the time they are inactive. Similarly, applications where a delay in the response time can be tolerated should also consider power management.

Before choosing a given Bluetooth power management mode to use with your application you should consider the maximum amount of time the device can be powered down without adversely affecting the performance of your application. In general, when using power management, an application designer trades off an increase in latency and a decrease in data throughput for an increase in the battery life of the device running the application. The following sections will discuss the Bluetooth power management modes and the use of each mode in the context of different types of applications.

Investigating Bluetooth Power Modes

For most applications, if a connection exists between two or more Bluetooth-enabled devices, one of the Bluetooth low power modes can be used to extend the battery life of either some or all of these devices. In fact, power-managed devices can be in one of four states, listed in order of decreasing power consumption: active, hold, sniff, and park mode. Each of these low power modes will be described, along with a discussion of what type of applications will and will not be suitable for it.

Active Mode

In active mode, the device actively participates on the radio channel. The master schedules data transmissions as necessary and the slaves must listen to all active master-slave slots for packets that may be destined for them. This mode is a useful benchmark for comparison with the performance of the low power modes since it not only consumes the most power but also has the highest achievable data throughput due to the devices being able to use all available slots. The power consumption of Bluetooth devices is highly dependent on the manufacturer of the device and the application that it is running. Furthermore, as the technology matures, the power consumption of Bluetooth-enabled devices will improve further and hence it is best to compare low power modes relative to the active mode.

We will briefly discuss the types of applications best suited to active mode, which are unlikely to benefit from, or be able to use, any of the other low power modes. An application with very high data rate requirements is unlikely to be able to save power, as it will need its radio transceiver powered on for the majority of its duty cycle. Similarly, applications that require very low latencies are also unlikely to be able to use the low power modes, since they would power down for such short periods that the overhead of powering down the device would be greater than the energy saved (while powering down for longer periods would mean the application could no longer meet its latency requirements).

Hold Mode

This is the simplest of the Bluetooth low power modes. The master and slave negotiate how long the slave device will remain in hold mode. Once a connection is in hold mode, it does not support data packets on that connection, and the device can either power save or participate in another piconet. It is important to note that the hold period is negotiated each time hold mode is entered. Figure 3.1 shows what the interaction between two devices using hold mode might look like. A further important aspect of hold mode is that once it has been entered, it cannot be cancelled; the hold period must expire before communications can be resumed.


Figure 3.1 Hold Mode Interaction

Given these constraints, what type of application would benefit from using hold mode? If your application can determine or control the time of its next data transmission, then it can most probably use hold mode for power management. One example of an application that has some degree of control over when its next data transmission should take place is a wireless e-mail delivery system. E-mail is not a synchronous communications medium and messages can take anything from a few seconds to several hours to be delivered to their destination. More importantly, users do not perceive e-mail delivery to be instantaneous and hence would tolerate a small additional delay in favor of extending the battery life of their device. The following sidebar, “Power Management Using Hold Mode,” discusses in more detail how hold mode can be used by such an application, along with power saving techniques available.

Developing & Deploying …

Power Management Using Hold Mode

Given that e-mail is not an instantaneous communications medium and the delivery delays involved can be relatively large, any wireless e-mail delivery system has a lot of flexibility in the way it checks for new messages and sends off ones that have just been written. In fact, if correctly implemented, the delivery delay should not be perceptible to the user.

Let’s assume we have a Bluetooth-enabled organizer that periodically communicates with an access point and retrieves newly arrived e-mails as well as sending off ones that have just been written. A simple way of implementing such a service will be to set up an RFCOMM connection between the two devices and have the checking device periodically search for new e-mails. Placing such a link in hold mode is unlikely to have a significant impact on the delivery time of e-mail and can result in power savings at both ends of the link. Furthermore, as each hold interval is negotiated independently of the previous ones, this gives us the opportunity to write an application that dynamically adapts to its usage. For example, successive hold intervals can be increased by a certain factor (up to a particular ceiling, of course) if there are no e-mails retrieved or sent during the previous “active” period. In the same way, successive hold intervals can be decreased if the frequency of e-mail arrivals increases. This approach allows the application to better adapt to the way it is being used and achieve higher power savings when the load on the radio is light whilst still being responsive at higher usage rates. However, designers of such applications should be careful not to make such transitions too rapid as this may result in a yo-yo effect with the application swinging from one extreme to the other.
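The adaptive back-off described above can be sketched in a few lines of Python. The constants and the function name below are our own illustrative assumptions, not part of any Bluetooth API:

```python
# Adaptive hold-interval policy for the e-mail example (illustrative only).

MIN_HOLD_MS = 500       # shortest hold interval we will request
MAX_HOLD_MS = 30_000    # ceiling on the hold interval
GROW = 1.5              # gentle growth when the link is idle
SHRINK = 0.5            # sharper cut when traffic picks up again

def next_hold_interval(current_ms, mail_moved):
    """Return the hold interval to negotiate for the next cycle.

    mail_moved -- True if any e-mail was sent or received during
    the last active period.
    """
    if mail_moved:
        proposed = current_ms * SHRINK   # traffic flowing: wake more often
    else:
        proposed = current_ms * GROW     # idle: sleep for longer
    return max(MIN_HOLD_MS, min(MAX_HOLD_MS, proposed))
```

Because the growth factor here is gentler than the shrink factor, the interval ramps up gradually when the link is idle but drops quickly when traffic resumes, which helps damp the yo-yo effect mentioned above.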

A further power saving technique at the application level, not directly connected with the use of Bluetooth low power modes, may be to compress data before transmitting it. If a high enough compression ratio can be achieved, the time that the transceiver has to be powered on can be reduced enough to justify the extra work. However, this should also be used with caution. A small device with relatively little computation power will use up energy in compressing (or decompressing) a file and this may offset the savings made in transmitting a smaller file. Such power-saving techniques are highly dependent, not only on the type of data being sent, but also on the underlying hardware.

A very different candidate for hold mode is an application that relies on a SCO link and does not need to send data packets. Furthermore, if the application can tolerate poorer audio quality, it can use fewer slots and hence power down for longer periods of time. For example, a baby monitor needs an active SCO link but does not need the ACL link. Also, given that parents are mainly interested in detecting whether the baby is crying, this application could probably get away with slightly poorer audio quality. By placing the ACL link in hold mode for relatively long periods of time and reducing the quality of the SCO link, the application can achieve greater power savings.

Having discussed application types able to benefit from using hold mode, we will briefly consider applications that should not use this mode, as it is likely to have a negative impact on their performance. Hold mode is not suitable for applications whose traffic pattern is unpredictable and which cannot tolerate unbounded communication latencies. An obvious example is a device that allows a user to browse the Web over a wireless link. Even though access to the World Wide Web is notorious for being slow, if this latency is further increased by using hold mode, the application becomes too frustrating to use. At this point, it’s worth remembering that once entered, hold mode cannot be exited until the negotiated hold interval has expired. Furthermore, the traffic pattern of such an application is impossible to predict due to the nature of Web browsing. The user may make a number of page requests in quick succession whilst searching for a particular page. However, once the page has been found, they may spend considerably longer reading it and not need the wireless link for some time.

A very different application type whose performance will be negatively impacted is a network of sensors which need timely delivery of their data—for instance, intruder detection. Once a sensor has been triggered, fast delivery of this information to the control center is imperative. A sensor with a long battery life that spends much of its day powered down may just give an intruder time enough to avoid being caught.

Sniff Mode

This low power mode achieves power savings by reducing the number of slots in which a master can start a data transmission and correspondingly reducing the slots in which the slaves must listen. The time interval, Tsniff, between the slots in which a master can start transmitting is negotiated between the master and slave when sniff mode is entered. When the slave listens on the channel, it does so for Nsniff attempt slots and can then power down until the end of the current sniff interval. The time of reception of the last data packet destined for the slave is important, as the slave must continue listening for at least Nsniff timeout slots after the last packet is received.

Figure 3.2(A) shows the lower bound on the number of slots that the slave must listen for. In this case it listens for just Nsniff attempt slots. This happens if the last packet for the slave is received when there are more than Nsniff timeout slots remaining in the sniff attempt. The slave simply listens for the remainder of the sniff attempt interval and can then power down.


Figure 3.2 Sniff Mode Interaction

Conversely, Figure 3.2(B) shows a slave listening for an extended period. In this case the slave listens for Nsniff attempt slots, then receives a packet and listens for a further Nsniff timeout slots. This shows how the slave must listen for a further Nsniff timeout slots if the last packet is received when there are fewer than Nsniff timeout slots left in its sniff attempt interval. If the slave continued receiving packets, it would continue listening for Nsniff timeout slots after each packet, so if the master kept on transmitting, the slave would remain continuously active.

The slave can vary its activity from just Nsniff attempt slots through (Nsniff attempt + Nsniff timeout) slots, and even go all the way to continuously active, all without renegotiating any parameters. You can therefore see that by choosing suitable values for the sniff interval and the number of slots that the slave listens for, power savings can be achieved without adversely affecting the performance of the application.
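These bounds translate into a simple duty-cycle estimate. The sketch below ignores wake-up overheads and uses illustrative slot counts; real values are negotiated between master and slave:

```python
def sniff_duty_cycle_bounds(t_sniff, n_attempt, n_timeout):
    """Bounds on the fraction of slots the slave is awake per sniff interval.

    t_sniff   -- sniff interval, in slots
    n_attempt -- Nsniff attempt (slots listened for at each sniff point)
    n_timeout -- Nsniff timeout (extra slots kept awake after a packet)
    """
    low = n_attempt / t_sniff                              # no traffic
    high = min(n_attempt + n_timeout, t_sniff) / t_sniff   # one late packet
    return low, high
```

With a sniff interval of 800 slots and an 8-slot attempt window, for example, the slave's radio is active for only around 1 to 1.5 percent of the slots, although a continuous packet stream can of course extend this all the way to full activity.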

This section will consider which types of applications are suitable for use with sniff mode and which are not. Sniff mode is more flexible than hold mode, since either the master or the slave can request that sniff mode be exited. However, there is a trade-off in the overhead associated with exiting sniff mode, and it is more advantageous to choose the sniff mode parameters so as to minimize the likelihood of having to exit. Since sniff mode requires the slave device to periodically wake up and listen to the radio channel, it is particularly well-suited to applications where devices regularly transmit (or receive) data. An example of such an application is discussed in the case study that follows. Sniff mode can also be used when there is an active SCO link. Once again, by accepting a slight degradation in audio quality, power savings can be achieved, since SCO links using HV2 or HV3 packets can be placed into sniff mode (note that SCO links using HV1 packets can also be placed into sniff mode, but in this case it will have little effect since the device is transmitting in every slot).

Another set of applications that could use sniff mode are ones where the devices can aggregate data and maybe even do a limited amount of processing before communicating with the master. Thus, not only the frequency of communication can be reduced, but also the actual amount of data transmitted. Once again, sensor networks are an obvious area of application. For example, a traffic monitoring system would be wasting resources transmitting every second the number of cars that have passed through a given point. Since the information is not time-critical, the update frequency can be decreased (i.e., the car count is aggregated at the sensor without affecting the performance of the system). However, this need not be limited to sensor applications—for example, the e-mail delivery system described in the previous example could be implemented using sniff mode instead of hold mode.

Application types not particularly well-suited to using sniff mode are ones frequently requiring relatively large data transfers. In this case, the time necessary to transmit the data is important, because if it takes too much time, your application will not be able to power down for very long, if at all. The application itself will not see a degradation in performance, but it will not achieve any power savings either.

Developing & Deploying …

Power-Managed Sensor Networks

One application that Bluetooth seems particularly well-suited for is sensor networks. As the technology matures, single chip Bluetooth solutions will not only become smaller but also much cheaper, making it feasible to embed them into even the cheapest devices. The number of possible sensor applications is virtually infinite. For this example, we shall consider what a patient monitoring system in a hospital might do and how it can benefit from using sniff mode to prolong the battery life of its sensors. Currently, remote monitoring of patients is limited mostly to intensive care wards and usually only one or two of the patient’s vital life signs are monitored. The main reason behind this is that once this information has been collected, it is difficult to disseminate it so that both doctors and nurses have easy access to it. By using wireless sensors, the collected information can be periodically transmitted to a wireless access point and from there stored centrally so it can be accessed from anywhere in the hospital, or even from outside it (e.g., a consultant logging in from home to check up on a patient).

One such system might involve a set of sensors such as heart rate, blood pressure, temperature, and respiration monitors that frequently transmit their readings to a central access point in the ward. This information could then be displayed at the nurses’ station so that patients are monitored continuously. In addition, doctors would be able to access the same information from anywhere in the hospital or even from home using their own Bluetooth-enabled organizer and hence be able to react quickly to changes in the patient’s condition. To save power, the sensors use sniff mode and during the listen slots are addressed by the access point and transmit their readings. The sensor can then power down for the remainder of the sniff interval. This solution has great power-saving potential, but there is one obvious flaw in its design. If a patient suddenly takes a downturn, the sensors might not transmit this information for a relatively long time. This obviously makes the system unusable. However, sniff mode has an important feature in that either the master or the slave can request to exit sniff mode. This would allow a sensor to immediately transmit its readings so that the alarm can be raised. Of course, for such safety-critical applications, it is also crucial to include a back-up emergency alert system that does not rely on radio. Adding a small piezo-electric beeper to each sensor will not significantly increase its size, cost, or power consumption. This can then be used in conjunction with the unsniff procedure, or as an emergency back-up if the sensor is unable to communicate with the master.

Park Mode

Park mode is the Bluetooth low power mode that allows the greatest power savings. However, while parked, a device cannot send or receive user data and cannot have an established SCO link. In this mode, the slave does not participate in the piconet, but nevertheless remains synchronized to the channel. This mode has the further advantage of allowing the master to support more than seven slaves by parking some whilst activating others. A parked slave periodically wakes up in order to resynchronize to the channel and listen for broadcast messages. In order to allow this, the master supports a complicated beacon structure that it communicates to the slave at the time it parks it. However, the beacon structure may change and the master then uses broadcast messages to communicate these changes to the parked slaves.

The structure of the beacon channel is covered in detail in other sources; it is sufficient to say here that once every beacon interval (a set number of time slots), the master transmits a train of beacons that the slave listens for in order to resynchronize to the channel.

As an application designer, you have to choose the correct beacon interval to save the maximum power whilst maintaining acceptable response times. Response times are governed by how long it takes a slave to request an unpark, or how long it takes a master to unpark a slave, both of which are affected by the park beacon interval.

One factor to consider when choosing the park beacon interval is the clock drift in the devices between successive beacons. If a parked slave loses synchronization, it will stop responding to the master and may lose the connection altogether. The master will then have to restore the connection by paging the slave and then parking it again, which is obviously wasteful. Therefore, devices parked for the majority of their duty cycle should have their park beacon intervals set well within the maximum threshold, so that if the slave device misses a beacon it can resynchronize on the next one. So far, park mode sounds very similar to sniff mode. The main difference, however, is that in order to send data packets to a slave, that slave must first be unparked (also, as mentioned earlier, a slave cannot have an established SCO link while parked). The next section will consider the types of applications suitable for use with park mode.
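The drift constraint lends itself to a back-of-the-envelope calculation: a clock rated at d ppm drifts up to d microseconds per elapsed second, and master and slave may drift in opposite directions. The receive window figure below is an illustrative assumption, not a specification limit:

```python
def max_beacon_interval_s(drift_ppm, rx_window_us):
    """Longest gap between beacons before worst-case clock drift
    exceeds the receiver's search window.

    drift_ppm    -- rated clock accuracy of each device (ppm)
    rx_window_us -- how far off-centre a beacon may arrive and still
                    be caught (microseconds; illustrative value)
    """
    worst_drift_us_per_s = 2 * drift_ppm   # both clocks may drift apart
    return rx_window_us / worst_drift_us_per_s
```

For example, two 20 ppm clocks and a 200 microsecond window give a ceiling of 5 seconds; choosing a beacon interval well inside this ceiling means a slave that misses one beacon can still catch the next.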

An application that was described earlier as unsuitable for hold mode is one where a Bluetooth-enabled laptop is used for wireless Web browsing. However, the pattern of usage for such an application does make it particularly suitable for park mode. It consists of “bursts” of activity while the user is searching for a particular page, followed by a relatively long period of inactivity while they are reading that page. The slave device can therefore be parked for the majority of the time, while the radio link is not being used. However, when the user needs to send data (assuming the beacon interval is kept relatively short), the slave can be unparked quickly and the request dispatched. Thus, the application can save power whilst keeping response times short. Another advantage of having a short beacon interval is that the slave device has a greater chance of remaining synchronized with the master. As the case study that follows shows, the Headset profile recommends the use of park mode while the headset and Audio Gateway are not actively communicating. This is another good example of an application suited to park mode, since activity is concentrated in bursts, but the response times are bounded by a maximum tolerable latency.

A network of sensors (as discussed previously) is a good example of an application where park mode is not particularly suitable as a low power mode. This is mainly because in order for the sensors to send their data, they would have to be unparked, allowed to transmit, and then parked again. For very short beacon intervals, this is particularly wasteful due to the overhead of the park/unpark procedure. Furthermore, sniff mode perfectly fits the pattern of the application without imposing this extra overhead. This point illustrates quite nicely the conclusion that there is no preferred low power mode. Each of the Bluetooth low power modes is suited to a different class of applications and must be used accordingly in order to achieve optimal performance (in terms of both power consumption and usability).

Developing & Deploying …

Power Management for the Headset Profile

The Headset profile as defined in the Bluetooth specification (part K-6) is designed to provide two-way audio communications between a headset and an “Audio Gateway,” allowing the user greater freedom of movement while maintaining call privacy. The profile envisages the user wearing a Bluetooth-enabled wireless headset and communicating with, for example, a mobile phone or laptop computer (the Audio Gateway).

This application is a very good example of what could be termed an asymmetrically power-managed application. In this case, the headset has extremely limited energy resources (a coin cell or smaller battery) whose lifetime must be maximized. The Audio Gateway, on the other hand, has considerably greater resources since it is running on a device with a larger battery. The overhead associated with power management should therefore be placed on the Audio Gateway end of the link. By this we mean that not only should the Audio Gateway be responsible for power management on the link but also, if possible, it should use more of its energy resources so that the headset can save more power. Furthermore, as security is an important factor in this application, it is likely that the same user will own both devices and hence it is particularly suitable for asymmetric power management.

A headset must provide pairing functionality, allowing it to set up a link key with the Audio Gateway for security purposes. This is not a state that is likely to be entered frequently, since once the headset is paired it will remain so until it is paired again. The headset must also provide audio transfer functionality, since that is what it is designed to do. Each of these states should be considered with respect to power management.

Whilst pairing, the headset should be in discoverable mode (i.e., it should respond to inquiries and also allow the Audio Gateway to connect to it). In this state, power savings can be achieved by reducing the time the headset spends with its radio transceiver powered on. This can be achieved by setting the page scan and inquiry scan intervals so that the radio is powered on for a relatively small fraction of the time. The downside to this is that the Audio Gateway might take slightly longer to find the headset and pair with it, but this delay is not likely to be significant. Furthermore, given that pairing is performed relatively infrequently, this is not a significant overhead.
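The trade-off between scan duty cycle and discovery latency can be sketched numerically. The slot counts below are illustrative; the Bluetooth specification defines the actual allowable scan windows and intervals:

```python
def scan_trade_off(scan_window, scan_interval):
    """Radio duty cycle and worst-case wait for a scanning device.

    scan_window   -- slots per interval with the receiver powered on
    scan_interval -- slots between the starts of successive scan windows
    (both values are illustrative, not specification defaults)
    """
    duty_cycle = scan_window / scan_interval
    worst_wait = scan_interval - scan_window   # slots until the window reopens
    return duty_cycle, worst_wait
```

Lengthening the scan interval cuts the duty cycle, and hence the headset's power draw, but stretches the worst-case time before the Audio Gateway's inquiry or page can be heard.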

Once the devices have paired and are ready to connect to each other, there are two power-saving strategies to be adopted. The first is saving energy while the devices are attempting to establish an RFCOMM connection, and the second applies once the RFCOMM connection has been established. An RFCOMM connection must be established in order for “AT” commands to be exchanged so that the audio link (through the use of a SCO connection) can be set up. Connection establishment is achieved by placing one device into connectable mode (i.e., into page scan mode) and letting the other initiate the creation of the connection. According to the Headset profile, either the headset or the Audio Gateway can initiate the connection attempt. If the headset is in slave mode (waiting for the Audio Gateway to connect to it), then it can employ the same technique used in pairing: it can save power by reducing the time it spends scanning (i.e., with its radio transceiver powered on).

Once an RFCOMM connection has been established, it can be placed in park mode until a SCO connection is needed. This avoids the overhead of establishing an RFCOMM connection (and tearing it down) every time a call is placed to or from the headset. Once a connection has been parked, either end is allowed to unpark it. This is to allow an incoming call to be placed through to the headset so the user can utilize voice dialing and dial out. Once the audio call has been completed, the SCO is disconnected and the RFCOMM connection is placed in park mode once more. It is important to note that neither the RFCOMM nor the L2CAP channels are released during park mode, so the connection can be brought up very quickly when required. However, while the connection is parked, data cannot be transmitted or received. Figure 3.3 shows how an example headset application can use both sniff and park to reduce its power consumption. An RFCOMM connection and an ongoing voice call (SCO connection) are assumed to exist between the two devices. The first diagram shows that as soon as the voice call is disconnected the RFCOMM link is placed in park mode. Note that either the headset or the Audio Gateway may initiate park. If at some later time either end wishes to transmit data, the connection must first be unparked. Once again, either device may initiate the unpark. At this point zero or more data packets may be sent and a SCO connection may be initiated. The link cannot be parked until the SCO (if created) has been released and there is no data pending transmission. The second diagram in Figure 3.3 shows how sniff mode can also be used by the headset. If, for example, either device expects to have data to transmit shortly after the voice call is disconnected and does not want to incur the overhead associated with entering park mode, it can place the link into sniff mode. In this state, the headset can transmit its button press without exiting sniff. 
Furthermore, a SCO connection can be set up while still in sniff mode, allowing the devices to conserve energy even while a voice call is ongoing. Figure 3.3 shows that an application is not restricted to using just one of the Bluetooth low power modes; by using more than one mode, it can adapt better to its usage pattern.

image

Figure 3.3 Headset Use of Park and Sniff Modes
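The park/sniff rules described above can be captured in a small state-machine sketch. This is not a real Bluetooth API; the class, method names, and state handling below are illustrative assumptions modeling the behavior in the Figure 3.3 narrative: data cannot be sent on a parked link, a SCO cannot exist while its ACL is parked, and the link cannot be parked while a SCO is active or data is pending.

```python
# Minimal sketch of the headset link states described in the text.
# All names here are hypothetical; this models the rules, not an API.

class HeadsetLink:
    def __init__(self):
        self.mode = "active"       # "active", "park", or "sniff"
        self.sco_connected = False
        self.pending_data = 0

    def park(self):
        # The link cannot be parked while a SCO exists or data is pending.
        if self.sco_connected or self.pending_data:
            raise RuntimeError("cannot park: SCO active or data pending")
        self.mode = "park"

    def unpark(self):
        # Either the headset or the Audio Gateway may unpark the link.
        self.mode = "active"

    def enter_sniff(self):
        self.mode = "sniff"

    def send_data(self, packets):
        # Data cannot be sent on a parked connection; sniff allows it.
        if self.mode == "park":
            raise RuntimeError("unpark the link before sending data")
        self.pending_data = 0      # assume packets are drained immediately

    def connect_sco(self):
        # A SCO can be set up in active or sniff mode, but not while parked.
        if self.mode == "park":
            raise RuntimeError("unpark the link before creating a SCO")
        self.sco_connected = True

    def disconnect_sco(self):
        self.sco_connected = False
```

A typical call sequence would be: `disconnect_sco()` when the voice call ends, then `park()`; later, `unpark()` (by either end) before data or a new SCO.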

Evaluating Consumption Levels

As discussed earlier, the Bluetooth low power modes have different characteristics and are suited to different classes of applications. Each low power mode also has a different cost in terms of energy consumption. The power consumption of a device is influenced by the hardware used, the low power parameters negotiated, and the type of application it is running. Absolute values for the average current consumption in each mode are meaningless, since they are highly dependent on the underlying hardware. This section will therefore concentrate on giving a very general indication of the relative power consumption of some of the Bluetooth low power modes.

Figure 3.4 shows a comparison of the average current consumption of a device using different Bluetooth low power modes. Transmission of ACL data has the greatest power cost and will be used as a benchmark against which to compare the other modes. As can be seen, a device in sniff mode consumes more current than a parked device. It is also important to note that the interval used while in sniff or park mode affects the device's power consumption: the shorter the sniff interval or park beacon interval, the more current the device will consume, as it has to "wake up" more frequently in order to service that interval. Of course, the shorter the interval, the lower the communication latency. As you can see, there is always a trade-off to be made between power consumption and latency.

image

Figure 3.4 Relative Current Consumption for Different Bluetooth Low Power Modes
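The interval trade-off can be illustrated with simple duty-cycle arithmetic. The current figures and window length below are hypothetical placeholders, since, as noted above, absolute values depend entirely on the underlying hardware.

```python
def avg_current_ma(interval_ms, window_ms, active_ma, sleep_ma):
    """Estimate average current for a periodic low power mode in which
    the radio wakes for window_ms out of every interval_ms.
    All current values are hypothetical, hardware-dependent figures."""
    duty = window_ms / interval_ms
    return duty * active_ma + (1 - duty) * sleep_ma

# Hypothetical figures: 40 mA while awake, 0.1 mA asleep, 2 ms wake window.
short = avg_current_ma(interval_ms=50,  window_ms=2, active_ma=40, sleep_ma=0.1)
long_ = avg_current_ma(interval_ms=500, window_ms=2, active_ma=40, sleep_ma=0.1)
# The shorter interval lowers latency but raises average current.
```

With these assumed numbers, shortening the interval from 500 ms to 50 ms raises average current by more than a factor of six, which is the trade-off the text describes.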

A device must be in inquiry scan mode in order to be discoverable. Similarly, in order to be connectable, the device must be in page scan mode. Of course, both modes can also be enabled simultaneously. As can be seen from Figure 3.4, inquiry and page scan have a current consumption cost associated with them, and as such should be used only when necessary. For example, if we only need the device to be connectable, then enabling inquiry scan as well will almost double the current consumption of the device without adding any functionality that is actually needed. Furthermore, the scan interval (denoted by i in the graph) and window (denoted by w in the graph) also affect power consumption, so they should be chosen with care.
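The "almost double" effect of enabling a second scan type follows from the same duty-cycle arithmetic. The figures below (a 1280 ms interval, 11.25 ms window, and the current values) are illustrative assumptions; the model also assumes the two scan windows do not overlap.

```python
# Hypothetical values: i = 1280 ms scan interval, w = 11.25 ms window,
# 40 mA with the receiver on, 0.05 mA otherwise.
i, w, rx_ma, idle_ma = 1280.0, 11.25, 40.0, 0.05

def scan_cost_ma(n_scans):
    """Average current with n_scans scan types enabled (page and/or
    inquiry), assuming non-overlapping windows of equal size."""
    duty = n_scans * w / i
    return duty * rx_ma + (1 - duty) * idle_ma

page_only = scan_cost_ma(1)
page_and_inquiry = scan_cost_ma(2)
# Enabling inquiry scan as well roughly doubles the scan-related cost.
```

It is "almost" double rather than exactly double because the small idle current is common to both cases.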

Although the graph in Figure 3.4 gives only a very approximate idea of the relative energy consumption costs of the different Bluetooth low power modes, it is easy to see that significant advantages can be gained by having an application use one or more of these modes.

Summary

This chapter has described the properties of power-managed applications and provided a discussion of why applications for Bluetooth-enabled devices can benefit from the use of power management. It has also detailed the different Bluetooth low power modes, illustrating the use of each one with example applications.

Power-managed applications allow the device to power down for a large part of its duty cycle, thus saving energy and prolonging its battery life. The drawback is that the response time of the application is increased and, if not used correctly, power management can make applications infuriatingly unresponsive. Power management should therefore be completely transparent to the end user: the application should allow the underlying hardware to power down without the user noticing. Bluetooth provides a number of low power modes, and each is suited to a different type of application. Before deciding which power management mode to use, the maximum allowed latency and expected radio traffic pattern of the application must be considered. For applications with very low latency requirements, or a need to transmit very frequently, the overhead incurred in entering and exiting a low power mode may even make its use inefficient.

Bluetooth provides three low power modes for application designers to use: hold, sniff, and park. Each mode has different characteristics and is suitable for a different class of application. Hold mode is suitable for applications that can predict or control the time of their next data transmission. As each hold interval is negotiated independently of subsequent ones, this mode is suitable for adaptive power management, where the application monitors the usage of the link and increases or decreases its sleep time accordingly. Hold mode cannot be exited before the negotiated interval expires, and therefore should not be used for applications with hard latency requirements.

Sniff mode allows a Bluetooth-enabled device to save power by reducing the number of slots in which the master can transmit, thereby reducing the slots the slave must listen to. This mode is more flexible than hold mode, as it can be exited at any time. The slave listens periodically for a number of slots, which makes sniff mode particularly suitable for applications that require data transmission at regular intervals. Applications that are not suitable for sniff mode are ones that frequently require large data transfers, forcing the device to remain awake beyond its sniff interval. This does not have a detrimental effect on the application's performance, but it does not allow the device to achieve its full power saving potential either.

Park mode is the mode that allows the greatest power savings to be made. It is best suited to applications where the radio traffic pattern is unpredictable but the acceptable connection establishment latency has some upper bound. The Headset profile (from the Bluetooth specification) is a good example of such an application: once a call needs to be put through from the Audio Gateway to the headset, the RFCOMM link must be unparked as soon as possible.

The Bluetooth low power modes differ in the power management support they provide, so there is no single mode that is best to use. The mode chosen is determined by a wide range of factors dependent on the type of application and its requirements. When considering which Bluetooth low power mode an application should use, the main factors are:

image Whether the application is suitable for power management

image What is the maximum latency the application can tolerate

image What is the expected radio traffic pattern (random, periodic, bursty, and so on)
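The factors above can be captured in a rough decision sketch. The thresholds and traffic categories below are illustrative assumptions drawn from the mode descriptions in this summary, not values from the Bluetooth specification.

```python
def suggest_mode(power_managed, max_latency_ms, traffic):
    """Rough rule of thumb for picking a low power mode.

    traffic: 'predictable', 'periodic', or 'random'. The 10 ms latency
    threshold is an illustrative assumption, not a specification value."""
    if not power_managed or max_latency_ms < 10:
        # Overhead of entering/exiting a mode is not worth it.
        return "none"
    if traffic == "predictable":
        return "hold"    # next transmission time is known or controlled
    if traffic == "periodic":
        return "sniff"   # regular traffic matches the sniff interval
    return "park"        # unpredictable traffic with bounded latency
```

For example, a headset-like application (random traffic, latency bounded at a few hundred milliseconds) would land on park, matching the Headset profile discussion above.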

Solutions Fast Track

Using Power Management: When and Why Is It Necessary?

image Consider whether your application is suitable for power-managed operation.

image Consider the constraints imposed by the application (e.g., maximum response times, characteristics of the data traffic, and so on).

Investigating Bluetooth Power Modes

image Hold mode One-off event, allowing a device to be placed into hold mode for a negotiated period of time. Hold interval must be negotiated each time this mode is entered.

image Sniff mode Slave periodically listens to the master and can power save for the remainder of the time. Important to note that data can be transferred while devices are in this mode and a SCO link may be active. Sniff intervals are negotiated once, before sniff is entered, and remain valid until sniff mode is exited.

image Park mode Parked slave periodically synchronizes with the master and for the remainder of the time can power save. Data packets cannot be sent on a parked connection and the devices must be unparked before a SCO connection can be established. Furthermore, there cannot be an active SCO when its associated ACL is parked.

Evaluating Consumption Levels

image All other things being equal, the power consumption of a Bluetooth low power mode depends on the parameters negotiated before that mode is entered.

image Page and inquiry scan also have a power consumption cost, so these should be entered only when necessary.

Frequently Asked Questions

The following Frequently Asked Questions, answered by the authors of this book, are designed to both measure your understanding of the concepts presented in this chapter and to assist you with real-life implementation of these concepts. To have your questions about this chapter answered by the author, browse to www.syngress.com/solutions and click on the “Ask the Author” form.

Q: Why don’t low power modes work between Bluetooth devices of different versions?

A: Between versions 1.0b and 1.1, improvements were made to the link management protocol messages that put a device into hold, park, or sniff mode. These improvements made entering the low power modes much more reliable. However, because the protocol messages changed, devices using the old version of the protocol cannot use the low power modes with devices using the new version.

Q: Which versions of the Bluetooth specification are compatible for low power modes?

A: The changes in the link management protocol messages were first introduced as errata to the 1.0b specification. Changes that were required to interoperate with version 1.1 of the specification were labeled “critical errata.” So:

image “1.0b plus critical errata” should be compatible with 1.1.

image 1.0b is not compatible with 1.1 or “1.0b plus critical errata.”

image Any version should be compatible with the same version, but there have been interoperability problems with older versions, caused by ambiguity in the specification.

Q: What is the best power saving mode to use?

A: There is no “best” mode, it depends upon the requirements of your application. Look at the case studies in this chapter and consider the requirements of your particular application to decide which power saving mode is best for you.
