Power Management
Bluetooth technology finally makes the mobile application a reality. Not only can users be mobile whilst connected, but radio networks can also be used in places where fixed infrastructure is too expensive, dangerous, or difficult to deploy. This, however, leaves you with the difficulty that all these devices must be powered by batteries, which have to be frequently recharged or replaced. If the Bluetooth device uses too much power, this can become a real problem.
As an applications designer, you may think there is nothing you can do about the problem; after all, you have no control over the amount of power your hardware consumes. The good news is that Bluetooth application designers do have the ability to improve the power efficiency of their applications. The Bluetooth specification offers a range of power-saving features, tailored to suit the needs of different applications, which can give your applications a real edge.
The drawback (and there always is one) is that if you use these features badly, you will slow down the response time of your application, making it infuriating to use. This chapter will tell you how to get the best of both worlds: save power while still producing usable applications.
Before going further, it's worth spending a little time defining what a power-managed application actually is and exploring some of the reasons why such applications are necessary. A power-managed application is one that allows the device it is running on to go into sleep mode for significant portions of its duty cycle. Sleep mode need not involve powering down the whole device; in fact, this is highly unlikely, as certain functional blocks will always need to be powered. However, when a device is in sleep mode it should be consuming significantly less power than when it is fully “awake,” otherwise power management will be a waste of time.
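To see why sleep mode matters so much, consider a rough duty-cycle calculation. The sketch below shows how battery life scales with the fraction of time a device spends asleep; the current and capacity figures are illustrative assumptions, not measurements of any particular Bluetooth device.

```python
ACTIVE_MA = 50.0     # assumed current while fully awake (mA)
SLEEP_MA = 0.5       # assumed current in sleep mode (mA)
BATTERY_MAH = 600.0  # assumed battery capacity (mAh)

def battery_life_hours(sleep_fraction):
    """Battery life given the fraction of the duty cycle spent asleep."""
    avg_ma = ACTIVE_MA * (1 - sleep_fraction) + SLEEP_MA * sleep_fraction
    return BATTERY_MAH / avg_ma

always_on = battery_life_hours(0.0)       # 12 hours on this battery
mostly_asleep = battery_life_hours(0.99)  # roughly 25 days
```

Under these assumptions, sleeping for 99 percent of the duty cycle stretches a half-day battery to several weeks, which is why the sleep-mode current must be genuinely small for power management to pay off.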
A further characteristic of application level power management is that it should not adversely affect the performance of the application. In fact, the user should not be aware that your application is using power management and that the Bluetooth device is not constantly powered on. Powering down a device at the wrong time can not only result in almost no energy being saved, but it can also make an application virtually unusable by making it slow to respond. Let’s consider the example of a wireless headset and a mobile phone. If the headset is powered down at the wrong time, the phone will not be able to notify it of an incoming call. Even though the headset may be saving significant amounts of power, as far as the user is concerned, it is unusable, because it cannot receive calls in a timely manner.
So, if power management has the potential to make your application unusable or infuriatingly slow, why bother with it? Used in the correct way, the Bluetooth power management modes have the potential to extend the battery life of your device significantly, yet be completely transparent to the user. In general, users do not like having to lug about heavy batteries or recharge their devices frequently. A typical mobile phone has a small battery and yet can last several days without recharging. If adding Bluetooth functionality to such a phone reduces its average battery life significantly, it is unlikely to be popular with the user. Power management at both the hardware and software levels of Bluetooth technology is therefore necessary in order to make these networks viable. A further benefit of application power management is that the energy savings are independent of the underlying technology. This means that if through power management you double the battery life of your device, this will hold true even if the power consumption of the underlying hardware was significantly improved.
A relatively minor, but nevertheless important, point to consider is who owns the devices that are being power managed. Often greater power savings can be achieved by one device at the expense of the energy resources of another. An obvious example would be where a device is powered down for the majority of its duty cycle while another device buffers packets destined for it and therefore must be constantly powered on. Periodically, the first device wakes up to pick up these packets, acts on them if necessary and then powers down again. Thus, the first device can achieve very high power savings at the expense of the buffering device. If the same user owns both devices (and especially if one of those devices can be mains powered, e.g., a PC) then this is a very good approach to achieving high power savings. However, if the devices belong to different users then there is an obvious conflict of interests as both users might be keen to prolong the battery life of their particular device rather than altruistically providing a service for others. In this case, a scheme where both devices achieve some, but not maximal, power savings may be a better compromise rather than having no power saving at all. The anticipated uses of a power-managed application can therefore be important in choosing the power management approach taken.
Having discussed how useful power-managed applications can be, it is worth looking at what types of applications are suitable for these techniques and which ones will have their performance adversely affected by power management. The first thing to remember is that in order to save power, the device must be put into sleep mode. Applications that require large amounts of data to be sent or received, or that need very fast response times, are not suitable for power management. On the other hand, applications requiring small amounts of data to be transmitted or where data transfers are infrequent are very well-suited to being powered down for the majority of the time they are inactive. Similarly, applications where a delay in the response time can be tolerated should also consider power management.
Before choosing a given Bluetooth power management mode to use with your application you should consider the maximum amount of time the device can be powered down without adversely affecting the performance of your application. In general, when using power management, an application designer trades off an increase in latency and a decrease in data throughput for an increase in the battery life of the device running the application. The following sections will discuss the Bluetooth power management modes and the use of each mode in the context of different types of applications.
For most applications, if a connection exists between two or more Bluetooth-enabled devices, one of the Bluetooth low power modes can be used to extend the battery life of either some or all of these devices. In fact, power-managed devices can be in one of four states, listed in order of decreasing power consumption: active, hold, sniff, and park mode. Each of these low power modes will be described, along with a discussion of what type of applications will and will not be suitable for it.
Active Mode
In active mode, the device actively participates on the radio channel. The master schedules data transmissions as necessary and the slaves must listen to all active master-slave slots for packets that may be destined for them. This mode is a useful benchmark for comparison with the performance of the low power modes since it not only consumes the most power but also has the highest achievable data throughput due to the devices being able to use all available slots. The power consumption of Bluetooth devices is highly dependent on the manufacturer of the device and the application that it is running. Furthermore, as the technology matures, the power consumption of Bluetooth-enabled devices will improve further and hence it is best to compare low power modes relative to the active mode.
We will briefly discuss the type of applications best suited to active mode, which are unlikely to benefit or be able to utilize any of the other low power modes. An application that has very high data rate requirements is unlikely to power save as it will need to have its radio transceiver powered on for the majority of its duty cycle. Similarly, applications that require very low latencies are also unlikely to be able to use the low power modes since they will power down for such short periods that the overhead in powering down the device will be greater than the energy saving made (or powering down for longer periods will mean the application is no longer able to conform to its latency requirements).
Hold Mode
This is the simplest of the Bluetooth low power modes. The master and slave negotiate how long the slave device will remain in hold mode. Once a connection is in hold mode, it does not support data packets on that connection and can either power save or participate in another piconet. It is important to note that the hold period is negotiated each time hold mode is entered. Figure 3.1 shows what the interaction between two devices using hold mode might look like. A further important aspect of hold mode is that once it has been entered, it cannot be cancelled and the hold period must expire before communications can be resumed.
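At the HCI level, hold mode is requested with the HCI_Hold_Mode command, which carries the connection handle and the maximum and minimum acceptable hold intervals expressed in baseband slots of 625 µs. The sketch below builds the raw command packet from the Bluetooth 1.1 HCI definitions; the UART-style 0x01 packet indicator and the example connection handle are assumptions for illustration, not tied to any particular stack.

```python
import struct

SLOT_US = 625  # one Bluetooth baseband slot is 625 microseconds

def hold_mode_cmd(conn_handle, max_ms, min_ms):
    """Build a raw HCI_Hold_Mode command packet (Link Policy OGF 0x02,
    OCF 0x0001). Intervals are given in milliseconds and converted
    to baseband slots, as the command expects."""
    max_slots = int(max_ms * 1000 // SLOT_US)
    min_slots = int(min_ms * 1000 // SLOT_US)
    opcode = (0x02 << 10) | 0x0001  # OGF and OCF packed into 16 bits
    params = struct.pack('<HHH', conn_handle, max_slots, min_slots)
    # 0x01 marks an HCI command packet on a UART-style transport
    return struct.pack('<BHB', 0x01, opcode, len(params)) + params

# Ask for a hold of between 250 ms (400 slots) and 500 ms (800 slots)
pkt = hold_mode_cmd(0x0001, max_ms=500, min_ms=250)
```

Because the interval is negotiated afresh on every entry into hold mode, an adaptive application could recompute these arguments each time, lengthening the hold while the link is idle and shortening it when traffic picks up.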
Given these constraints, what type of application would benefit from using hold mode? If your application can determine or control the time of its next data transmission, then it can most probably use hold mode for power management. One example of an application that has some degree of control over when its next data transmission should take place is a wireless e-mail delivery system. E-mail is not a synchronous communications medium and messages can take anything from a few seconds to several hours to be delivered to their destination. More importantly, users do not perceive e-mail delivery to be instantaneous and hence would tolerate a small additional delay in favor of extending the battery life of their device. The following sidebar, “Power Management Using Hold Mode,” discusses in more detail how hold mode can be used by such an application, along with power saving techniques available.
A very different candidate for hold mode is one which relies on the use of a SCO link and does not need to send data packets. Furthermore, if the application can tolerate a poorer audio quality it can use fewer slots and hence power down for longer periods of time. For example, a baby monitor needs to have an active SCO link but does not need the ACL link. Also, given that parents are mainly interested in detecting whether the baby is crying or not, this application could probably get away with a slightly poorer quality of audio. By placing the ACL link in hold mode for relatively long periods of time and reducing the quality of the SCO link, the application can achieve greater power savings.
Having discussed application types able to benefit from using hold mode, we will briefly consider applications that should not use this mode, since it is likely to have a negative impact on their performance. Hold mode is not suitable for applications whose traffic pattern is unpredictable and which cannot tolerate unbounded communication latencies. An obvious example is a device that allows a user to browse the Web over a wireless link. Even though access to the World Wide Web is notorious for being slow, if this latency is further increased by using hold mode, the application becomes too frustrating to use. At this point, it is worth remembering that once entered, hold mode cannot be exited until the negotiated hold interval has expired. Furthermore, the traffic pattern of such an application is impossible to predict due to the nature of Web browsing. The user may make a number of page requests in quick succession whilst browsing for a particular page. However, once the page has been found, they may spend considerably longer looking at it and not need the wireless link for some time.
A very different application type whose performance will be negatively impacted is a network of sensors which need timely delivery of their data—for instance, intruder detection. Once a sensor has been triggered, fast delivery of this information to the control center is imperative. A sensor with a long battery life that spends much of its day powered down may just give an intruder time enough to avoid being caught.
Sniff Mode
This low power mode achieves power savings by reducing the number of slots in which a master can start a data transmission and correspondingly reducing the slots in which the slaves must listen. The time interval, Tsniff, between the slots in which a master can start transmitting is negotiated between the master and slave when sniff mode is entered. When the slave listens on the channel, it does so for Nsniff attempt slots and can then power down until the end of the current sniff interval. The time of reception of the last data packet destined for the slave is important, as the slave must listen for at least Nsniff timeout slots after the last packet is received.
Figure 3.2(A) shows the lower bound on the number of slots that the slave must listen for. In this case it listens for just Nsniff attempt slots. This happens if the last packet for the slave is received while more than Nsniff timeout slots remain in the sniff attempt: the slave simply listens for the remainder of the sniff attempt interval and can then power down.
Conversely, Figure 3.2(B) shows a slave listening for an extended period. In this case the slave listens for Nsniff attempt slots, then receives a packet and listens for a further Nsniff timeout slots. This shows how the slave must listen for a further Nsniff timeout slots if the last packet is received when there are fewer than Nsniff timeout slots left in its sniff attempt interval. If the slave continued receiving packets, it would continue listening for Nsniff timeout slots after the last packet is received, so if the master kept on transmitting, the slave would remain continuously active.
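The listening rule illustrated by Figures 3.2(A) and 3.2(B) can be captured in a few lines. The helper below is a sketch for illustration, not part of any Bluetooth API; slot offsets are counted from the start of the sniff attempt, and packets are assumed to arrive while the slave is still listening.

```python
def slave_listen_slots(attempt, timeout, packet_slots):
    """How many slots a sniffing slave stays awake in one sniff
    interval: at least `attempt` (Nsniff attempt) slots, plus
    `timeout` (Nsniff timeout) slots after the last packet
    addressed to it. `packet_slots` are the slot offsets at which
    packets for the slave arrive."""
    end = attempt
    for p in sorted(packet_slots):
        # A packet only extends listening if it arrives while the
        # slave is still awake and pushes the deadline further out.
        if p < end and p + timeout > end:
            end = p + timeout
    return end

early = slave_listen_slots(8, 4, [1])           # Figure 3.2(A): just 8 slots
late = slave_listen_slots(8, 4, [7])            # Figure 3.2(B): extended to 11
steady = slave_listen_slots(8, 4, [7, 10, 13])  # master keeps sending: 17
```

The third call shows the continuous case: each new packet keeps pushing the deadline out by the timeout, so a steadily transmitting master keeps the slave awake indefinitely.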
The slave can thus vary its activity from just Nsniff attempt slots, through (Nsniff attempt + Nsniff timeout) slots, all the way to continuously active, without renegotiating any parameters. You can therefore see that by choosing suitable values for the sniff interval and the number of slots that the slave listens for, power savings can be achieved without adversely affecting the performance of the application.
This section will consider what types of applications are suitable for use with sniff mode and which are not. Sniff mode is more flexible than hold mode since either the master or the slave can request that sniff mode be exited. However, there is a trade-off in the overhead associated with exiting sniff mode, and it is more advantageous to choose the sniff mode parameters so as to minimize the likelihood of exit. Since sniff mode requires the slave device to periodically wake up and listen to the radio channel, it is particularly well-suited to applications where devices regularly transmit (or receive) data. An example of such an application is discussed in the case study that follows. Sniff mode can also be used when there is an active SCO link. Once again, by accepting a slight degradation in the audio quality, power savings can be achieved, since SCO links using HV2 or HV3 packets can be placed into sniff mode (note that SCO links using HV1 packets can also be placed into sniff mode, but in this case it will not have much effect, since the device is transmitting in every slot).
Another set of applications that could use sniff mode are ones where the devices can aggregate data and maybe even do a limited amount of processing before communicating with the master. Thus, not only the frequency of communication can be reduced, but also the actual amount of data transmitted. Once again, sensor networks are an obvious area of application. For example, a traffic monitoring system would be wasting resources transmitting every second the number of cars that have passed through a given point. Since the information is not time-critical, the update frequency can be decreased (i.e., the car count is aggregated at the sensor without affecting the performance of the system). However, this need not be limited to sensor applications—for example, the e-mail delivery system described in the previous example could be implemented using sniff mode instead of hold mode.
Application types not particularly well-suited to using sniff mode are ones frequently requiring relatively large data transfers. In this case, the time necessary to transmit the data is important, because if it takes too much time, your application will not be able to power down for very long, if at all. The application itself will not see a degradation in performance, but it will not achieve any power savings either.
Park Mode
Park mode is the Bluetooth low power mode that allows the greatest power savings. However, while parked, a device cannot send or receive user data and cannot have an established SCO link. In this mode, the slave does not participate in the piconet, but nevertheless remains synchronized to the channel. This mode has the further advantage of allowing the master to support more than seven slaves by parking some whilst activating others. A parked slave periodically wakes up in order to resynchronize to the channel and listen for broadcast messages. In order to allow this, the master supports a complicated beacon structure that it communicates to the slave at the time it parks it. However, the beacon structure may change, and the master then uses broadcast messages to communicate these changes to the parked slaves.
The structure of the beacon channel is covered in detail in other sources; it is sufficient to say here that once every beacon interval, the master transmits a train of beacons that the slave listens for in order to resynchronize to the channel.
As an application designer, you have to choose the correct beacon interval to save the maximum power whilst maintaining acceptable response times. Response times are governed by how long it takes a slave to request an unpark, or how long it takes the master to unpark a slave, both of which are affected by the park beacon interval.
One factor to consider when choosing the park beacon interval is the clock drift in the devices between successive beacons. If a parked slave loses synchronization, it will stop responding to the master and may lose the connection altogether. The master will then have to restore the connection by paging the slave and parking it again, which is obviously wasteful. Therefore, devices parked for the majority of their duty cycle should have their park beacon intervals set well within the maximum threshold, so that if the slave device misses a beacon it can resynchronize on the next one. So far, park mode sounds very similar to sniff mode. The main difference, however, is that in order to send data packets to a slave, that slave must first be unparked (also, as mentioned earlier, a slave cannot have an established SCO link while parked). The next section will consider the types of applications suitable for use with park mode.
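The drift argument can be quantified. The Bluetooth specification allows a sleeping device's low-power clock to drift by up to ±250 ppm, and since both master and slave clocks may drift in opposite directions, a parked slave must widen its receive window by roughly twice that fraction of the beacon interval. The sketch below works through the arithmetic; the interval values are just examples.

```python
SLEEP_CLOCK_PPM = 250e-6  # worst-case low-power clock accuracy per device

def listen_window_widening_ms(beacon_interval_ms):
    """Extra receive-window width (ms) a parked slave needs in order
    to be sure of catching a beacon after drifting for one full
    beacon interval, assuming both clocks drift in opposite
    directions (hence the factor of two)."""
    return 2 * SLEEP_CLOCK_PPM * beacon_interval_ms

# A 10-second beacon interval already costs a 5 ms wider window,
# which eats into the power saved by parking for so long:
widening = listen_window_widening_ms(10_000)
```

This is why very long beacon intervals show diminishing returns: the longer the slave sleeps, the longer it must keep its receiver open when it wakes, and the greater the risk of missing the beacon entirely.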
An application that has been described as being unsuitable for hold mode is one where a Bluetooth-enabled laptop is used for wireless Web browsing. However, the pattern of usage for such an application does make it particularly suitable for park mode. It consists of “bursts” of activity while the user is searching for a particular page, followed by a relatively long period of inactivity while they are reading that page. The slave device can therefore be parked for the majority of the time, while the radio link is not being used. However, when the user needs to send data (assuming the beacon interval is kept relatively short), the slave can be unparked quickly and the request dispatched. Thus, the application can save power whilst remaining responsive. Another advantage of having a short beacon interval is that the slave device has a greater chance of remaining synchronized with the master. As the case study that follows shows, the Headset profile recommends the use of park mode while the headset and Audio Gateway are not actively communicating. This is another good example of an application suited to park mode, since activity is concentrated in bursts, but the response times are bounded by a maximum tolerable latency.
A network of sensors (as discussed previously) is a good example of an application where park mode is not particularly suitable as a low power mode. This is mainly because in order for the sensors to send their data, they would have to be unparked, allowed to transmit, and then parked again. For very short beacon intervals, this is particularly wasteful due to the overhead of the park/unpark procedure. Furthermore, sniff mode perfectly fits the pattern of the application without imposing this extra overhead. This point illustrates quite nicely the conclusion that there is no preferred low power mode. Each of the Bluetooth low power modes is suited to a different class of applications and must be used accordingly in order to achieve optimal performance (in terms of both power consumption and usability).
As discussed earlier, the Bluetooth low power modes have different characteristics and are suited to different classes of applications. Each low power mode also has a different cost in terms of energy consumption. The power consumption of a device is influenced by the hardware used, the low power parameters negotiated, and the type of application it is running. This section will aim to give a very general indication of the relative power consumption characteristics of the Bluetooth low power modes. Absolute values for the average current consumption in each mode are meaningless since it is highly dependent on the underlying hardware. This section will therefore concentrate on the relative power consumption of some of the Bluetooth low power modes.
Figure 3.4 shows a comparison of the average current consumption of a device using different Bluetooth low power modes. Transmission of ACL data has the greatest power cost and will be used as a benchmark against which to compare the other modes. As can be seen, a device in sniff mode consumes more current than a parked device. It is also important to note that the interval used while in sniff or park mode also affects the device's power consumption. The shorter the sniff interval or park beacon used, the more current the device will consume, as it has to “wake up” more frequently in order to service that interval. Of course, the trade-off is that the shorter the interval, the lower the communication latency. As you can see, there is always a trade-off to be made between power consumption and latency.
A device must be in inquiry scan mode in order to be discoverable. Similarly, in order to be connectable, the device must be in page scan mode. Of course, both modes can also be enabled simultaneously. As can be seen from Figure 3.4, inquiry and page scan have a current consumption cost associated with them, and as such, should be used only when necessary. For example, if we only need the device to be connectable, then enabling inquiry scan will almost double the current consumption of the device but will not give it the functionality actually needed. Furthermore, as can be seen, the scan interval (denoted by i in the graph) and window (denoted by w in the graph) also have an effect on power consumption, so they should be chosen with care.
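The cost of scanning follows directly from the interval/window duty cycle. The sketch below uses the specification's default page scan parameters (an 11.25 ms window once every 1.28 s); the current figures are illustrative assumptions, not measurements of any real device.

```python
SCAN_INTERVAL_MS = 1280.0  # i: default page scan interval (1.28 s)
SCAN_WINDOW_MS = 11.25     # w: default page scan window
SCAN_MA = 40.0             # assumed receive current during the window (mA)
IDLE_MA = 0.1              # assumed current between scan windows (mA)

def scan_avg_ma(interval_ms=SCAN_INTERVAL_MS, window_ms=SCAN_WINDOW_MS):
    """Average current of one enabled scan, weighted by its duty cycle."""
    duty = window_ms / interval_ms
    return SCAN_MA * duty + IDLE_MA * (1 - duty)

page_only = scan_avg_ma()  # well under 1 mA with the default i and w
# Enabling inquiry scan as well adds a second window per interval,
# roughly doubling the scanning contribution to the average current.
```

Halving the interval (or widening the window) raises the duty cycle and the average current proportionally, which is why i and w should be chosen with care, as noted above.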
Although the graph in Figure 3.4 gives only a very approximate idea of the relative energy consumption costs of the different Bluetooth low power modes, it is easy to see that significant advantages can be gained by having an application use one or more of these modes.
This chapter has described the properties of power-managed applications and provided a discussion of why applications for Bluetooth-enabled devices can benefit from the use of power management. It has also detailed the different Bluetooth low power modes, illustrating the use of each one with example applications.
Power-managed applications allow the device to power down for a large part of its duty cycle, thus saving energy and prolonging its battery life. The drawback is that the response time of the application is increased and, if not used correctly, power management can make applications infuriatingly unresponsive. Done properly, the powering down of the underlying hardware should be completely transparent to the end user. Bluetooth provides a number of low power modes, and each one is suited to a different type of application. Before deciding on the power management mode to use, the maximum allowed latency and expected radio traffic pattern of the application must be considered. For applications with very low latency requirements, or those that need to transmit very frequently, it may even be inefficient to use a low power mode at all, due to the overhead incurred in entering and exiting it.
Bluetooth provides three low power modes for application designers to use: hold, sniff, and park. Each mode has different characteristics and is suitable for a different class of application. Hold mode is suitable for applications that can predict or control the time of their next data transmission. As each hold interval is negotiated independently of subsequent ones, this mode is suitable for adaptive power management, where the application monitors the usage of the link and increases or decreases its sleep time accordingly. Hold mode cannot be exited once entered and therefore should not be used for applications with hard latency requirements.
Sniff mode allows a Bluetooth-enabled device to save power by reducing the number of slots that the master can transmit in, thereby reducing the slots the slave must listen to. This mode is more flexible than hold mode as it can be exited at any time. The slave listens periodically for a number of slots and this makes sniff mode particularly suitable for use in applications where data regularly requires transmission. Applications that are not suitable for sniff mode are ones that frequently require large data transfers that force the device to remain awake beyond its sniff interval. This does not have a detrimental effect on the application’s performance, but it does not allow the device to achieve its full power saving potential either.
Park mode is the mode that allows greatest power savings to be made. This mode is best suited for applications where the radio traffic pattern is unpredictable and the connection establishment latency is bounded by some upper limit. The Headset profile (from the Bluetooth specification) is a good example of such an application. The RFCOMM link must be unparked as soon as possible, once a call needs to be put through from the Audio Gateway to the headset.
The Bluetooth low power modes differ in the power management support they provide, and there is therefore no single mode that is best to use. The mode chosen is determined by a wide range of factors dependent on the type of application and its requirements. When considering which Bluetooth low power mode an application should use, the key characteristics of each mode to bear in mind are:
Hold mode One-off event, allowing a device to be placed into hold mode for a negotiated period of time. Hold interval must be negotiated each time this mode is entered.
Sniff mode Slave periodically listens to the master and can power save for the remainder of the time. Important to note that data can be transferred while devices are in this mode and a SCO link may be active. Sniff intervals are negotiated once, before sniff is entered, and remain valid until sniff mode is exited.
Park mode Parked slave periodically synchronizes with the master and for the remainder of the time can power save. Data packets cannot be sent on a parked connection and the devices must be unparked before a SCO connection can be established. Furthermore, there cannot be an active SCO when its associated ACL is parked.
The following Frequently Asked Questions, answered by the authors of this book, are designed to both measure your understanding of the concepts presented in this chapter and to assist you with real-life implementation of these concepts. To have your questions about this chapter answered by the author, browse to www.syngress.com/solutions and click on the “Ask the Author” form.
Q: Why don’t low power modes work between devices with different Bluetooth versions?
A: Between versions 1.0b and 1.1, improvements were made to the link management protocol messages that put a device into hold, park, or sniff mode. These improvements made entering the low power modes much more reliable. However, because the protocol messages changed, devices implementing the old version of the protocol cannot interoperate with devices implementing the new version.
Q: Which versions of the Bluetooth specification are compatible for low power modes?
A: The changes in the link management protocol messages were first introduced as errata to the 1.0b specification. Changes that were required for interoperability with version 1.1 of the specification were labeled “critical errata.” So:
“1.0b plus critical errata” should be compatible with 1.1.
1.0b is not compatible with 1.1 or “1.0b plus critical errata.”
Any version should be compatible with the same version, but there have been interoperability problems with older versions, caused by ambiguity in the specification.
Q: What is the best power saving mode to use?
A: There is no “best” mode; it depends upon the requirements of your application. Look at the case studies in this chapter and consider the requirements of your particular application to decide which power saving mode is best for you.