Chapter 7

Search

Search is a fundamental mobile activity. Think about it—mobile is much less about creating stuff (unless you are talking about taking pictures or writing an occasional tweet). Instead, you use mobile devices mostly for finding stuff. Riffing on Douglas Adams’ The Hitchhiker’s Guide to the Galaxy, mobile devices help you find places to eat lunch, people to eat lunch with, and directions to get to the restaurant, which helps everyone get there sometime before the Universe ends. That makes search patterns important.

7.1 Pattern: Voice Search

A spoken query, captured via the on-board microphone, is used as search input instead of a typed keyword query. Typing on a phone is awkward and prone to errors, which makes audio input a great alternative to text.

How It Works

Usually, the searcher taps a microphone icon, causing the device to go into listening mode. The searcher speaks the query into the on-board microphone. The device listens for a pause in the audio stream, which the device interprets as the end of the query. At this point the audio input is captured and transcribed into a keyword query, which is used to run the search. The transcribed keyword query and search results are shown to the searcher.
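
One way to prototype this flow on Android is with the platform’s built-in RecognizerIntent API, which handles the listening UI, pause detection, and transcription for you. Here is a minimal sketch; runSearch() is a hypothetical app-specific method standing in for whatever search your app performs.

import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

public class SearchActivity extends Activity {
    private static final int VOICE_REQUEST = 1001;

    // Called when the searcher taps the microphone icon.
    private void startVoiceSearch() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_WEB_SEARCH);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak your query");
        startActivityForResult(intent, VOICE_REQUEST);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == VOICE_REQUEST && resultCode == RESULT_OK) {
            // The recognizer returns a list of transcriptions, best match first.
            ArrayList<String> matches =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (matches != null && !matches.isEmpty()) {
                runSearch(matches.get(0)); // Use the top transcription as the query.
            }
        }
        super.onActivityResult(requestCode, resultCode, data);
    }

    private void runSearch(String query) { /* app-specific search */ }
}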

Example

One of the most straightforward implementations of the Voice Search pattern is the standard text input box, augmented with a microphone icon, as exemplified in Google’s native Android search. (See Figure 7-1.)

Figure 7-1: Google’s native Voice Search in Android 4.0 is straightforward.


When and Where to Use It

Most apps that have a search box can also use the Voice Search pattern. For example, the Yelp app, as shown on the left in Figure 7-2, does not currently include the Voice Search feature, but it could easily be augmented with a microphone icon, as shown in the wireframe on the right.

Figure 7-2: Adding the Voice Search pattern to the Yelp app would be easy from the UI standpoint.


People often use Yelp while they’re walking around with a bunch of friends and talking about where to go next. In this case, simple voice entry augmentation makes perfect sense: Speak the query into the search box (which is quite a natural behavior as part of the human-to-human conversation already taking place) and share the results with your friends by showing them your phone. Then, after the group decision has been made, tap Directions, and use the map to navigate to the place of interest.

Why Use It

Most mobile search is done “on the go” and in context. Given how hard text is to enter on a typical mobile phone (and how generally error-prone such text entry is), voice input is an excellent alternative. Other important considerations for using the Voice Search pattern are multitasking activities such as driving. Driving is an ideal activity for voice input because the environment is fairly quiet (unless you are driving a convertible) and the driver’s attention is focused on a different task, so traditional text entry can be qualified, to put it mildly, as “generally undesirable.”

Other Uses

The release of Siri for the iPhone 4S kicked into high gear a long-standing race to create an all-in-one voice-activated virtual assistant. Prior to Siri, Google had long been leading the race with Google Search: the federated search app that searched across the phone’s apps, contacts, and the web at large. Vlingo and many other apps took the Voice Search pattern a step further by offering voice recognition features that enabled the customer to send text messages or e-mails and do other tasks by simply speaking the task into the phone. However, none of these apps has come close to the importance and popularity of Siri. Why? There are many reasons, including the mature interactive talk-back feature in Siri that enables voice-driven question-and-answer interactivity, including the amazing capability to handle x-rated and gray-area questions with consistent poise and humor, as shown in Figure 7-3 (in other words, Siri has something of a personality). Another important feature was dedicated hardware access to Siri (on the iPhone 4S you push and hold the Home button to talk to Siri), which enabled one-touch interaction with a virtual assistant without having to unlock the phone.

Figure 7-3: Siri responds to a Voice Search query: “I need to hide a body.”


Although it’s pure speculation at this point, one application of Google’s voice recognition technology could be the same sort of virtual assistant for your phone or tablet, activated by pressing (or holding) one of the hardware buttons (the Home button would be a good choice). Added security could be achieved via voice-print pattern recognition. Voice recognition technology would also help distinguish your voice patterns from those of other people in loud, crowded places, thereby further increasing the personalization of the device and making it completely indispensable (if that is even possible at this point!).

If this becomes the case, dedicated in-app Voice Search (refer to Yelp in Figure 7-2) could be completely superseded by the Google virtual assistant. For example, the customer could say, “Assistant: search Yelp for xyz.” The assistant program would then translate the voice query into keywords using advanced personalized voice recognition, open the Yelp app, populate the search box with the keyword query, and execute the search.

In some Google Search apps, the simple action of bringing the phone to your ear forces the app into a listening mode by using input from the on-board accelerometer to recognize this distinctive hand gesture. Unfortunately, this feature does not seem to be automatically enabled on Android 4.0 as of this writing. It is, however, an excellent feature and one that should come included with the voice recognition because it makes use of what we already do naturally and without thinking, so the design “dissolves in behavior.”

The role of voice input is not limited to search. It can be used for data entry and basic tasks as well. For example, while driving you could push the button and say, “Text XYZ to James,” and the device would obey. I should also mention that Google is not the only supplier of voice recognition technology. For example, Nuance Communications, the maker of the Dragon NaturallySpeaking products, is likely the largest and most vocal (pardon the pun) distributor of speech-recognition software. As of this writing, the Target app uses technology licensed from Nuance for its voice recognition feature.

Pet Shop Application

Just as in the earlier Yelp example, you can use voice recognition to search for a specific pet. The customer would launch the Pet Shop app and then swing the phone up to his ear and speak a search query, such as “black lab.” When the customer has a pause in speech, pushes the Done button, or simply swings the phone down, the query activates and displays the appropriate search results.

Tablet Apps

For Voice Search, tablets are different from phones. Although there is some debate about this (and no official studies have yet been performed), anecdotal evidence points to typing on the tablet being not quite as challenging as it is on the phone. Thus, on a tablet, voice input is likely to be the more error-prone option. While using a tablet, the person is also less likely to be multitasking in a loud environment or to be engaged in an activity that requires attention outside the visual interface of the device (driving, for example)—most tablet use happens at home or work. Does this mean Voice Search is not useful on the tablet? Not at all. There still exists an opportunity for high-end, high-touch, visual interaction with a virtual assistant software program. Apple’s original vision for the tablet device, the Knowledge Navigator, created in 1987 (sorry, Google, you were not yet born at that time), involved exactly that kind of speech recognition interaction with the device.

The best way to implement a high-end, personalized virtual assistant might be to create a hybrid of software plus a human virtual assistant. The person using the tablet would get high-end service with a consistent, pleasing visual and auditory representation. Given Google’s reputation for awesome inventive geekiness, highly customized animated Obi-Wan, Jarvis, and HAL virtual assistants (as well as various Playboy models, anime characters, and maybe a little something for the millions of John Norman fans) complete with high-end graphics and voice simulations might be coming soon to the Android tablet near you. Perhaps this book can serve as an inspiration?

Caution

Voice recognition is still a fairly new technology, and despite the apparent similarity of the interface, there are many important considerations and ways to get this pattern wrong:

  • Don’t forget the headset: Some users of the technology will be on a Bluetooth or wired headset. Ideally, Voice Search can be activated by using the buttons on the headset, without having to touch the phone. For example, with Apple’s Siri: “When you’re using headphones with a remote and microphone, you can press and hold the center button to talk to Siri. With a Bluetooth headset, press and hold the call button to bring up Siri.” (See http://www.apple.com/iphone/features/siri-faq.html.) Similar convenience features are conspicuously absent from the Android 4 interface for the simple reason that the headsets from various manufacturers lack consistency in the hardware configuration. (In other words, there is no “center button.”) As mentioned earlier, this needs to change. Convenience is the key for Voice Search on Android to be a contender.
  • It’s not “Done” until the fat finger sings: The Android 4 implementation of Google’s native search (refer to Figure 7-1) waits for you to stop talking before accepting the voice query. This works most of the time, but it can be a serious problem in loud environments, in which the interface fails to stop and keeps listening for almost a full minute! Always remember to provide a Done button to stop the input. One of the best implementations is to make the microphone icon act as a Done button. Of course, it must also look “tappable.” (A minimal sketch of this approach appears after this list.)
  • Extremely loud and incredibly personal: In loud environments in which other people are talking, it’s hard to parse the owner’s voice from the background of other conversations. Fortunately, a voice imprint is as unique as your fingerprints, and with some “training” the user’s unique vocal patterns can be parsed out of the background conversations in the crowd. Note, however, that voice imprints also raise significant privacy issues.
  • Full-circle audio experience: The Driving Mode paradigm that exists in certain older Android phones is an antipattern. There is an excellent reason no one uses the vi editor for coding Java or writing books. As Alan Cooper so eloquently stated in About Face (Wiley, 2007), switching modes is tedious and error-prone, not to mention downright dangerous while driving. The only reason Airplane mode works is that a nice flight attendant tells us it is time to turn it on. For all other applications, the system simply must make an effort to match the output mode to the user-selected input mode. For example, if Yelp were asked for directions to a museum using voice input, chances are the customer is doing it while driving (the app can also detect automatically when someone is moving at car speed by using the on-board GPS). This means that the output directions should also be available using voice. Ideally, Yelp should read step-by-step driving directions out loud if the customer asks for them via a simple voice command such as Tell Me or Give Me Directions. This completes the full-circle Voice Search experience. As of this writing, Apple’s Siri has made inroads into integrated audio directions—a feature Android sorely needs to compete in the Voice Search space. Of course, this also works great for folks with disabilities.
  • Watch out for the uncanny valley: Uncanny valley (http://en.wikipedia.org/wiki/Uncanny_valley) is a term coined by Masahiro Mori to describe the strong revulsion people feel for robots that are almost, but not quite, human in appearance. One of the consequences of the uncanny valley is that more realism can lead to less positive reactions. As virtual assistants improve, and especially when they acquire visual appearance as well as voice, you need to make sure the purely digital entities look decidedly less than human. The uncanny valley might turn out to be an especially dangerous place for high-end hybrid human-software assistants. In their brilliant book Make It So (Rosenfeld Media, 2012), Nathan Shedroff and Christopher Noessel mention another issue: Authentically human-appearing digital assistants increase expectations of near-human capabilities, which sharply increases the person’s level of annoyance when the assistant screws up the task or fails to understand the person correctly. They suggest an elegant way of solving this problem by making the digital assistant into a talking pet: Although most owners consider their dogs intelligent, a talking dog would engender both lower expectations and a higher wow factor, while avoiding the uncanny valley entirely. Here’s my own recommendation: Using mid- and low-fidelity animation for pure or hybrid digital assistants (think animated Obi-Wan Kenobi from the Clone Wars series or the LEGO Star Wars games, instead of video of a live actor) should likewise be helpful in achieving the same goals.
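
To make the microphone icon double as a Done button (as recommended in the second bullet above), one possible sketch uses the lower-level SpeechRecognizer API, which lets the app end the listening session explicitly. Here, micButton and the recognition-listener wiring are assumed app-specific.

import android.content.Intent;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import android.view.View;

// Inside the Activity's setup code. Requires the RECORD_AUDIO permission.
final SpeechRecognizer recognizer = SpeechRecognizer.createSpeechRecognizer(this);
final Intent listenIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
        .putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_WEB_SEARCH);

micButton.setOnClickListener(new View.OnClickListener() {
    private boolean listening; // Tracks whether a recognition session is active.

    @Override
    public void onClick(View v) {
        if (listening) {
            recognizer.stopListening(); // Mic icon acts as the Done button:
            listening = false;          // end input now, don't wait for silence.
        } else {
            recognizer.startListening(listenIntent);
            listening = true;
        }
    }
});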

Related Patterns

13.5 Pattern: Watermark

7.2 Pattern: Auto-Complete and Auto-Suggest

Auto-Complete and its sister pattern, Auto-Suggest, are broad classifications of keyword-entry helper patterns. Both reduce the number of characters the person needs to type, cut down on entry errors, and help avoid queries that produce too many or too few results.

How It Works

When the person enters one or more characters into the search field, the system shows an additional “suggestions layer” that contains one or more possible keyword combinations that in some way correspond to what the person has entered. At any point, the person has the option to keep typing or select one of the system suggestions.
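
On Android, a basic suggestions layer comes almost for free with the AutoCompleteTextView widget. Below is a minimal sketch with a static suggestion list; a production app would typically back the adapter with a database or network source instead.

import android.app.Activity;
import android.os.Bundle;
import android.widget.ArrayAdapter;
import android.widget.AutoCompleteTextView;

public class SuggestActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        AutoCompleteTextView searchBox = new AutoCompleteTextView(this);
        String[] suggestions = {"mastiff", "english mastiff", "neapolitan mastiff"};
        searchBox.setAdapter(new ArrayAdapter<String>(
                this, android.R.layout.simple_dropdown_item_1line, suggestions));
        searchBox.setThreshold(1); // Show the suggestions layer after one character.
        setContentView(searchBox);
    }
}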

Strictly speaking, Auto-Complete uses the part of the query the person has typed as a seed for its suggestions (so that the suggestions include the original keyword or fragment). This does not always work perfectly on a mobile device because a small fragment often contains fat-fingered misspellings. That’s where Auto-Suggest comes in.

Auto-Suggest has more “freedom of movement” than Auto-Complete, providing keywords and queries that include

  • Spelling corrections
  • Controlled vocabulary keyword substitutions
  • Synonyms of what the person originally typed in, query expansions, and so on

The suggestions work best when they are a clever combination of Auto-Suggest and Auto-Complete, with the system drawing the best ideas from multiple sources.

Example

Google Android search is a great example of the combined pattern, splitting the suggestions layer into two sections: first providing three auto-complete ideas and then auto-suggesting some contacts and apps that can be found on the phone (see Figure 7-4).

Figure 7-4: The Auto-Complete and Auto-Suggest patterns work in tandem in Android 4.0 Native Search.


When and Where to Use It

Any time there is a keyword query entry box, Auto-Suggest and Auto-Complete are both great patterns to implement. As search expert Marti Hearst reports in her book Search User Interfaces (Cambridge University Press, 2009), these features generally rate well on usability and work well with other user interface (UI) patterns.

Why Use It

For most people, typing—especially on the mobile device—is tedious and prone to errors. Generally, the less typing you do on the phone, the better. Therefore, any UX pattern that can assist a person in entering information is a big win.

Auto-Complete and Auto-Suggest help reduce errors and increase satisfaction in multiple ways:

  • Reduce spelling errors: By reducing the total keys that need to be pressed, the system reduces errors associated with simply fat-fingering incorrect keys.
  • Improve query specificity: If the suggestion includes more keywords than the user was originally planning to enter, the customer often picks the more specific query, gaining the inspiration to enter “Nike Shoes” instead of just “Nike,” which ultimately makes her happier.
  • Reduce zero results: Often queries are spelled correctly but include incorrect or conflicting keywords, which produce bad results or no results. Giving the person appropriate suggestions before she finishes entering the entire query often allows the system to forestall zero-results conditions before they occur. If the person starts typing Harry and the system displays a suggestion “Harry Potter and the Deathly Hallows,” the person is less likely to paint herself into a corner with an incorrect query such as “Harry Potter and the Sleepy Hollows.”

Other Uses

Auto-Complete and Auto-Suggest can draw from many other resources to improve the quality of the suggestions:

  • Local: Mobile phone use cases are unique, because these devices are used literally “on the move.” Thus auto-suggestions must take into account local results (obtained via on-board GPS or wireless signal triangulation) whenever possible. For example, depending on the app and use case, a query such as “Coffee” could easily include auto-suggestions such as “French Roast Coffee” for online purchases and one or two nearby coffee shops.
  • History: The Auto-Suggest pattern need not always make use of the Internet connection. One of the most important mobile User Experience (UX) patterns is Re-engagement, which means picking up the previous task after an interruption of some sort (phone call, text, needing to find directions, and so on). Therefore, one of the most important functions of Auto-Suggest is to recall previous queries (History), which can be stored locally on the device using the on-board app databases (refer to the Browse and History patterns in Chapter 6). A minimal sketch of storing recent queries locally appears after this list.
  • Voice Search: At the time of this writing, voice queries typically do not produce auto-suggestions. This is surprising because voice recognition is often less accurate than even fat-fingered typing. Voice query Auto-Suggest is one interesting application to look into, particularly for nondriving applications in which the system needs only to display the suggestions on the screen rather than read them back to the user.
  • Jump into other apps: One common use case, especially with heavily networked apps, is the need to open another app to accomplish a task. Auto-Suggest can help by providing a one-touch solution that shortens the task considerably. One example could be a “gas station” query on Yelp. One auto-suggestion could be “Directions to the nearest gas station”—a highly relevant use case when you are about to run out of gas. Hitting this auto-suggestion would jump directly into the Google Maps app to display directions. This kind of suggestion is also highly relevant to Voice Search use cases because it could provide a full-circle voice interaction, with the system reading off directions to complete the experience. Other ideas could include providing suggestions that jump directly into the MP3 player if the query is “Like a Rolling Stone,” or into a book reader to see a sample of War and Peace. Note that including an icon of the target app somewhere in the auto-suggestion row helps the customer understand what will happen when an auto-suggestion jumps directly into another app.
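
For the History bullet above, Android ships a standard recent-query store. A minimal sketch follows; the provider class name and authority are hypothetical, and the provider must also be declared in the app’s manifest.

import android.content.SearchRecentSuggestionsProvider;

// Standard Android recent-query provider: stores previous searches locally.
public class QueryHistoryProvider extends SearchRecentSuggestionsProvider {
    public static final String AUTHORITY = "com.example.petshop.QueryHistoryProvider";
    public static final int MODE = DATABASE_MODE_QUERIES;

    public QueryHistoryProvider() {
        setupSuggestions(AUTHORITY, MODE);
    }
}

// Then, after each search, save the query so Auto-Suggest can recall it offline:
// new android.provider.SearchRecentSuggestions(context,
//         QueryHistoryProvider.AUTHORITY, QueryHistoryProvider.MODE)
//         .saveRecentQuery(query, null);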

Pet Shop Application

With the variety of common names for dog breeds (and the difficulty of spelling them), it’s easy to envision a useful combination of the Auto-Suggest and Auto-Complete layers for the Pet Shop app, as shown in Figure 7-5.

Figure 7-5: This wireframe is for a useful combination of Auto-Complete and Auto-Suggest in the Pet Shop App.


In this simple example, the person types in Mas, and the suggestions layer presents the Auto-Complete options Massive and Mastiff as possible query completions, thereby forestalling the common misspelling Mastif, which would have likely resulted in zero results. In the same suggestions layer, Auto-Suggest also kicks in with English Mastiff, Neapolitan Mastiff, and an interesting keyword variation Bullmastiff, a popular Mastiff breed that the person may not have thought of using as a query.

Mastiff is also a generally accepted synonym for a query “large guard dog,” so the auto-suggest layer can expand the original query by suggesting a category Guard Dogs, which can expand into a number of related breeds the person might not have thought of originally, such as Doberman, Rottweiler, American Bulldog, and so on. Both Auto-Suggest and Auto-Complete automatically scope the suggestions using a controlled vocabulary with a preset list of recommended search terms that match common tasks the app supports.

Tablet Apps

Tablet auto-suggestions represent a different use case from auto-suggestions on mobile devices. In principle, large tablets do support mobile activities; in practice, the mobility pattern for a typical consumer large tablet device is found in the area between the refrigerator and the couch, as user researcher Marijke Rijsberman explains in her perspective “A Fine Line: The iPad As a Portable Device,” which is in my first book Designing Search (Wiley, 2011). Simply put, it is more common for large tablets to be used as casual, “lean back” devices.

Typing on large tablets is easier and less error prone, so they are closer to desktops and can use the same auto-suggestion database as a desktop web application. Also, one-tap auto-suggestions that jump directly to a different app are not as important on large tablets as they are on mobile devices because people on tablets are typically not in as much of a hurry and are less likely to mind a few additional taps, as long as it’s clear that they are progressing toward their goal. Also, local results are generally not as important as they are on mobile devices; however, they should definitely be included.

Note that this does not necessarily apply to mid-size 7-inch tablets and note-tablet hybrids (refer to Chapter 3, “Android Fragmentation”). These smaller tablet devices are at once more mobile and harder to type on than their large counterparts. For the purposes of this pattern, these smaller tablet devices can be treated as mobile phones, and you should design for them accordingly.

Finally, another consideration is the interface element. In mobile devices, the auto-suggestions layer often occupies the entire page, whereas on a tablet auto-suggestions are presented in a popover layer occupying only a small part of the screen. (For more on tablet design patterns, see Chapter 14, “Tablet Patterns.”)

Caution

If you do provide a custom auto-suggest layer (which is highly recommended), remember to turn off the device’s built-in auto-suggest feature.

Remember that mobile phones are a different class of device. They may require a completely different auto-suggest approach (one such mobile-only approach is described in the next pattern, 7.3, “Tap-Ahead”). Mobile auto-suggestions are prioritized differently because they are meant to respond to different needs. Mobile devices need to give higher weight to auto-suggestions based on the on-board sensors that are only available on mobile devices. For example, local auto-suggestions, previous mobile search history, and category browsing (for example, Guard Dogs, as described in the Pet Shop example) need to be higher on the list than typical desktop web auto-suggest options, which are mainly controlled vocabulary substitutions.

People misspell things differently on the desktop web and on tablets with full keyboards than on smaller mobile devices. Mobile misspellings mainly arise from fat-fingering, not from common spelling misconceptions. Ideally, this dictates using and maintaining a separate database of mobile auto-corrections that takes the unique nature of mobile keyboards into account.

Related Patterns

7.3 Pattern: Tap-Ahead
7.1 Pattern: Voice Search

7.3 Pattern: Tap-Ahead

Tap-Ahead implements auto-suggest one word at a time, through step-wise refinement, creating a kind of keyword browsing.

How It Works

Instead of trying to guess at the outset the entire query the customer intends to type and offering the best one-shot replacement the way the desktop web does, Tap-Ahead guides the auto-suggest interface through the guessing process one phrase or keyword at a time.

This is how it works: When the searcher enters a few characters, the auto-suggest function offers a few query suggestions. At this point the searcher has two choices:

  • Tap the query if it is a sufficiently good match for what she is looking for.
  • Tap the diagonal arrow on the right side of the screen to populate the search box with the query keywords, and execute the auto-suggest function again.

By giving the searcher the ability to “build” the query instead of typing it, the interface offers a much more natural, flexible, and robust auto-suggest method that’s optimized to solve low bandwidth and fat-finger issues people experience on mobile devices. Using the Tap-Ahead interface, customers can quickly access thousands of popular search term combinations by typing just a few initial characters.
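
There is no built-in Tap-Ahead widget, so here is a hypothetical sketch of the two tap targets in a suggestion row. The bindRow() method would be called from a suggestion adapter’s getView(); runSearch() and fetchSuggestions() are assumed app-specific methods.

import android.view.View;
import android.widget.EditText;
import android.widget.ImageButton;

// Wires up one suggestion row: the row itself runs the search, while the
// diagonal arrow only builds the query and re-triggers the suggestion lookup.
void bindRow(View rowView, ImageButton arrowButton, final String suggestion,
             final EditText searchBox) {
    rowView.setOnClickListener(new View.OnClickListener() {
        @Override public void onClick(View v) {
            runSearch(suggestion); // Tapping the row executes the query.
        }
    });
    arrowButton.setOnClickListener(new View.OnClickListener() {
        @Override public void onClick(View v) {
            searchBox.setText(suggestion + " ");        // Populate, don't search.
            searchBox.setSelection(searchBox.length()); // Move cursor to the end.
            fetchSuggestions(searchBox.getText().toString()); // Next suggestions.
        }
    });
}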

Example

An excellent example of this pattern is the Android native search (see Figure 7-6). As you can see from the following example, the Tap-Ahead pattern offers an excellent alternative to typing longer multi-keyword queries.

Figure 7-6: The Android 4.0 Native Search includes an implementation of the Tap-Ahead pattern.


In this case, by tapping the diagonal Tap-Ahead arrow, the searcher could enter a complex query “Harry Potter spells app” by typing only four initial characters (harr) and tapping the diagonal arrow two times. The traditional one-shot auto-suggest interface is unlikely to be able to offer this entire fairly unusual phrase as an auto-suggestion, so the customer is likely to have to type most, if not all, of the 23 characters of the query Harry Potter Spells app.

When and Where to Use It

Use the Tap-Ahead pattern anywhere auto-suggest is used outside a one-shot controlled vocabulary auto-suggestion, and wherever longer, multistep, multi-keyword queries offer an advantage and create a better set of results.

Why Use It

In contrast to desktop web search, auto-suggest on mobile devices is subject to two unique limitations: It’s harder to type on a mobile device, and signal strength is unreliable. Tap-Ahead solves both issues in an elegant, minimalist, and authentically mobile way. Tap-Ahead enables the mobile auto-suggest interface to maintain flow and achieve a speed and responsiveness on tiny screens that is simply not possible with the traditional one-shot auto-suggestion interface.

Is there evidence of this? The author’s field research shows that in mobile environments people often select search suggestions they do not need, just to save typing a few characters. (Read more about this in “Mobile Auto-Suggest on Steroids: Tap-Ahead Design Pattern,” Smashing Magazine, April 27, 2011, http://www.smashingmagazine.com/2011/04/27/tap-ahead-design-pattern-mobile-auto-suggest-on-steroids/.) Tap-Ahead effectively resolves this issue.

Other Uses

In the few years that the Android platform has been around, its keyword suggestions have evolved from being an exact match to Google’s web suggestions to being their own mobile-specific set. Yet you can do even better in your own app by using a simple trick: Offer Tap-Ahead one keyword at a time.

The advantage of the one-word-at-a-time Tap-Ahead refinement interface is that the refinement keywords can be loaded asynchronously for each of the 10 auto-suggestions while the customer makes the selection of the first keyword. Given that most queries are between two and three keywords long, and each successive auto-suggest layer offers 10 additional keyword suggestions, Tap-Ahead with step-wise refinement enables customers to reach between 100 (10 × 10) and 1,000 (10 × 10 × 10) of the top keywords by typing only a few initial characters.
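
A sketch of that asynchronous loading might look like the following, assuming a hypothetical fetchNextKeywords() call to the suggestion service and an in-memory cache: while the customer reads the current suggestions, the next level of one-word refinements is fetched in the background.

import android.os.AsyncTask;
import java.util.List;
import java.util.Map;

// Prefetches the next level of one-word refinements for each visible suggestion.
// Pass a thread-safe map (for example, ConcurrentHashMap) as the cache.
class PrefetchTask extends AsyncTask<String, Void, Void> {
    private final Map<String, List<String>> cache;

    PrefetchTask(Map<String, List<String>> cache) {
        this.cache = cache;
    }

    @Override
    protected Void doInBackground(String... suggestions) {
        for (String s : suggestions) {
            // fetchNextKeywords() is a hypothetical call to the suggestion service.
            cache.put(s, fetchNextKeywords(s));
        }
        return null;
    }

    private List<String> fetchNextKeywords(String seed) {
        return java.util.Collections.emptyList(); // Stub for this sketch.
    }
}

// Usage while the current suggestion list is displayed:
// new PrefetchTask(cache).execute("harry potter", "harry connick", "harrods");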

Anecdotally, although Tap-Ahead is useful, few people have discovered its power to cut through the tediousness and fat-finger mistakes associated with typing. By offering keywords one at a time, the interface is optimized for the Tap-Ahead pattern, so discovery should increase, thereby also increasing satisfaction. Tap-Ahead one word at a time is an excellent variation of Tap-Ahead for e-commerce apps.

Pet Shop Application

It’s easy to imagine Tap-Ahead being useful for entering complex keyword queries. However, it’s not as important with dog breeds, for example, which form a controlled vocabulary. There is scant advantage in providing a Tap-Ahead expansion from Mas to Mastiff to Neapolitan Mastiff because there are not many queries that start with Mastiff. Instead, a simple, traditional one-shot controlled vocabulary auto-suggestion (Mas directly to Neapolitan Mastiff) is a more useful approach because it allows the user to pick up not only standard keyword queries such as English Mastiff and Neapolitan Mastiff but also an interesting keyword variation, Bullmastiff, and the category expansion Guard Dogs (see the “7.2 Pattern: Auto-Complete and Auto-Suggest” section).

Tablet Apps

The owners of large tablets are generally more willing to type a longer query, and low bandwidth is usually less of a problem for them (many tablets are used with Wi-Fi only). Nevertheless, Tap-Ahead is no less useful on tablets, where less work is perceived as a good thing and tapping a suggestion is as easy as tapping the next character on the touch keyboard. There is also early evidence that tablet queries are slightly longer, which also speaks in favor of keyword browsing.

Caution

The best auto-suggestions on a mobile device come from a database that’s different and distinct from the web auto-suggestions database. This is especially true for Tap-Ahead implemented one keyword at a time. Maintaining a separate database is extra work—but that’s how important this function is to creating an excellent search experience!

At this point it’s not clear who, if anyone, holds a patent on this functionality. Google began using it first in its general device search and in the Google App for iPhone, although it is not used for single-keyword browsing as of this writing. Microsoft and Apple are both likely actively pursuing similar patents.

Related Patterns

7.2 Pattern: Auto-Suggest and Auto-Complete

7.4 Pattern: Pull to Refresh

Search results are refreshed when the customer swipes down (pulls down) on the results. Slick and convenient, this is a great pattern to refresh results that update frequently.

How It Works

The customer is presented with a long list of updates, typically sorted by Time: Most Recent First. The customer typically reviews the list of updates starting at the top, reading the most recent messages first. When the customer wants to load newer updates, he pulls down on the results list, extending the scrolling gesture past the top of the list. Typically, a watermark appears that lets the customer know that when he pulls down and then releases the list, it will update. The system then issues an update call, which is reflected by a visible progress indicator, followed by loading of the updated results.
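
The Android support library now ships a SwipeRefreshLayout widget that packages this behavior (it arrived after Twitter popularized the pattern). A minimal sketch follows; the layout IDs and loadNewerUpdates() are hypothetical.

import android.app.Activity;
import android.os.Bundle;
import android.support.v4.widget.SwipeRefreshLayout;

public class StreamActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // R.layout.stream wraps the updates ListView in a SwipeRefreshLayout.
        setContentView(R.layout.stream);
        final SwipeRefreshLayout swipe =
                (SwipeRefreshLayout) findViewById(R.id.swipe_container);
        swipe.setOnRefreshListener(new SwipeRefreshLayout.OnRefreshListener() {
            @Override
            public void onRefresh() {
                // Fetch items newer than the current top item. In a real app
                // the fetch is asynchronous, and setRefreshing(false) is called
                // from its completion callback to hide the progress indicator.
                loadNewerUpdates();
                swipe.setRefreshing(false);
            }
        });
    }

    private void loadNewerUpdates() { /* app-specific fetch */ }
}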

Example

A great example of this pattern is the original application that helped popularize it: the Twitter mobile app. (See Figure 7-7.)

Figure 7-7: The Pull to Refresh pattern was popularized in the Twitter app.


When and Where to Use It

Use Pull to Refresh for long lists of search results or updates sorted by Time: Most Recent First. This pattern is especially useful for social update streams, active inboxes, and other long lists that update frequently.

Why Use It

The Pull to Refresh pattern uses a gesture instead of a button, which is always an excellent idea if you can communicate the needed gesture in an obvious and unobtrusive way. For Pull to Refresh, the gesture needed is the one the customer already uses to scroll the results up, so the call to action naturally “dissolves in behavior.”

When the customer first loads the results, he typically engages with the list by scanning or reading the newest updates or search results first, starting at the top, and scrolling down the list to read or scan more. When the customer reads far enough down and wants fresher results, he naturally scrolls to the top and keeps scrolling until he reaches the top of the currently loaded results and scrolls past the top of the list. At that point he sees the watermark telling him what to do to load the newest results. This often happens naturally and in the state of flow, when the customer flicks rapidly to scroll the results quickly.

One other point makes this pattern feel natural. The action to pull down on the list “pulls” new data from the server, which is an excellent fit to the customer’s existing mental model. This is a fine example of using unique capabilities of mobile and tablet touch devices to expand on the desktop web model of buttons and links.

Other Uses

Most applications of this pattern deal with search results or updates sorted by Time: Most Recent First. Another possible application might be triggered by traversing space instead of time. For example, if your customer looks for points of interest around him as he moves through a city, he has a different set of attractions within walking distance as he moves. Depending on the specific goal of the interaction, you can use the Pull to Refresh pattern to show the search results list sorted by Distance: Nearest First. This way, as the customer moves through the city, he can have an updated list of points of interest around him with a flick of a finger.

Pet Shop Application

One possible way to use Pull to Refresh in the Pet Shop app is to show updates of lost pets. If your pet is lost, for example, you can stay on top of the search with the updates page that tracks found pets in your neighborhood by periodically pulling to refresh the list. However, forcing the customer to do this may be a stressful activity if the list keeps coming up empty or static. If the list is mostly static, consider instead using some sort of a push alert (an alert that is loaded on the device and shown automatically, as opposed to being triggered by some action the customer explicitly needs to take) that notifies the customer when a new pet is found. To create a push alert, polling technology is frequently used, but from the standpoint of the customer, the alert is being “pushed” to him.

Tablet Apps

Pull to Refresh works just as well on medium- and large-size tablets as it does on mobile phones. The vertical space needed to communicate Pull to Refresh should grow proportionally to the size of the device and the extent of the gesture needed to scroll the results. Larger tablets require longer, more sweeping gestures with which to execute the “pull.”

Caution

Although it’s tempting to use it due to the pattern’s sheer coolness, Pull to Refresh is not recommended for the majority of search results, which deal with mostly static content. It is simply not satisfying to execute a pull and release and get the same data, and the watermark on the top of the list becomes chartjunk—a useless distraction. Pull to Refresh is also contraindicated for lists sorted in ways that do not lend themselves to rapidly updated content, such as Best Match, Price, and so on.

Here’s another thing to keep in mind: The Pull to Refresh pattern is patented. That’s right; Twitter currently holds the patent on this design. Although it’s unlikely that Twitter would go after anyone other than a direct competitor using this pattern, it’s an important caveat to keep in mind if you plan to use it in your app.

Related Patterns

None

7.5 Pattern: Search from Menu

Search is an option that can be accessed from the navigation bar menu.

How It Works

To search, the user must tap the menu button in the phone’s navigation bar (which also houses the Back, Home, and Recents buttons) and then select the Search option. After Search has been tapped, the resulting page may show one or more of the following: saved searches, search refinement options, popular searches, nearby locations, and so on.
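
In code, this is the classic Android options menu. A minimal sketch follows, where SearchActivity is a hypothetical dedicated search page.

import android.app.Activity;
import android.content.Intent;
import android.view.Menu;
import android.view.MenuItem;

public class CatalogActivity extends Activity {
    private static final int MENU_SEARCH = Menu.FIRST;

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // The Search option appears when the user taps the menu button.
        menu.add(Menu.NONE, MENU_SEARCH, Menu.NONE, "Search")
            .setIcon(android.R.drawable.ic_menu_search);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        if (item.getItemId() == MENU_SEARCH) {
            startActivity(new Intent(this, SearchActivity.class));
            return true;
        }
        return super.onOptionsItemSelected(item);
    }
}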

Example

In the Amazon app (see Figure 7-8), the customer accesses the search feature by tapping the magnifying glass in the menu located in the navigation bar.

Figure 7-8: The Amazon app uses the Search from Menu pattern.


The resulting Search page shows the previous query and a list of alternative query entry mechanisms, in this case a picture or a barcode that the customer can scan with an on-board camera. The menu is opened from the phone’s navigation bar, which has been dynamically modified to add the app menu function.

When and Where to Use It

Despite being used by some of today’s leading apps, this pattern is now largely deprecated. Most of the native Google apps in Android 4.0 have a dedicated Search button on the app’s action bar or in the overflow menu (see the “7.6 Pattern: Search from Action Bar” section later in this chapter). Search from Menu is a transitional pattern that can still be used for a short time (or at least until the Android 4.0 Police show up) as a way to bridge apps in older Android versions with those in Android 4.0.

Why Use It

This is a popular pattern descended from older Android OS implementations, which recommended that the app’s menu button always be present in the device’s navigation bar. This handy pattern enables designers to hide search, along with most of the rest of the navigation, in the navigation bar menu, which often eliminates the need for an additional action bar. This provides the advantage of a simple interface and “taller” vertical space so that more of the screen is devoted to products or content.

Other Uses

Some older Android implementations, most notably those on the Motorola and LG hardware, provide a special dedicated hardware accelerator button for search. Tapping this button is the equivalent of tapping the menu button in the navigation bar and selecting Search from that menu.

This dedicated Search button has been removed from the latest hardware designed to run Android 4.0. You can speculate as to what this means long term, but in the immediate Android future, Search from Menu and Search from Action Bar search design patterns appear to take precedence over the dedicated hardware button.

Pet Shop Application

In implementing this pattern with the Pet Shop app, there are two options for what to put on the Search page. One option is to provide alternative input methods (refer to the Amazon app shown in Figure 7-8). Other popular options include previous searches and search refinements, such as filtering or sorting. Figure 7-9 shows previous searches.

Figure 7-9: See the Search from Menu with previous searches in the Pet Shop app.


When showing alternative query entry mechanisms such as barcode scan, picture, voice, NFC, and so on, recent previous searches can be shown as a grouped button (Recent Searches), although this is generally less effective than actually listing previous queries in the list. Whatever strategy you decide to use, be sure to highlight (select) the current query as shown, or provide an X or Clear button for the searcher so that starting a new search is easy.

Tablet Apps

Tablets do not generally need to use this pattern because there is plenty of room to install a dedicated search box or use the Search from Action Bar pattern instead.

Also, the Search from Menu pattern is ergonomically inferior to most other tablet patterns of search implementation because the menu button moves around constantly. In portrait mode the tablet’s navigation bar is on the bottom of the device, which makes it generally awkward to access a menu from a normal tablet viewing position. (Read more about ergonomics in Chapter 3, “Android Fragmentation”.)

Caution

In addition to this pattern being deprecated in Android 4.0, using Search from Menu can lead to an awkward separation of the keyword query from the refinement tools. See the “7.9 Antipattern: Separate Search and Refinement” section.

Related Patterns

7.6 Pattern: Search from Action Bar
8.4 Pattern: Parallel Architecture
7.9 Antipattern: Separate Search and Refinement

7.6 Pattern: Search from Action Bar

The customer can access search via a dedicated button on the app’s action bar.

How It Works

The Search button (usually styled as a standard Android magnifying glass icon) is shown on the top or bottom action bar. After the user taps Search, the resulting page shows one or more of the following: saved searches, search refinement options, popular searches, nearby locations, and so on.
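
A minimal sketch of pinning a Search action to the action bar (API 11 and later) follows. In this variant the item expands into an inline SearchView rather than navigating to a dedicated page; runSearch() is an assumed app-specific method.

import android.app.Activity;
import android.view.Menu;
import android.view.MenuItem;
import android.widget.SearchView;

public class StreamActivity extends Activity {
    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        MenuItem searchItem = menu.add("Search");
        searchItem.setIcon(android.R.drawable.ic_menu_search);
        // Show on the action bar when there is room; otherwise overflow.
        searchItem.setShowAsAction(MenuItem.SHOW_AS_ACTION_IF_ROOM);

        SearchView searchView = new SearchView(this);
        searchView.setOnQueryTextListener(new SearchView.OnQueryTextListener() {
            @Override public boolean onQueryTextSubmit(String query) {
                runSearch(query);
                return true;
            }
            @Override public boolean onQueryTextChange(String newText) {
                return false; // Let auto-suggest handle incremental input.
            }
        });
        searchItem.setActionView(searchView); // Expands in place when tapped.
        return true;
    }

    private void runSearch(String query) { /* app-specific search */ }
}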

Example

Google Plus offers an excellent example of this pattern (see Figure 7-10).

Figure 7-10: The Google Plus app uses the Search from Action Bar pattern at the top of the app.


Google Plus offers a dedicated Search button on the top action bar. Tapping the Search button navigates the user to the dedicated tabbed search page, with two search subdomains, Posts and People, displayed as tabs. Tabs are a common pattern in search, as discussed in Chapter 9, “Avoiding Missing or Undesirable Results.”

Another example of the dedicated Search button in the action bar is in the Android Messaging app.

In the Messaging app, the Search button is in the middle of the split action bar, which is at the bottom of the screen. Inconsistent? Sure. But relative freedom of placement of controls on the screen is a large part of the Android DNA (refer to Chapter 2, “What Makes Android Different”).

When and Where to Use It

Any time you have an action bar in your app that has some space on it and search is important to your customers, this pattern is a great choice. Ergonomically, placing the Search button on the bottom of the split action bar makes it easier to access the function one-handed.

Why Use It

Although I am not aware of any official position on the matter, it seems that the Google Android team has made a real effort to generally replace the Search from Menu pattern with the Search from Action Bar pattern, at least in native Google apps in Android 4.0. This is a strong signal that search remains important at Google. If search is likewise important to you, this pattern is an excellent choice and is now more or less “official” (to the extent that anything in Android can be considered official).

Other Uses

When the app’s screen real estate shrinks due to the size of hardware that runs it, some action bar functions may move into the overflow menu, as discussed in Chapter 1, “Design for Android: A Case Study.” In this case, the search function shown on the action bar might be forced into an overflow menu as well. To access the search function, the customer will have to tap the overflow menu and select Search—pretty straightforward.

Pet Shop Application

Contrast the Search from Action Bar pattern shown in Figure 7-12 with the Search from Menu pattern referred to in Figure 7-9.

Figure 7-12: Check out the Search from Action Bar in the Pet Shop app.


Both patterns enable access to the search page from anywhere in the application and use the same search page design. However, with the Search from Action Bar pattern, getting to the search page is accomplished via a single tap on the dedicated Search button on the action bar rather than the two taps required by the Search from Menu pattern. Search from Action Bar saves an extra tap and surfaces search much more prominently in the mind of the customer. There is a drawback, however: Using this pattern adds an action bar, which takes away precious pixels from the vertical space available for viewing content and products.

Tablet Apps

This is the standard search pattern to use in tablet apps. However, if you use the standard top action bar layout that places the search icon somewhere close to the middle of the action bar (refer to the Messaging app in Figure 7-11), your customers may get a severe case of what Josh Clark has dubbed “Tablet Elbow” if they must tap this button often (read more in Chapter 3). A better placement of this button is on the right or left nav bars, which run vertically along the edges of the device (see Chapter 14 to find out more about tablet-specific patterns).

Figure 7-11: The Android Messaging app includes the Search from Action Bar pattern in the split action bar at the bottom of the screen.


Caution

Similar to Search from Menu, Search from Action Bar can also lead to an awkward separation of the keyword query from the refinement tools. See the “7.9 Antipattern: Separate Search and Refinement” section.

Related Patterns

7.5 Pattern: Search from Menu
8.5 Pattern: Tabs
7.9 Antipattern: Separate Search and Refinement

7.7 Pattern: Dedicated Search

The search box is placed on top of the search results and does not scroll with them.

How It Works

The search box sits on top of the search results, which enables customers to easily edit and fine-tune the keyword query. Often, a refinement (filter) button is placed to the left or right of the search box.
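
Structurally, the pattern just means the search row is a fixed sibling of the scrolling list, not a child of it. Here is a sketch built programmatically for brevity; a real app would normally declare this structure in layout XML.

import android.app.Activity;
import android.os.Bundle;
import android.view.ViewGroup.LayoutParams;
import android.widget.EditText;
import android.widget.LinearLayout;
import android.widget.ListView;

public class ResultsActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        LinearLayout root = new LinearLayout(this);
        root.setOrientation(LinearLayout.VERTICAL);

        // The search box is pinned: it is outside the scrolling list.
        EditText searchBox = new EditText(this);
        searchBox.setHint("Search");
        root.addView(searchBox, new LinearLayout.LayoutParams(
                LayoutParams.MATCH_PARENT, LayoutParams.WRAP_CONTENT));

        // Only this view scrolls; weight=1 makes it fill the remaining space.
        ListView results = new ListView(this);
        root.addView(results, new LinearLayout.LayoutParams(
                LayoutParams.MATCH_PARENT, 0, 1f));

        setContentView(root);
    }
}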

Example

A great example of this pattern is Yelp, as shown in Figure 7-13.

Figure 7-13: The Yelp app includes a good example of the Dedicated Search pattern.


The dedicated search box in Yelp sits on top of the search results and does not scroll when the search results are scrolled. In addition, search tools, such as Filter and Map, are located on the same line as the search box.

When and Where to Use It

For apps in which search is a key part of functionality, the Dedicated Search pattern is an excellent choice. The Dedicated Search pattern shows clearly what keyword query yielded the search results and provides convenient, dedicated tools to change the query and access other refinements.

Why Use It

As Peter Morville and Jeff Callender so eloquently stated in their book Search Patterns (O’Reilly, 2010), “What we find changes what we seek.” Nowhere is this statement truer than in the mobile space, where typing is awkward and people are highly distracted with multitasking. People prefer to start general and refine rapidly, and changes to the keyword query are part of that refinement. The Dedicated Search pattern addresses this need with unmatched simplicity and elegance. The original keywords that the searcher types are always visible on top of the results and are retained in the search box for easy editing.

Other Uses

If additional filters and sort options are used with the keyword query, the Dedicated Search pattern combines well with the Filter Strip pattern that shows filters and query refinements (see Chapter 8, “Sorting and Filtering”). Together, these two patterns show the searcher the entire contents of a complex query.

Pet Shop Application

Figure 7-14 shows the implementation of the Dedicated Search pattern.

Figure 7-14: This is how the Dedicated Search pattern looks when used in the Pet Shop app.


This is a fantastic pattern for the Pet Shop app if you expect customers to edit their queries often.

Tablet Apps

Tablets are much less screen space–challenged than mobile phones. For most apps that use search, having a dedicated search box is an excellent idea. Simply having a dedicated search box on top of every page in the app implements the Dedicated Search pattern nicely.

Caution

Having a dedicated search box on top of the page does not mean that you need to give up the person’s history of previous searches or auto-correct functionality. Remember that previous searches can be easily presented via a layer under the search box (refer to the “7.2 Pattern: Auto-Complete and Auto-Suggest” section).

On smaller devices this pattern takes up a fair bit of vertical space (20 to 30 percent of the total screen space), which significantly reduces the number of products or the amount of content that can be shown to the customer. The Dedicated Search pattern is akin to reducing the number of books that can be shown on a bookstore shelf because of the giant sign that tells you the name of the section. It’s not always a bad thing, but it is something to keep firmly in mind.

Related Patterns

8.3 Pattern: Filter Strip
7.2 Pattern: Auto-Complete and Auto-Suggest

7.8 Pattern: Search in the Content Page

The search box is on top of the search results and is part of the content page, so it scrolls with the rest of the content. This pattern is an alternative to the Dedicated Search pattern.

How It Works

The basic premise of this pattern is that the search box is part of the content page. When the page first loads, the search box is shown to the customer. As the customer scrolls the content page down, the search box simply scrolls out of view with the rest of the content. To search, the customer must scroll back to the top of the page.
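
On Android, the simplest way to get this behavior is to add the search box as a ListView header, so it scrolls away along with the first row. A minimal sketch (contrast this with the pinned layout in the Dedicated Search pattern):

import android.app.Activity;
import android.os.Bundle;
import android.widget.EditText;
import android.widget.ListView;

public class NewsActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        ListView content = new ListView(this);

        // The header must be added before setAdapter() on older Android versions.
        EditText searchBox = new EditText(this);
        searchBox.setHint("Search news");
        content.addHeaderView(searchBox); // Header scrolls out of view with row 1.

        // content.setAdapter(...) with the news stream goes here.
        setContentView(content);
    }
}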

Example

The Twitter app makes an effort to have a consistent interface on iOS and Android, which makes it a good example of the Search in the Content Page pattern (see Figure 7-15).

Figure 7-15: This is how the Twitter app uses the Search in the Content Page pattern.


This pattern works well with the Pull to Refresh pattern described earlier.

When and Where to Use It

Any time you have a screen that is content-centric but might need to be occasionally searched, Search in the Content Page is a great option. However, make sure that your customers want to run only keyword queries and that the sort order is obvious and does not need to be changed. Ideally, people should never need to refine the query, because this pattern generally makes search refinement awkward.

Why Use It

This pattern is popular in iOS but is currently seldom used in Android. That’s a shame because it’s ideal for certain applications. In particular, content-centric screens such as name lists or activity streams such as updates, which are normally browsed but not searched, make great candidates for use of this pattern. The Search in the Content Page pattern makes search easily available but does not take up permanent screen space the way the Dedicated Search pattern does.

Other Uses

One modification popular in iOS but virtually unknown in Android is Scroll to Search. When a content page loads, the search box is hidden above the top of the page. Pulling the page down reveals the search box, which searches within the content on the page. After the query runs, the resulting page shows the search box with the query.

Pet Shop Application

This pattern is not suitable for e-commerce because it makes refinement awkward. However, you can use it for an update stream or Pet News section, where search is likely to be infrequent and made up of keyword queries (see Figure 7-16).

Figure 7-16: The Search in the Content Page pattern appears in the Pet News section of the Pet Shop app.


Tablet Apps

This pattern is all about saving space, so it is largely superfluous on tablets, which generally have enough space. However, it still has its place because it is easy to implement.

Caution

This pattern is currently rare on the Android platform but is quite widespread on iOS. The reasons for this are not clear. One possibility is that iOS enables a quick scroll to the top of the page (which thereby “jumps” to the search box) using a single tap in the middle of the top app bar. This single-tap jump-to-the-top shortcut is unavailable on Android because the top of the screen is normally occupied by the Android OS Notifications strip, which responds to the pull-down touch gesture. This could make frequent use of search in Search in the Content Page implementations problematic on Android because the person must deliberately scroll back to the top of the page “the long way” to reveal the search box.

For the Scroll to Search modification of this pattern described in the “Other Uses” section, the reason could be even simpler but more insidious. Although at this time I’m not aware of any limitation, Apple could be holding a patent on this pattern, so Android apps are generally prevented from using it (or it could be more popular in iOS simply from lack of screen space, which is less of a problem with larger Android devices). If you’re in doubt, use the simple version of this pattern implemented by Twitter as described in the “Example” section.

Related Patterns

7.7 Pattern: Dedicated Search
7.4 Pattern: Pull to Refresh

7.9 Antipattern: Separate Search and Refinement

An awkward experience results when the keyword query search box is separated by two or more taps from the other search refinements.

When and Where It Shows Up

Any time the keyword query and multiple complex refinement options are separated, you must pay attention. Although this shows up frequently on iOS, this antipattern is especially an issue on Android because of the widespread use of dedicated search pages, the result of the Search from Menu and Search from Action Bar patterns.

Example

It’s easy to mess up when blindly copying successful apps and applying a slightly different paradigm. For example, the Amazon app manages to pull off using Search from Menu and a separate keyword search page successfully by using a simple filter drop-down located in-page with the rest of the content (refer to Figure 7-8).

Contrast the Amazon app search and filter scheme with that in TheFind, as shown in Figure 7-17.

Figure 7-17: There’s an awkward separation of keyword search from the rest of the search refinements in this antipattern from TheFind.


The refinement page is a dedicated page with multiple text fields. One thing is conspicuously absent: the keyword search box. To change the keywords in the query, the user must tap the Menu button and then tap Search. This separation is completely artificial and therefore awkward, and it should be avoided.

Why Avoid It

In most people’s minds, search is an iterative activity. (Recall Peter Morville’s quote, “What we find changes what we seek.”) So in the mind of searchers, there is little separation between keywords, filters, and sort options. These are all tools to find what they want. Separate Search and Refinement is an antipattern precisely because it introduces awkward separation between the keyword query and everything else. This is neither wanted nor needed. Separate Search and Refinement breaks the association between different parts of the query and makes it difficult to find what you want and stay in the flow.

A better pattern, Parallel Architecture, or any of the simple faceted search patterns covered in Chapter 8, offers a more usable configuration.

Additional Considerations

Although it is often harder to recognize, the Separate Search and Refinement antipattern also occurs when search is presented in a different way on the homepage and on a separate search page. For example, TheFind app also offers a different search from the homepage, as shown in Figure 7-18.

Figure 7-18: In this antipattern there are two slightly different places for a keyword search on TheFind homepage and dedicated search page.


Although the two have similar search functionality at first glance (neither one has any refinements, for example), the homepage search doesn’t have the previous searches history widget that the dedicated Search page has. This “separate homepage search” antipattern is a child of the Separate Search and Refinement antipattern. It can be daunting to customers, who quickly get lost.

Unfortunately, this situation happens quite often and is much harder to recognize and prevent. Two great solutions for this issue are the Parallel Architecture pattern, where the homepage is the basic search page, and the Dedicated Search pattern, which presents a consistent search box and functionality on the homepage and search results pages.

In general, it’s a good idea to offer the same basic search functionality every time you have the search box. If you offer history and auto-suggest in one place, do it everywhere you use the basic search box. Also, avoid having multiple places for search that differ only slightly; it makes it too easy for people to get lost and confused and abandon search altogether.

Related Patterns

8.4 Pattern: Parallel Architecture
7.5 Pattern: Search from Menu
7.6 Pattern: Search from Action Bar
7.7 Pattern: Dedicated Search