Chapter 9

Interviews

Introduction

Interviews are one of the most frequently used methods for understanding your users. In the broadest sense, an interview is a guided conversation in which one person seeks information from another. There is a variety of interview types you can conduct, depending on your constraints and needs. Interviews are flexible and can be used as a solo activity or in conjunction with another user research activity (e.g., following a card sort or in combination with a contextual observation).

In this chapter, we discuss how to prepare for and conduct an interview, how to analyze the data, and how to communicate the findings to your team. We spend a good deal of time on how to construct good interview questions and how to interact with participants to get the best data possible. These processes are critical to a successful interview. Finally, we close this chapter with an illustrative case study about conducting interviews with children.

At a Glance

> Preparing to conduct an interview

> Conducting an interview

> Data analysis and interpretation

> Communicating the findings

Preparing to Conduct an Interview

Preparing for an interview includes selecting the type of interview to conduct, wording the questions, creating the materials, training the interviewer, and inviting observers. See Table 9.1 for a high-level timeline to follow as you prepare for an interview.

Table 9.1

Preparing for the interview session

| When to complete | Approximate time to complete | Activity |
| --- | --- | --- |
| As soon as possible | 1–2 weeks | Meet with team to identify questions; meet with team to develop user profile |
| After identification of questions and user profile | 1 week | Create and distribute activity proposal |
| After the proposal has been agreed to by all stakeholders | 1–2 weeks | Word questions appropriately and distribute to colleagues for review |
| After development of the questions | 2 weeks | Identify and recruit users; assign roles for the activity (e.g., notetaker, interviewer); prepare interview materials; acquire location; acquire incentives; prepare documentation (e.g., confidentiality agreement, consent form) |
| 1 week before activity | 2 days | Conduct pilot; make necessary changes to questions and procedure based on pilot |
| Day before interview | 1 hour | Call and confirm with participant(s); remind stakeholders to come and observe the activity (if appropriate) |
| Day of interview | 1 hour | Set up location with all materials necessary |


At a Glance

> Identify the objectives of the study

> Select the type of interview

> Decide now how you will analyze the data

> Write the questions

> Test your questions

> Players in your activity

> Inviting observers

> Activity materials

Identify the Objectives of the Study

When developing questions for interviews, it is easy to add more and more questions as stakeholders (i.e., product team, management, partners) think of them. That is why it is important for everyone to agree upon the purpose and objectives of the study from the beginning. These objectives should be included in your proposal to the stakeholders and signed off by all parties (see Chapter 6, “Creating a Proposal” section, page 116). As you and the stakeholders determine the type of interview to conduct and brainstorm questions for the interview, use the objectives of the study as a guide. If the type of interview suggested or the questions offered do not match the objectives that have been agreed upon, the request should be denied. This is much easier to do once the proposal has been signed off, rather than trying to get the agreement halfway through the process.

Select the Type of Interview

Interviews vary by the amount of control the interviewer places on the conversation. There are three main types of one-on-one interview:

 Unstructured

 Structured

 Semi-structured

An unstructured interview is the most similar to a normal conversation. The interviewer will begin with general goals but will allow the participant to go into each point with as much or as little detail and in the order he or she desires. The questions or topics for discussion are open-ended (see Chapter 10, “Response Format” section, page 275), so the interviewee is free to answer in narrative form (i.e., not forced to choose from a set of predetermined answers), and the topics do not have to be covered in any particular order. The interviewer is also free to deviate from the script to generate additional relevant questions based on answers given during the interview. When deviating from the script, it is important to be able to think on your feet so you can focus on getting the most important information, even if the question that will get you that information is not in the script. It is a good idea to have a list of carefully worded follow-up questions already prepared so you do not have to come up with these on the fly.

A structured interview, on the other hand, is the most controlled type because the goal is to offer each interviewee the same set of possible responses. The interview may consist primarily of closed-ended questions (see Chapter 10, “Response Format” section, page 275), where the interviewee must choose from the options provided. This is limiting because a closed-ended question is effective only if the choices offered are complete and presented in the users’ vernacular. Furthermore, participants tend to confine their answers to the choices offered, even if the choices are not complete. Open-ended questions may be asked, but the interviewer will not delve into the participant’s responses for more details or ask questions that are not listed on the script. All possible questions and follow-ups are preplanned; therefore, the data gathered across participants are consistent. Thus, differences in results are attributable to real differences between participants rather than differences in measurement technique. This type of interview is similar to conducting a survey verbally but has the added benefit of allowing participants to explain their answers. It is used by organizations like the Census Bureau and Bureau of Labor Statistics.

Tip

When using only closed-ended questions, be sure to provide an “out” (e.g., “none of the above”). Otherwise, a participant whose opinion or experience differs from the options offered will be forced to provide an inaccurate answer. Because closed-ended questions are more typical in surveys, you can find a discussion of the different types of closed-ended question and their uses in Chapter 10, “Response Format” section on page 275.

A semi-structured interview is a combination of the structured and unstructured types. The interviewer may begin with a set of questions to ask (closed-ended and open-ended) but deviate from their order, and even from the set of questions itself, from time to time. The interview does not have quite the same conversational approach as an unstructured one.

When determining the type of interview to conduct, keep the data analysis and objectives of the study in mind. By first deciding the type of interview you plan to conduct, you know the type of questions you will be able to ask. As with any method, there are pros and cons to each type of interview (see Table 9.2).

Table 9.2

Comparison of the three types of interview

| Interview type | Type of data received | Pros | Cons |
| --- | --- | --- | --- |
| Unstructured | Qualitative | Rich data set; ability to follow up and delve deeper on any question; flexible; especially useful when you do not know what answers to expect | Difficult to analyze; the topics and follow-up questions may not be consistent across participants |
| Semi-structured | Combination | Provides both quantitative and qualitative data; provides some detail and an opportunity to follow up | Takes some additional time to analyze participants’ comments; not as consistent across participants as the structured interview |
| Structured | Quantitative | Faster to analyze; questions asked are consistent across participants; you can generally ask more questions than in an unstructured interview | You may not understand why you got the results you did because participants were not given an opportunity to explain their choice |


In Person or Mediated?

Regardless of the type of interview you select, you have the option of conducting the interviews in an increasingly large number of ways, including (listed from the least mediated to the most mediated) in person, using video conferencing/video chat, over the phone, or via text chat. Mediated interviews are easier than in-person interviews in some respects. For example, when conducting interviews via text chat, there is no need to have the conversation transcribed, because all data are already captured as text. It is also usually more convenient for both the participant and the interviewer to conduct interviews via a mediated channel, because less time is spent on travel. However, there are also disadvantages to conducting interviews that are not face-to-face:

Participants may end telephone interviews sooner than they would face-to-face interviews. It can be difficult to keep participants on the phone for more than 20 minutes. If you are cold-calling participants (i.e., the interview is not prescheduled), the biggest challenge may be keeping the participant’s attention. If the participant is not in a quiet location, it is easy for his or her colleagues or children to come in during your interview with a “quick” question.

 Similarly, in all mediated interviews except video chat, both participants and the interviewer lack cues about identity and lack nonverbal communication cues. This may lead participants to take a more cautious approach to revealing sensitive information (Johnson, Hougland, & Clayton, 1989). It also means that you cannot watch the participant’s body language, facial expressions, and gestures, which can all provide important additional information.

 Phones can be perceived as impersonal, and it is more difficult to develop a rapport with the participant and engage him or her over the phone.

An additional benefit of computer-mediated interviews over phone-mediated interviews is that you can show artifacts to participants if appropriate. Figure 9.1 provides a checklist that can help you determine which type of interview is most appropriate.

Figure 9.1 Checklist to determine whether a face-to-face or mediated interview is most appropriate for your study.

Tip

Aim for one-hour interview sessions. If needed, two-hour sessions may be used. We do not recommend conducting interview sessions beyond two hours. Dedicated participants or customers may be willing to stay for longer interviews if you provide regular breaks and keep the session interesting.

Decide Now How You Will Analyze the Data

Whether you are asking closed-ended or open-ended questions, there are tools and methods available to help you analyze your data. For a discussion of how to analyze open-ended questions, see “Data Analysis and Interpretation” section on page 252. For a discussion of how to analyze closed-ended questions, refer to Chapter 10 on page 266.

However you choose to analyze your data, you should analyze some sample data from your pilot interviews to ensure that your analysis strategy is achieving the results you want and to help you understand whether you need to adjust your questions (see Chapter 6, “Piloting Your Activity” section, page 155). If you plan to use a new tool, be sure to allow time to purchase any necessary software and learn how to use it.
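
If it helps, you can sanity-check your analysis plan on pilot data with a small script. The sketch below is a minimal, hypothetical example, assuming your notetaker has logged each pilot interview as a row in a CSV file containing one closed-ended answer and semicolon-separated codes for one open-ended answer; the file name, column names, and codes are invented for illustration and are not part of any standard tool.

```python
import csv
from collections import Counter

# Hypothetical pilot data: one row per participant, with a closed-ended answer
# ("booking_frequency") and the codes assigned to an open-ended answer
# ("codes", separated by semicolons).
closed_counts = Counter()
code_counts = Counter()

with open("pilot_interviews.csv", newline="") as f:
    for row in csv.DictReader(f):
        closed_counts[row["booking_frequency"]] += 1
        code_counts.update(c.strip() for c in row["codes"].split(";") if c.strip())

print("Closed-ended tallies:", dict(closed_counts))
print("Most common qualitative codes:", code_counts.most_common(5))
```

If the tallies or code counts do not speak to the objectives in your proposal, that is a sign to adjust either the interview questions or the analysis plan before running the full study.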

Identify the Questions

It is now time to identify all the questions to ask. You may want to do initial brainstorming with all stakeholders (members of the product team, UX, marketing). You will likely end up with questions that are out of the scope of the proposed activity or questions that are best answered by other activities. The out-of-scope questions should be discarded, and the ones better answered by other activities should be set aside until those activities can be performed. Even after limiting your questions in this way, you may still end up with more questions than you could possibly cover in a single interview. If this is the case, you will either need to conduct multiple sessions or focus on the most important topics so that you can trim the number of questions to fit into a single session.

Write the Questions

Once you have a list of topics you are interested in, write a first draft of the questions. One way to start this process is to make sure you have questions that fit each category necessary for the interview flow. Table 9.3 presents an overview of an idealized interview flow. The next step is to word the questions so that they are clear, understandable, and impartial. In the next section, we describe how to write good questions and provide examples and counterexamples.

Table 9.3

Idealized interview flow



Brevity

Questions should be kept short—usually 20 words or less. It is difficult for people to remember long or multiple-part questions. Break long, complex questions into two or more simple questions.

Wrong: “If you were waiting until the last minute to book a plane ticket to save money and the only seats available were on the red-eye flight or had two layovers, what would you do?”

Right: “If you were paying for the ticket on a four-hour airplane trip, would you take a late-night/dawn-arrival flight that cost half as much? < answer > Would you accept a change of planes with a two-hour delay? < answer > What if you saved a quarter of the direct-flight fare?”

Clarity

Avoid double-barreled questions that address more than one issue at a time. Introducing multiple issues in a question can be confusing. The example below addresses multiple issues: the frequency of traveling, the frequency of booking travel online, and the purpose for booking travel online. These should be asked separately.

Wrong: “Do you regularly book your travel online to save money?”

Right: “How often do you travel? < answer > What proportion of that do you book online? < answer > Why do you book travel online?”

Vague questions, too, can cause difficulty during an interview. Avoid imprecise words like “rarely,” “sometimes,” “usually,” “few,” “some,” and “most.” Individuals can interpret these terms in different ways, affecting their answers and your interpretation of the results.

Wrong: “Do you usually purchase plane tickets online?”

Right: “How often do you purchase plane tickets online?”

A final challenge to creating clear questions is the use of double negatives. Just as the name implies, double negatives insert two negatives into the sentence, making it difficult for the interviewee to understand the true meaning of the question.

Wrong: “Do you no longer book travel on the TravelMyWay app because they do not offer travel rewards?”

Right: “Tell me a little bit more about why you stopped using the TravelMyWay app. < answer > What was the main reason?”

Avoiding Bias

As we mentioned earlier, there are a number of ways in which you might introduce bias into questions. One way is with leading questions. These questions assume the answer and may pass judgment on the interviewee. They are designed to influence a participant’s answers rather than elicit the participant’s true opinion.

Wrong: “Most of our users prefer the new look and feel of our app over the old one. How do you feel?”

Right: “How do you feel about the visual appearance of this app?”

Wrong: “Would you agree that it is more convenient to book travel using an app rather than through a travel agent?”

Right: “What is the greatest advantage to you of booking travel using an app? < answer > What is the greatest disadvantage? < answer > How does booking using an app compare to booking through a travel agent?”

Leading questions are rather obvious and easy to pick up on. Loaded questions are subtler in their influence. They typically provide a “reason” for a problem within the question itself. Political campaigns are one place where loaded questions can often be observed. Politicians may use this type of question in an attempt to demonstrate that a majority of the population feels one way or another on a key issue.

Wrong: “The cost of airline tickets continues to go up to cover security costs. Do you think you should have to pay more when you buy your ticket, or should the government start paying more of the cost?”

Right: “How much are you willing to pay for your plane ticket to cover additional security at the airport?”

The question above suggests that the reason for increasing travel costs is increased security costs. It is clear how the interviewer in the first (wrong) question would like the participant to answer. This is also an example of a question based on a false premise. The example implies that the government has not paid for additional security costs and should now start to do so. These types of questions begin with a hypothesis or assumption that may not be fully accurate or can be easily misinterpreted. Not only is this type of question unethical, but also the data you get in the end are not valid.

The final type of bias is interviewer prestige bias. In this case, the interviewer informs the interviewee that an authority figure feels one way or another about a topic and then asks the participant how he or she feels.

Wrong: “Safety experts recommend using a travel agent instead of booking your travel via an app. Do you feel safe using travel apps?”

Right: “Do you feel that booking travel via an app is more or less confidential than booking through a travel agent?”

Focus on Outcomes

Rather than asking your participants for solutions to problems or asking them to identify a specific product feature they might like, ask questions that help you ascertain the outcomes they need to achieve. People often do not know what works for them or what they will actually like until they have actual experience with a service or feature.

Wrong: “What are your thoughts about a new feature that allows you to instant message a travel agent with any questions as you book your travel?”

Right: “Would you like to correspond with a travel agent while you are booking travel? < answer > What are some ways that you would like to correspond with a travel agent while you are booking travel?”

The “wrong” question above would be a problem if the participant was unfamiliar with instant messaging. In that case, you would want to speak in broader terms and ask the participant to discuss communications methods with which he or she is familiar.

If you task customers with the responsibility of innovation, many problems can occur: small iterative changes rather than drastic, pioneering changes; suggestions limited to features available in your competitors’ products (because these are what your participants are familiar with); and features that users do not actually need or use (most users use less than 10% of a software product’s features). Focusing on outcomes gives you a better understanding of what users really need. With a thorough understanding of what participants need, you can allow developers, who have more technical knowledge and desire to innovate than users, to create innovative solutions that meet those needs.

To do this, the interviewer must understand the difference between outcomes and solutions and ask the appropriate probes to weed out the solutions while getting to the outcomes.

Participant: “All the flight times presented on an iPhone.” (Solution)

Interviewer: “Why would you like the flight time presented on an iPhone? What would that help you do?” (Probe for outcome)

Participant: “I travel a lot so I need to see when flight times change and I take my iPhone with me everywhere.” (Outcome)

Interviewer: “So you would like to have access to flight times from anywhere?” (Rephrasing to confirm outcome)

Participant: “Exactly.” (Confirmation)

Avoid Asking Participants to Predict the Future

Do not expect participants to be able to predict their future. Instead, focus on understanding what they want now. You will be able to translate that into what they will need in the future.

Wrong: “If we had a feature that let you communicate with all the other passengers on a plane using your smartphone, would you use it?”

Right: “Would you like to chat with other passengers while you are on a plane waiting for takeoff?”

Inaccessible Topics

You have screened your participants against the user profile, but you still may encounter those who cannot answer all of your questions. A participant may not have experience with exactly what you are asking about or may not have the factual knowledge you are seeking. In these cases, be sure a participant feels comfortable saying that he or she does not know the answer or does not have an opinion. Forcing a participant to choose an answer will only frustrate the participant and introduce error into your data.

Begin the interviews by informing participants that there are no right or wrong answers—if they do not have an opinion or experience with something, they should feel free to state that. Keep in mind that interviewees are often eager to please or impress. They may feel compelled to answer and therefore make a best guess or force an opinion that they do not actually have. Encourage them to be honest and feel comfortable enough to say they cannot answer a particular question.

Depending on Memory

Think about the number of times you have booked a rental car in the last three years. Are you confident in your answer? Interviewers and surveys often ask people how frequently they have done a certain task in a given period. If the question seeks information about recent actions or highly memorable events (e.g., your wedding, college graduation), it probably will not be too difficult. Unfortunately, people are often asked about events that happened many years ago and/or those that are not memorable.

What is key is the importance or salience of the event. Some things are easily remembered because they are important or odd and therefore require little effort to remember. Other things are unmemorable—even if they happened yesterday, you would not remember them. In addition, some memories that seem real may be false. Since most participants want to be “good” interviewees, they will try hard to remember and provide an accurate answer, but memory limitations prevent it. Underreporting and overreporting of events frequently happen in these cases.

In addition to memory limitations, people have a tendency to compress time. This response bias is known as telescoping. This means that if you are asking about events that happened in the last six months, people may unintentionally include events that happened in the last nine months. Overreporting of events will result in these cases.

To help mitigate these sources of error, you should avoid questions covering unmemorable events. Focus the questions on salient and easy-to-remember events. You can also provide memory aids like a calendar and/or provide specific instructions alerting participants to potential memory confusion and encouraging them to think carefully to avoid such confusion (Krosnick, 1999). Finally, if you are truly interested in studying events over a period of time, you can contact participants in advance and ask them to track their behaviors in a diary (see Chapter 8, “Incident Diaries” section, page 202). This means extra work for the participant, but the data you receive from those dedicated individuals who follow through will likely be far more accurate than from those who rely on memory alone.

Other Types of Wording to Avoid

Avoid emotionally laden words like “racist” and “liberal.” Personal questions should be asked only if absolutely necessary and then with great tact. This includes questions about age, race, and salary. Figure 9.2 describes how to identify questions that might be perceived as threatening.

Figure 9.2 Checklist to determine whether a question might be perceived as threatening (Alreck & Settle, 1995).

Jargon, slang, abbreviations, and geek speak should be avoided unless you are certain that your user population is familiar with this terminology. Speaking plainly and in the user’s language (as long as you understand it) is important. And of course, take different cultures and languages into consideration when wording your questions (see Chapter 6, “Recruiting International Participants” section, page 147). A direct word-for-word translation can result in embarrassing, confusing, or misinterpreted questions. Figure 9.3 provides a checklist of dos and don’ts in question wording.

Figure 9.3 Dos and don’ts in question wording.

Test Your Questions

It is important to test your questions to ensure that you are covering all the topics you mean to, to identify questions that are difficult to understand, and to ensure that questions will be interpreted as you intend. If you are conducting an unstructured interview, test your list of topics. If you are conducting a structured interview, test each of your questions and follow-up prompts. Begin with members of your team who have not worked on the interview so far. Each team member should be able to summarize the type of information you are looking for in a question. If a colleague incorrectly states the information you are seeking, you need to clarify or reword the question. Find someone who understands survey and interview techniques to check your questions for bias. If no one in your organization has experience with survey and interview techniques, you may be able to ask a colleague in another organization to review your questions for bias, as long as the questions do not reveal proprietary information.

Once the questions have passed the test of colleagues, conduct the interview with a couple of actual participants. How did the participants react to the questions? Did they answer your question, or did it seem like they answered another question? Was everything clear? Were you able to complete the interviews within the allotted time frame?

Players in Your Activity

In addition to users (the “participants”), you will require three other people to run the sessions. In this section, we discuss the details of all the players involved in an interview.

The Participants

Participants should be representative of your existing end users or the users that you expect will be interacting with your system.

Number of Participants

We find in industry settings that a common sample size for interview studies is six to ten participants of each user type, each asked the same set of questions. However, there are several factors to take into consideration, including whether you are seeking qualitative or quantitative data and the size of the user population. For qualitative studies, some researchers suggest that 12 participants are sufficient for discovering new themes (Guest, Bunce, & Johnson, 2006). Other researchers suggest that 30 or more participants are needed for interviews seeking quantitative data (Green & Thorogood, 2009). See Chapter 6 on page 116 for a more thorough discussion of how many participants you should recruit for a user research activity.
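
One practical way to judge whether you have interviewed enough people for a qualitative study is to track how many new themes each additional interview surfaces; when additional interviews stop adding new themes, you are approaching saturation. The sketch below is a minimal, hypothetical illustration of that bookkeeping, with invented theme names; it is a supplement to, not a substitute for, the guidance cited above.

```python
# Themes (codes) observed in each successive interview (hypothetical data).
themes_per_interview = [
    {"price", "trust", "timeouts"},   # interview 1
    {"price", "rewards"},             # interview 2
    {"trust", "mobile"},              # interview 3
    {"price", "mobile"},              # interview 4: nothing new
]

seen = set()
for i, themes in enumerate(themes_per_interview, start=1):
    new = themes - seen            # themes not observed in any earlier interview
    seen |= themes
    print(f"Interview {i}: {len(new)} new theme(s): {sorted(new)}")
```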

The Interviewer

The task of the interviewer is to develop rapport with the interviewee, elicit responses from the participant, know what an adequate answer is to each question, examine each answer to ensure that you understand what the participant is really saying, and, in some cases, paraphrase the response to make sure that the intent of the statement is captured. In addition, the interviewer needs to know when to let a discussion go off the “planned path” into valuable areas of discovery and when to bring a fruitless discussion back on track. The interviewer needs to have sufficient domain knowledge to know which discussions are adding value and which are sapping valuable time. He or she also needs to know what follow-up questions to ask on the fly to get the details the team needs to make product decisions.

We cannot stress enough how important it is for interviewers to be well-trained and experienced. Without this, interviewers can unknowingly introduce bias into the questions they ask. This will cause participants to provide unrepresentative answers or to misinterpret the questions. Either way, the data you receive are inaccurate and should not be used. People new to interviewing may wish to take an interviewing workshop—in addition to reading this chapter—to better internalize the information. Another way to gain experience is to offer to serve as the notetaker for an experienced interviewer. We are always looking for an eager notetaker to help out, and you will get the opportunity to see how we handle various situations that arise.

We strongly recommend finding someone who is a skilled interviewer and asking him or her to review your questions. Then, practice interviewing him or her, and ask specifically for feedback on anything you may do that introduces bias. The more you practice this skill, the better you will become and the more confident you can be in the accuracy of the data you obtain. Co-interviewing with a skilled interviewer is always helpful.

Finally, although we all hate to watch ourselves on video or listen to audio of ourselves, it is very helpful to watch or listen to yourself after an interview. Even experienced interviewers fall into bad habits, and watching yourself on video can make you aware of them and help break them. Having an experienced interviewer watch or listen with you helps because he or she can point out areas for improvement. If you involve other members of the team, such as product managers, they may also be able to point out additional topics of interest or request that you focus on areas of concern. Ideally, this can be a fun, informative team-building activity.

The Notetaker

You may find it useful to have a colleague in the same room or another room who is taking notes for you. This frees you from having to take detailed notes. Instead, you can focus more of your attention on the interviewee’s body language and cues for following up. Depending on the situation, the notetaker may also act as a “second brain” who can suggest questions the primary interviewer might not think of in context (see Chapter 7, “Recording and Notetaking” section on page 171, for more details on capturing the data.)

The Videographer

Whenever possible, video record your interview session (see Chapter 7, “Recording and Notetaking” section on page 171 for a detailed discussion of the benefits of video recording). You will need someone to be responsible for making the recording. In most cases, this person simply needs to start and stop the recording, insert new storage devices (e.g., SD cards) as needed, and keep an eye out for any technical issues that arise. He or she can also watch audio and light levels of the recording and suggest changes as needed. Be sure to practice with the videographer in advance. Losing data because of a video error is a common, yet entirely preventable, occurrence.

Inviting Observers

As with other user research techniques, we do not recommend having observers (e.g., colleagues, product team members) in the same room as the participant during the interview. If you do not have the facilities to allow someone to observe the interviews from another room but you do want stakeholders to be present, it is best to limit the observers to one or two individuals. Any more than this will intimidate the person being interviewed. The observers should be told explicitly, prior to the session, not to interrupt the interview at any time.

Tip

Spend some time prior to the interview informing the observers about the importance of asking unbiased questions and the impact different types of bias can have on the data you collect. You may even choose to do some role-playing with the observers, having them interview you. Identify any biased questions they ask and tell them how the question would be asked without bias. Once observers understand the difficulty of asking unbiased questions and the impact biased questions have on the data, you will find that they will be much more likely to respect your request for their silence.

It is up to you whether to invite observers to suggest questions for the participants. All questions observers suggest should follow the same guidelines discussed above (e.g., avoid bias, keep them brief). Since you cannot control what the observers say, we recommend asking them to write questions on paper and pass them to you at the end of the session. This works well in cases where you do not have the domain knowledge to ask and follow up on technical questions. You then have the opportunity to reword a loaded or double-barreled question. Alternatively, you can work with the observers prior to the session to devise a set of follow-up questions that they may ask if time permits. You may also permit observers to view a session remotely and use a chat feature to suggest questions to you in real time. However, this can be difficult to manage, since it requires you to keep an eye on the chat, figure out the appropriate moment during the interview to ask the question, reword it on the fly, and know when to skip a suggested question.

Activity Materials

You will need the following materials for an interview (the use of these will be discussed in more detail in the next section):

 Protocol

 List of questions for interview

 Method of notetaking (laptop and/or paper and pencil on clipboard)

 Method of recording (video or audio recorder)

 Comfortable location for participant, interviewer, notetaker, and video equipment

 Memory aids (e.g., calendar) (optional)

 Artifacts to show the participant (optional)

Conducting an Interview

At a Glance

> Things to be aware of when conducting interviews

> The five phases of an interview

> Your role as the interviewer

> Monitoring the relationship with the interviewee

> Dos and don’ts

Things to Be Aware of When Conducting Interviews

As with all user research activities, there are some factors that you should be aware of before you jump into the activity. In the case of interviews, these include bias and honesty.

Bias

It is easy to introduce bias into an interview. Choice of words, your way of speaking, and your body language can all introduce bias. Bias unfairly influences participants to answer in a way that does not accurately reflect their true feelings. Your job as an interviewer is to put aside your own ideas, feelings, thoughts, and hopes about a topic and elicit those things from the participant. A skilled interviewer will word, ask, and respond to questions in a way that encourages a participant to answer truthfully and without worry of being judged. This takes practice, and lots of it.

Honesty

Individuals who are hooked on performance metrics or who question the value of “anecdotal” data may frown upon interviews. Sometimes, people ask how you know a participant is telling the truth. The answer is that people are innately honest. It is an extremely rare case that a participant comes into your interview with the intention of lying to you or not providing the details you seek.

However, there are factors that can influence a participant’s desire to be completely forthcoming. Participants may provide a response that they believe is socially desirable or more acceptable rather than the truth. This is known as social desirability. Similarly, a participant may describe the way things are supposed to happen rather than the way things actually happen. For example, a participant may describe the process he or she uses at work according to recommended best practice, when, in actuality, the participant uses shortcuts and work-arounds because the “best practice” is too difficult to follow—but the participant does not want to reveal this. Make it clear that you need to understand the way he or she actually works. If work-arounds or shortcuts are used, it is helpful for you to understand this. And of course, remind the participant that all information is kept confidential—the employer will not receive a transcript of the interview.

A participant may also just agree to whatever the interviewer suggests in the belief that it is what the interviewer wants to hear. Additionally, a participant may want to impress the interviewer and therefore provide answers that increase his or her image. This is called prestige response bias. If you want the participant to provide a certain answer, he or she can likely pick up on that and oblige you. You can address these issues by being completely honest with yourself about your stake in the interview. If you understand that you have a stake in the interview and/or what your personal biases are, you can control them when writing questions. You can also word questions (see “Write the Questions” section, page 225) and respond to participants in ways that can help mitigate these issues (e.g., do not pass judgment, do not invoke authority figures). You should be a neutral evaluator at all times and encourage the participant to be completely honest with you. Be careful about raising sensitive or highly personal topics. A survey can be a better option than interviews if you are seeking information on sensitive topics. Surveys can be anonymous, but interviews are much more personal. Participants may not be forthcoming with information in person. On the other hand, if you are skilled at being a sympathetic listener and you are unafraid to ask questions that might be sensitive, interviews can be used. For more discussion on this topic, see “Asking the Tough Questions” section on page 242.

If the participant is not telling the complete truth, this will usually become apparent when you seek additional details. A skilled interviewer can identify the rare individual who is not being honest and disregard that data. When a participant is telling a story that is different from what actually happened, he or she will not be able to give you specific examples but will speak only in generalities.

Tip

With continued prodding, a dishonest participant will likely become frustrated and attempt to change the subject. If you doubt the veracity of a participant’s responses, you can always throw away the data and interview another participant. Refer to “Know When to Move On” section on page 247.

You are now prepared to conduct an interview. In this section, we walk you through the steps.

First, Table 9.4 covers in general the sequence and timing of events to conduct an interview. It is based on a one-hour session and will obviously need to be adjusted for shorter or longer sessions. These are approximate times based on our personal experience and should be used only as a guide.

Table 9.4

Timeline for a one-hour interview (approximate times)

| Approximate duration | Procedure |
| --- | --- |
| 5–10 minutes | Introduction (welcome participant, complete forms, and give instructions) |
| 3–5 minutes | Warm-up (easy, nonthreatening questions) |
| 30–45 minutes | Body of the session (detailed questions); this will vary depending on the number of questions |
| 5–10 minutes | Cooling-off (summarize interview; easy questions) |
| 5 minutes | Wrap-up (thank participant and escort him or her out) |

Interviewing is a skill and takes practice. You should observe the five phases of an interview and monitor the interviewing relationship throughout.

The Five Phases of an Interview

Whether the interview lasts ten minutes or two hours, a good interview is conducted in phases. There are five main phases to be familiar with.

The Introduction

This should not be too long. If it is over ten minutes, you are probably giving the participant too many instructions and other things to remember. This is the first opportunity you have to encourage participants to answer honestly and feel free to say when they cannot answer one of your questions. The following is a sample introduction:

Begin by introducing yourself. If there are other observers in the room, introduce them, too: “My name is Jane Doe and I work for TravelMyWay.”

Thank the participants for coming: “Thank you for coming in today! We will spend the next hour talking about your experience booking travel online.”

State the purpose of the interview and why the participant has been asked to participate: “I understand that you are a current user of the TravelMyWay app and that you signed up for our Customer Participation Program. We would like to learn how we could improve our site to better suit our customers' needs.”

If there are people watching in another room, mention their presence in generic terms: “We have a couple members of the product development team in another room watching this session, and …”

Be sure to ask permission to audio or video record the session: “… if you do not mind, I would like to make a recording.”

Say why you want to record the session: “This will allow me to go back at a later time and review your comments so that I am not distracted from our conversation by taking notes.”

Help the participant understand that you think of them as the expert, that you are not invested in them liking any product or company, and that you will not judge their answers: “I am not a member of the product team. I am a neutral evaluator, so nothing you say today will hurt my feelings. Your honest opinions can only help us improve our product. If you do not have an opinion or cannot answer any of the questions I ask, please feel free to say so.”

Explain any confidentiality agreements and consent forms that must be signed: “Since this product is not on the market yet, you will need to sign a nondisclosure agreement in which you promise not to discuss this product with anyone until it is put on the market or until two years from now.”

Make sure participants understand they are not obliged to continue the session and may leave, without consequence, at any time: “You are free to leave at any time. Please stop me at any point if you have questions.”


Warm-Up

Always start an interview with easy, nonthreatening questions where you are sure you will get positive answers to ease the participant into the interview. You can confirm demographic information (e.g., occupation, company), how the participant first discovered your product, etc. You may even allow the participant to vent his or her top five likes and dislikes of your product. The participant should focus his or her thoughts on your product and forget about work, traffic, the video cameras, and so on. This is best done with easy questions that feel more like a conversation and less like a verbal survey or test. It is best to avoid asking seemingly innocuous questions like “Do you enjoy working here?” or “Tell me about a recent problem you had.” Negative questions or ones that might elicit a negative response tend to taint the rest of the interview.

Five to ten minutes may be sufficient for the warm-up, but if the participant is still clearly uncomfortable, this could be longer. However, do not waste the participant’s time (and yours) with useless small talk. The warm-up should still be focused on the topic of the interview.

Body of the Session

Here is where you should ask the questions you wrote and tested (see “Write the Questions” section, page 225, and “Test Your Questions” section, page 233). Present questions in a logical manner (e.g., chronological), beginning with general questions and moving into more detailed ones. Avoid haphazardly jumping from one topic to another. This should be the bulk (about 80%) of your interview time with the participant.

Tip

Another reason for practicing the interview questions early on is that it is not always obvious what will be “logical” in an interview. One trick is to have someone else read the interview script aloud. It is easier to notice a non sequitur when someone else is asking the questions than when you are reading your own writing.

Cooling-Off

Your interview may have been intense with very detailed questions. At this point, you may want to pull back and ask more general questions or summarize the interview. Ask any follow-up questions in light of the entire interview. One cool-off question we like in particular is “Is there anything else I should have asked you about?” This is a great trick that will often pivot an entire interview.

Wrap-Up

You should demonstrate that the interview is now at a close. Some people like to do this by closing a notebook and putting away their pen (if they were taking notes), changing their seat position, or turning off the audio and/or video recorder. This is a good time to ask whether the participant has any questions for you. Thank the person for his or her time.

Your Role as the Interviewer

Think of your job as an interviewer as that of a coach who helps participants provide the information you need. “Active listening” means that you must judge whether each response has adequately addressed your question, be on the lookout for areas to explore more deeply, and monitor the interviewing relationship throughout. Interviewing is an active process because you know the information you are seeking and must coax that information from the participant. Because of this, interviewing can be an intense activity for the interviewer.

Keep on Track

It is easy for unstructured interviews to go off track. The participant may go into far more detail than you need but not know that. A participant may also digress to topics that are outside the scope of the study. It is your job to keep the participant focused on the topic at hand and move on to the next topic when you have the information needed. Below are some polite comments to get participants back on track or to move them on to a new topic:

I can tell that you have a lot of detail to provide about this, but because of our time constraints, I need to move on to a new topic. If we have time at the end, I would like to come back to this discussion.

That’s really interesting. I was wondering if we could go back to topic XYZ for a moment …

I’m sorry to stop you, but a moment ago, you were discussing XYZ. Can you tell me more about that?

Silence Is Golden

One of the most difficult skills in interviewing is patience. You never want to complete participants’ thoughts for them or put words in their mouths. Give each participant time to complete his or her thoughts. If you do not have enough information after adequate silence, then follow up with another question or restate the participant’s answer (see “Reflecting” section, page 248). Of course, if the participant is struggling with a word and you are sure you know what the participant is trying to say, offer the word or phrase the participant is searching for, especially if the participant says, “It’s on the tip of my tongue. Do you know what I’m talking about?”

Think of silence as a tool in your tool belt. An interviewee may wonder how much detail you are looking for in response to a question. In that case, he or she will likely provide a brief answer and then wait to see whether you move on. If you do not, the participant has been “given permission” to provide more detail. Counting to five before either moving on to the next question or probing for more details can provide adequate time for the participant to continue. Counting to ten can seem uncomfortable to both you and the participant but can be a useful method for coaxing a response from a reticent interviewee. Always pay attention to the participant’s body language (e.g., sitting forward, poised to make another statement) to determine whether he or she has more to say.

It is possible to go too far with your pauses and risk giving participants the silent treatment. That is why acknowledgment tokens are so important. Acknowledgment tokens are words like “oh,” “ah,” “mm hm,” and “uh huh” that carry no content. Since they are free of content, they are unobtrusive and require almost no processing by the participant, so he or she can continue unimpeded with a train of thought. These devices reassure participants that you hear them, understand what is being said, and want them to continue. Speakers expect a reaction from listeners, so acknowledgment tokens complete the “conversational loop” and keep the interviewing relationship a partnership, rather than a one-way dialog. Tokens like “mm hm” and “uh huh” are called “continuers” because they are not intrusive or directive. Tokens like “OK” and “yeah” imply agreement, which you may not want to imply to participants, revealing your personal opinions (Boren & Ramey, 2000). However, this varies hugely from culture to culture. For example, in Japanese, if an interviewer failed to say “hai” (yes) often, it would be considered very rude.

Remain Attentive

Have you had the experience where someone has been talking for the past several minutes and you have no idea what he or she has been saying? If you are tired or bored, it is easy to zone out. Obviously, this is a faux pas in any conversation but particularly problematic in an interview. If you are tired or bored, there is a good chance that the participant is, too.

Take a break at a logical stopping point. This will give you a chance to walk around and wake up. Evaluate how much you have engaged the participant in the interview. If this is an unstructured interview, you should be engaging the participant frequently for clarification, examples, and reflection. If it is a highly structured interview, the interviewee’s answers should be short, followed by your next question. In either case, you should be engaging in the interview (without interrupting the participant, of course). After the break, take a moment to ask the interviewee to briefly summarize his or her last response. This will help the interview pick up where it left off and get you back on track.

Tip

Running multiple interviews in one day may speed up the information-gathering process, but it will also leave you exhausted and without adequate time to debrief with your notetaker and any other observers to discuss what you have learned. We recommend running no more than four one-hour interviews per day. Conducting more interviews per day for several days in a row will degrade the quality of your data and lengthen the time it takes you to analyze those data.

If you are going from city to city conducting interviews and must conduct, for example, seven interviews in two days, you do not have much of a choice. In this case, we recommend bringing a colleague to tag-team the interviews with you. Alternating the roles of notetaker and interviewer can give you both a modest break. At least you will not have to be fully “switched on” for every interview (encouraging participants, following up on questions).

Also, be sure to allow a small break between interviews. The more time you can give yourself the better. You will need enough time to get up, stretch your legs, take a bathroom break, and grab a beverage. Allow yourself time to eat, too, because if you cram all of your interviews back to back, your energy is sure to run out. We hope it goes without saying, but never eat your lunch while interviewing a participant; that is rude and distracting.

If you have conducted several interviews on the same topic before, it is easy to assume that you have heard it all. What new information could the sixth participant provide? If you think you already know the answers to the questions, you can find yourself hearing what you want to hear or expect to hear, thereby missing new information. Every participant has something unique to provide—although it may not be significant enough to warrant additional interviews. If you have conducted several interviews and feel confident that you have gained the required information, do not recruit additional participants. However, once the participant is in the door, you owe him or her the same attention that you gave the very first participant. Keep an open mind and you will be surprised at what you can learn.

Asking the Tough Questions

Sometimes, you may need to ask questions that are embarrassing or cover sensitive topics. To help determine whether your questions are sensitive or embarrassing, run them by someone not connected with your project. As we mentioned earlier, sensitive topics may be better handled via surveys, but if you think there is a need to ask a difficult question in an interview, wait until you have developed a rapport with the participant. When you ask the question, explain why you need the information. This lets the participant know that you are asking for a legitimate reason and not just out of curiosity, which makes the participant more likely to answer and relieves any tension. For example,

Say why you want the information and how you’ll use it: “This next question is about the range your salary falls in. I'm asking this only because we believe that people who fall within specific salary ranges are sometimes more or less likely to book their travel using an app. To help us understand this, we are asking everyone about their salary range, if they feel comfortable. Would you mind stating which of the following ranges your salary falls in?”


Using Examples

No matter how hard you try to make your questions clear, a participant may still have difficulty understanding exactly what you are asking. Sometimes, rewording the question is not sufficient and an example is necessary for clarification. Since the example could introduce bias, you want to do this as a last resort. Having some canned examples for each question and then asking colleagues to check those examples for bias will help immensely.

Give the interviewee a moment to think about the question and attempt to answer it. If it is clear that the participant does not understand the question or asks for an example, provide one of the canned examples. If the participant still does not understand, you could either provide a second example or move to the next question.

Wrong: “What are some of the discount airlines, such as Jet Blue, that you prefer to travel on?”

Right: “Have you traveled on a discount (lower-fare) airline? < User does not understand what you mean by discount airlines; read a complete list of discount airlines. > If you have, which ones do you prefer to travel on?”

Watch for Generalities

Interviewees will often speak in general terms because they believe it is more useful to provide summary descriptions or typical situations rather than specific examples. This is usually the result of a generalized question (see below):

Interviewer: “Tell me what happens when you contact the agent on call.”

Participant: “When you contact the agent on call, you wait and wait and wait. They don’t pick up.”

If you are looking for specific, detailed answers, do not ask generalized questions. Ask for significant events. Since the best indicator of the present is the past, ask the interviewee to describe a particular past event that best exemplifies the situation. Keep in mind our earlier discussion about memory limitations and telescoping (see page 231). Below is a sample interview asking for a significant event:

Generalized Question

Interviewer: “What has been your experience booking travel online?”

Participant: “Oh, I always have a terrible time. Either I can’t remember my password or the session times out before I can make my decision so I have to start all over. It’s always something.”

Follow-up for a significant event

Interviewer: “Can you tell me what was particularly frustrating about the last time you tried to book travel online?”

Participant: “Well, the last time wasn’t so bad. I know my session always ends up timing out before I can finish so this time I was prepared. I did research on the flight and hotel in advance so that I could enter in all the information quickly. I logged in, selected the airline I wanted, entered in the flight dates and times, and then chose the flight that I knew I wanted. That went very quickly. I couldn’t book the hotel though. I knew which hotel I wanted but the dates I needed were not showing up as available. That annoyed me since I had already called the hotel to ask for availability. I ended up having to leave the site and just book the hotel over the phone. I didn’t get any kind of discount, so that sucked.”

Do Not Force Choices

Do not force opinions or choices from participants. If you ask an interviewee to make a choice from a list of options but he or she says that it does not matter or all of them are fine, take this as an opportunity to learn more about what the participant thinks about each option. By asking the participant to verbalize (and therefore think more about) each option, he or she may then show more of a preference for one option over others or at least help you understand why he or she feels the way he or she does. If the participant states that all options are equal, do not force him or her to make a choice. Likewise, if the participant states that he or she does not have an opinion on something, forcing him or her to elaborate will only generate annoyance (see example below).

Interviewer: “Which of the following customer rewards would you most like to receive when booking a certain number of trips with the TravelMyWay app?”

 10% discount on your next plane ticket

 Free upgrade on a car rental or plane ticket

 Free night at the hotel of your choice

 3% cash rebate

Participant may not have understood that a single response was desired:
Participant: “All of those sound good to me!”

Interviewer restates question:
Interviewer: “Do you have a preference for one over the other?”

Participant: “No. They are all about equal.”

Interviewer tries to determine if participant has a preference:
Interviewer: “Can you tell me the pros and cons you can see with each option?”


Watch for Markers

Sometimes, participants throw out markers. These are key events to the participant that you can probe into for richer information. You should search for more details only if you believe it will provide relevant detail to your study—and not out of curiosity. Below is an interview excerpt with a marker and appropriate follow-up:

Interviewer: “Can you tell me about a difficult time you had using the TravelMyWay app?”

Marker:
Participant: “Well, it was right after my aunt passed away. I needed to get a plane ticket back home quickly, but I couldn't get any on your site.”

Interviewer detects marker and seeks relevant information:
Interviewer: “You mentioned your aunt had just passed away. What made it difficult to get a plane ticket at that time?”


Participant: “I just knew I needed to get home quickly. Unfortunately, the seats would have cost a fortune since it was a last-minute travel. I heard that airlines offered bereavement discounts but I couldn’t figure out how to do that on your site. I did a search on your site but I couldn’t find anything. I was hoping that you had an on-call agent like WillCall.com, but you didn’t.”

Interviewer: “What happened next?”

Participant: “I was so stressed out that I ended up leaving your site and going to WillCall.com instead. They were able to get me the discount pretty quickly.”

The participant provided the marker of her aunt passing away. That was critical to her. She was stressed out and could not find the information she needed. She wanted someone to personally help her and provide some support, but the app could not do it. Consequently, she now has a strong negative memory of the TravelMyWay app and a positive one of the competitor. Following up on that marker allows us to better understand the context of what happened and why the experience was so difficult for the participant. If the participant drops such a marker inadvertently and does not feel comfortable elaborating on it, he or she will let you know the topic is off-limits.

Select the Right Types of Probe

Probes are questions used to get interviewees to clarify or elaborate on responses. Your probes for detail should be as unbiased as your initial question to the participant. There are closed-ended and open-ended probes, just like the initial question you asked. A closed-ended probe would be something like “Were you using Chrome or Safari?” An open-ended probe might be “Tell me about the browser(s) you use.” Keep all probes neutral and do not ask the participant to defend his or her choices.

Wrong: “Why did you do that?”

Right: “Can you tell me more about your decision?”

Table 9.5 provides an excellent comparison of different types of biased and unbiased probes and what makes each probe biased.

Table 9.5

Biased and unbiased probes

Ask: “Can you tell me what you are thinking right now?” / “What are you trying to do?”
Instead of: “Are you thinking _____?” / “Are you trying to _____?”
Why: Even though you may think you know what they are thinking, your job is to get them to say it. Also, you do not want to put words into their mouths, because you may be wrong.

Ask: “What are you thinking?” / “Can you explain to me what you are trying to do?”
Instead of: “Why are you _____?” / “Are you trying to _____ because _____?”
Why: By asking participants why they are doing something, they may feel that you are asking them to justify their actions and, therefore, think that they are going about the task incorrectly.

Ask: “Can you explain to me your train of thought right now?” / (After the task is ended) “Why did you try to _____?”
Instead of: “Are you trying to _____?”
Why: It is, however, acceptable to ask participants why they went about a task in a certain way after the task has been ended or at the end of the test if future tasks have components similar to the task you are questioning them about.

Ask: “Did you find the product easy or difficult to use?” / “Were the instructions clear or confusing?” / “Were error messages helpful or hindering?”
Instead of: “Did you find the product easy to use?” / “Did you find the product difficult to use?” / “Were the error messages helpful?”
Why: Trying to get someone to express an opinion on a specific usability attribute is not always easy. Therefore, you may find that you need to guide participants by specifying the attribute you want them to react to. It is important to use both ends of the spectrum when you do this so that they do not perceive you as encouraging either a positive or negative answer. Also, by doing so, you will encourage a more informative response. Instead of responding “No (it was not easy),” they are more likely to say “I found it very difficult to use,” or “It was pretty easy.” You then can follow up by asking them “Why?”

Ask: “What are you feeling?” / “How did you feel when you were doing _____?”
Instead of: “Are you feeling confused?” / “Are you feeling frustrated?”
Why: Sometimes, participants need to stop and think. Though they may appear confused or frustrated, they may just be contemplating. Do not risk inserting your guess about how they are feeling.

Ask: “Would you change anything about this (product, screen, design, etc.)?”
Instead of: “Do you think _____ would improve the product?”
Why: Unless the design team is considering a particular design change, you should never suggest what changes participants should talk about.

Ask: “Are there any changes you would make to _____ to make it easier to use?”
Instead of: “If we changed _____ to _____, do you think that it would be easier to use?”
Why: If there is a design change that the design team wants to explore specifically, ask the participants to react to it after they have made their initial suggestions.

(adapted from Dumas & Redish, 1999)

Some interviewers use the strategy of “playing dumb” to get more detail out of participants. By downplaying what you know, participants will be more explicit and may want to impress you with their knowledge. This may work in some cases, but if you slip and reveal in a question or probe that you know more than you are letting on, the participant can feel betrayed, duped, or patronized. This will clearly harm the interviewing relationship. As a result, we recommend being honest about what you know and understand while making it clear that your knowledge is limited and that the participant is there to increase your knowledge.

In that sense, adopting a beginner’s mind and asking naive questions will help you be more open to what your participants have to say and instill confidence in them that they are the experts, not you.

Watch Your Body Language

Your tone and body language can affect the way a participant perceives your questions. Be alert to your biases. Is there an answer to your question that you would like the participant to provide? Is there an answer you expect? Your expectations and preferences can be conveyed in your tone, body language, and the way you phrase questions, probes, and summaries. For example, looking bored or not making eye contact when you disagree or sitting on the edge of your seat and nodding vigorously when the participant is confirming your suspicions will clearly reveal your biases. Your biases are even conveyed in the responses that you do not follow up on. Watching yourself on video can help you identify those biases. If you are alert to your biases, you can better control them. You can also use your body language to “hack your body for better interviews” (see http://www.gv.com/lib/how-to-hack-your-body-language-for-better-interviews for an overview).

Know When to Move On

Knowing when to let go is as important as knowing when to follow up. A participant may not be as forthcoming as you would like, or the person may simply be lying. As rare as that is, you should know how and when to move on. Remember: this is an interview, not an interrogation. Even if you suspect that the participant is not being completely honest, continued badgering is as rude as calling the participant a liar. Once it is clear that the participant cannot provide the details you are looking for, drop the line of questioning and move on. If necessary, you can throw out that participant’s data later. For ethical reasons, you must remember to treat the participant with respect, and part of that is knowing when to let a topic of discussion drop.

Reflecting

To verify that you understand what the participant has told you, it is essential to summarize, reword, or reflect the participant’s responses. You are not probing for more details but confirming the detail you already have. It is not necessary to do this after every response, especially if the response is brief and straightforward as in structured interviews. However, if the participant’s response has been lengthy, detailed, or not completely clear, you should summarize and restate what the participant has said to check for accuracy. A reflection of the earlier interview excerpt (see page 245) is provided below:

I just want to make sure that I have captured your experience correctly. You needed to purchase tickets for immediate travel, and you were looking for a bereavement discount. You couldn’t find information on bereavement discounts or an agent to assist you using the TravelMyWay app, so you went to WillCall.com because you knew they had an agent on call. They were then able to get you the tickets at a discount. Does that summarize your experience correctly?

Reflections help build rapport by demonstrating that you were listening and understood the participant’s comments. They can also be used to search for more information: when responding to your summary, the participant may clarify any incorrect information or provide additional detail.

At no time should you insert analysis into your summary or in response to a participant’s statement. In other words, do not try to offer a solution to a problem the participant has had, an explanation for why the product behaved as it did, or a theory about why you think the participant made a certain choice. In the example above, you would not want to inform the participant where she could have found the bereavement discount on your site. You are not a counselor, and you should not be defending your product. You are there to collect information from the interviewee—nothing more. Ask for observations, not predictions or hypotheses.

Empathy and Antagonism

When you are speaking with someone—even if it is someone you barely know—doesn’t it make you feel better to know that the other person understands how you feel? A skilled interviewer is able to empathize with the participant without introducing bias. Keep in mind that this is not a conversation in the traditional sense; you are not there to contribute your own thoughts and feelings to the discussion. In the earlier example, the interviewer could have shown empathy by stating: “That [bereavement] must have been a difficult time for you.” An inappropriate response would have been: “I know exactly how you feel. When my grandmother passed away, I had to pay an arm and a leg for my plane ticket.” The interview is not about you. You do not have to be a robot, devoid of emotion, in order to prevent bias, but also know what is appropriate and what is inappropriate. Make eye contact and use your body language to show the participant that you are engaged, that you understand what he or she is saying, and that you accept the participant regardless of what he or she has said.

Tip

Do not correct participants. Rather, seek to understand why they believe what they believe. Refocus any urge you may have to correct a participant into an opportunity to understand why he or she perceives things differently from you.

Transitions

Your questions or topics for discussion should transition smoothly from one topic to another. This will allow participants to continue on a track of thought, and the conversation will appear more natural. If you must change to a new topic of discussion and there is not a smooth transition, you can state: “That’s excellent information you’ve given me. While I make a note of it, can you think about how you would < introduce different topic > ?” This lets the participant know that he or she should not be looking for a connection or follow-up from the last question. If the participant believes that you are trying to probe more deeply into the previous topic, he or she may get confused or misinterpret your next question. A simple transition statement gives closure to the last topic and sets the stage for the next one.

Avoid negative connectors like “but” and “however.” These might signal to a participant that he or she has spoken too long or has said something incorrect. The person is likely to be more cautious when answering the following questions.

Monitoring the Relationship with the Interviewee

Like all user research activities, interviews are a giving and taking of information. Since it is one-on-one, the relationship is more personal. To get the most out of participants, it is important to monitor the relationship and treat the participant ethically. You want to make sure that the participant is comfortable, engaged, and trusting. If the participant does not feel you are being honest or is wondering what the motivation is behind your questions, he or she will be guarded and will not provide the full details you are looking for.

Watch the Participant’s Body Language

Does the participant seem tense, nervous, bored, or angry? Is he or she looking at the clock or is his or her attention lapsing? Do you feel tense? If so, the participant likely feels tense too. You may have jumped into the detailed, difficult, or sensitive questions before you established a good rapport. If possible, go back to easier questions, establish the purpose and motivations of the study, and be sure that the participant is a willing partner in the activity. If a particular line of questioning is the problem, it is best to abandon those questions and move on. A break may help. Sometimes, a participant is just having a bad day and nothing you say or do will help. At that point, ending the interview can be the kindest act possible for the both of you.

Suggested Resources for Additional Reading

Although you should pay attention to a person’s body language to determine whether he or she is tired, uncomfortable, or annoyed, we do not advocate ascribing meaning to it. Body language is ambiguous; a person may be staring at the floor because they are shy, rather than bored. It is more important to note changes in behaviors over time, rather than a singular action/behavior. To learn more about what people’s gestures and body language might mean, refer to the following:

 Ekman, P. (2007). Emotions revealed, second edition: Recognizing faces and feelings to improve communication and emotional life. Holt Paperbacks.

 Pease, B., & Pease, A. (2006). The definitive book of body language. Bantam.

Fighting for Control

If you find yourself competing with the participant for control of the interview, ask yourself why. Is the participant refusing to answer the questions you are asking or is he or she interrupting before you can complete your questions? Just as the interview is not about your thoughts or opinions, it is not up to the participant to ask the questions or drive the interview. At some point, the participant misunderstood the guidelines of the relationship. Begin with polite attempts to regain control, such as the following:

Because we have a limited amount of time and there are several topics that I would like to cover with you, I am going to need to limit the amount of time we can discuss each topic.

If the participant refuses to be a cooperative partner in the interviewing relationship and you do not feel you are obtaining useful information, simply let the participant go on and then write off the data. In extreme cases, it is best for all parties to end the interview early. If you have recorded the session, watch or listen to it with a colleague to see whether you can identify where the interviewing relationship went awry and how you can avoid it in the future. Think of this as a learning opportunity where you can practice and improve your conversational control skills.

Hold Your Opinions

Even though the interview is not about you, if the participant directly asks your opinion or asks you a question, you do not want to seem evasive because it could harm the rapport you have established. If you believe your response could bias the participant’s future responses, your reply should be straightforward:

Actually, I don’t want to bias your responses so I can’t discuss that right now. I really want to hear your honest thoughts. I would be happy to talk about my experiences after the interview.

If you are sure that the question and your response will not have an effect on the remainder of the interview, you can answer the question, but keep it brief.

Tip

If you look at a transcript of a bad interview, one of the first clues you might see is a one-to-one ratio of participant comments to interviewer comments/questions, especially in an unstructured interview. However, an abundance of participant comments does not necessarily indicate a successful interview. It is your job to keep the participant focused and not allow him or her to delve deeply into irrelevant topics or go beyond the scope of the study.

Dos and Don’ts

We have provided many recommendations about how to conduct a successful interview. It takes many sessions in many different scenarios to become a good interviewer, so do not get discouraged. For easy referral, some of the key tips are summarized in Figure 9.4.

Figure 9.4 Dos and don’ts when conducting an interview.

Data Analysis and Interpretation

Depending on the purpose of and outlet for your interviews, either you can wait until you have conducted all the interviews before analyzing the data, or you can do preliminary analysis following each interview. For UX professionals, we recommend the latter because it can give you insights for future interviews. You may want to delve into more detail on questions or remove questions that are not providing value. And as any UX professional can attest, stakeholders often ask for results before a study is complete. It can help if you have something more substantial to give them than just a few interesting quotes that stand out in your mind. On the other hand, for some academic studies, conducting all the interviews prior to data analysis may be more appropriate because it increases standardization across interviews, which is often necessary for scientific publication.

As with any activity, the longer you wait to get to the analysis, the less you will remember about the session. The notes will be more difficult to interpret, and you will have to rely heavily on the recordings. The more you have to rely on the recordings, the more time it will take you to analyze the data. Either way, hold a debrief session as soon as possible with your notetaker and any other observers to discuss what you learned. Review the recording to fill in any gaps or expand on ideas if necessary, and add any additional notes or quotes. If the session is still fresh in your mind, it will not take as long to review the recording.

Transcription

For some interview studies, you may want to have the audio recordings transcribed into text (e.g., if it is critical to have a record of the exact words your participant uses or if you want to publish your findings in some scientific venues). Transcription is typically done in one of three ways: verbatim, edited, or summarized. Verbatim transcripts capture exactly what was said by both the interviewer and the respondent, including “ums,” “ahs,” and misstatements. For some types of analysis (e.g., linguistic analysis), these word crutches are important. When such detail is not necessary, you may choose an edited or summarized transcript. Edited transcripts typically do not include word crutches or misstatements. Summarized transcripts typically contain an edited and condensed version of the questions asked or topics raised by the interviewer, since these are known in advance, along with the respondent’s comments.

Depending on whether you conducted a structured, semi-structured, or unstructured interview, you will have different types of data to analyze.

Structured Data

If you are conducting structured or semi-structured interviews, begin by tallying the responses to closed-ended questions. For example, how many people so far have selected each option in a multiple-choice question or what is the average rating given in a Likert scale question? Structured data from an interview are equivalent to survey data, with the exception that you may have to refer to recordings rather than data from paper or online forms. Chapter 10 has a detailed discussion of closed-ended question data analysis (see “Data Analysis and Interpretation” section, page 290).
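If your closed-ended responses end up in a spreadsheet or script rather than a survey tool, a few lines of code are enough for this tallying step. The Python sketch below is a minimal illustration using invented responses and option labels (they are not from the TravelMyWay study); it counts how often each multiple-choice option was selected and averages a set of Likert ratings.

from collections import Counter
from statistics import mean

# Hypothetical responses to a closed-ended multiple-choice question
# ("Which customer reward would you most like to receive?")
reward_choices = [
    "10% discount", "Free upgrade", "Free hotel night",
    "10% discount", "Free upgrade", "10% discount",
]

# Hypothetical 1-5 ratings for a Likert-scale question ("The app was easy to use")
ease_ratings = [4, 5, 3, 4, 2, 5]

# Tally how many participants selected each option
tally = Counter(reward_choices)
for option, count in tally.most_common():
    print(f"{option}: {count} participant(s)")

# Average rating for the Likert-scale question
print(f"Mean ease-of-use rating: {mean(ease_ratings):.1f} out of 5")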

Unstructured Data

Unstructured data are more time-consuming to analyze; in our experience, so much so that they often go unanalyzed. We recommend three strategies for analyzing unstructured data: categorizing and counting, affinity diagramming, and qualitative content/thematic analysis.

Categorizing and Counting

If you are conducting unstructured interviews, you can begin by identifying potential categories in the text as a whole. What is the range of responses you are getting? What is the most frequent response? Once you have identified categories, you can count the number of instances each category is represented in the data and organize this by participant or as represented in the interviews overall. After you have tallied the most frequent responses, select some illustrative quotes to represent each category of response.
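The sketch below shows one lightweight way to do this bookkeeping in Python once you have coded each response by hand. The participant IDs, category names, and coded responses are hypothetical and stand in for your own coding scheme.

from collections import Counter, defaultdict

# Hypothetical coded responses: (participant ID, category assigned by the researcher)
coded_responses = [
    ("P1", "search problems"), ("P1", "wants on-call agent"),
    ("P2", "search problems"), ("P2", "price complaints"),
    ("P3", "search problems"), ("P3", "wants on-call agent"),
]

# How often each category appears across all interviews
overall = Counter(category for _, category in coded_responses)
print("Overall:", overall.most_common())

# How often each category appears per participant
per_participant = defaultdict(Counter)
for participant, category in coded_responses:
    per_participant[participant][category] += 1
for participant, counts in sorted(per_participant.items()):
    print(participant, dict(counts))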

Affinity Diagram

An affinity diagram is a method for analyzing interview data quickly. Similar findings or concepts are grouped together to identify themes or trends in the data. A detailed discussion of what an affinity diagram is, how to create one, and how to analyze the data from one is provided in Chapter 12 on page 340.

Qualitative Content/Thematic Analysis

Content analysis is a method of sorting, synthesizing, and organizing unstructured textual data from an interview or other source (e.g., survey, screen scrape of online forum). Content analysis may be done manually (by hand or software-assisted) or using software alone. When done manually (by hand or software-assisted), researchers must read and reread interviewee responses, generate a scheme to categorize the answers, develop rules for assigning responses to the categories, and ensure that responses are reliably assigned to categories. Because these rules are imperfect and interpretation of interviewee responses is difficult, researchers may categorize answers differently. Therefore, to ensure that answers are reliably assigned to categories, two or more researchers must code each interview and a predetermined rate of interrater reliability (e.g., Cohen’s kappa) must be achieved (see How to Calculate Interrater Reliability [Kappa] for details).

Several tools are available for purchase to help you analyze qualitative data. They range in purpose from quantitative, focusing on counting the frequency of specific words or content, to helping you look for patterns or trends in your data. If you would like to learn more about the available tools, what they do, and where to find them, refer to Chapter 8, “Qualitative Analysis Tools” section on page 208.

How to Calculate Interrater Reliability (Kappa)

To determine reliability, you need a measure of interrater reliability (IRR) or interrater agreement. Interrater reliability is the degree to which two or more observers assign the same rating, label, or category to an observation, behavior, or segment of text. In this case, we are interested in the amount of agreement or reliability between volunteers coding the same data points. High reliability indicates that individual biases were minimized and that another researcher using the same categories would likely come to the same conclusion that a segment of text would fit within the specified category.

The simplest form of interrater agreement or reliability is agreements / (agreements + disagreements). However, this measure of simple reliability does not take into account agreements due to chance. For example, if we had four categories, we would expect volunteers to agree 25% of the time simply due to chance. To account for agreements due to chance in our reliability analysis, we must use a variation. One common variation is Krippendorff’s Alpha (KALPHA), which does not suffer from some of the limitations of other methods (i.e., sample size, more than two coders, missing data). De Swert (2012) created a step-by-step manual for how to calculate KALPHA in SPSS. Another very common alternative is Cohen’s kappa, which should be used when you are analyzing nominal data and have two coders/volunteers.

To measure Cohen’s kappa, follow these steps:

Step 1. Organize data into a contingency table. Agreements between coders will increment the diagonal cells and disagreements will increment other cells. For example, since coder 1 and coder 2 agree that observation 1 goes in category A, tally one agreement in the cell A, A. The disagreement in observation 3, on the other hand, goes in cell A, B.

Raw data: [a table listing, for each observation, the category assigned by coder 1 and the category assigned by coder 2]

Step 2. Compute the row totals, column totals, and overall total for the table. The sum of the row totals, the sum of the column totals, and the overall total should all be the same number (here, 30).

                    Coder 2
                    A     B     C     Row total
Coder 1     A       6     2     1     9
            B       1     9     1     11
            C       1     1     8     10
Column total        8     12    10    30

Step 3. Compute the total number of actual agreements by summing the diagonal cells:

agreement = 6 + 9 + 8 = 23

Step 4. Compute the expected agreement for each category. For example, to compute the expected agreement for “A,” “A”:

expected agreement = (row total × column total) / overall total = (9 × 8) / 30 = 2.4

Step 5. Sum the expected agreements due to chance:

expected agreement = 2.4 + 4.4 + 3.3 = 10.1

Step 6. Compute Cohen’s kappa:

Kappa = (agreement - expected agreement) / (total - expected agreement) = (23 - 10.1) / (30 - 10.1) ≈ 0.65

Step 7. Compare Cohen’s kappa to benchmarks from Landis and Koch (1977):

Less than 0.00: Poor
0.00-0.20: Slight
0.21-0.40: Fair
0.41-0.60: Moderate
0.61-0.80: Substantial
0.81-1.00: Almost perfect

In our example, Cohen’s kappa of approximately 0.65 falls within the “substantial” benchmark category. Generally, Cohen’s kappas of greater than 0.7 are considered acceptable. (A short script for double-checking this calculation is sketched after step 8.)

Step 8. Resolve disagreements among coders by having coders discuss the reasons they disagreed and come to an agreement.
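If you would rather not do this arithmetic by hand, the short Python sketch below implements the same steps (observed agreement, chance-expected agreement, and kappa) and reproduces the worked example above. It is a minimal illustration for two coders and nominal categories; the label lists are placeholders standing in for your coders’ real assignments.

from collections import Counter

def cohens_kappa(coder1, coder2):
    """Compute Cohen's kappa for two coders' nominal category assignments."""
    assert len(coder1) == len(coder2), "Both coders must rate the same observations"
    n = len(coder1)

    # Steps 1-3: observed agreement (the diagonal of the contingency table)
    observed = sum(a == b for a, b in zip(coder1, coder2))

    # Steps 4-5: expected agreement due to chance, from row and column totals
    rows = Counter(coder1)   # coder 1's totals per category
    cols = Counter(coder2)   # coder 2's totals per category
    expected = sum(rows[c] * cols[c] / n for c in set(rows) | set(cols))

    # Step 6: kappa = (agreement - expected agreement) / (total - expected agreement)
    return (observed - expected) / (n - expected)

# Hypothetical labels that reproduce the worked example's contingency table
coder1 = ["A"] * 9 + ["B"] * 11 + ["C"] * 10
coder2 = (["A"] * 6 + ["B"] * 2 + ["C"] * 1 +   # observations coder 1 labeled A
          ["A"] * 1 + ["B"] * 9 + ["C"] * 1 +   # observations coder 1 labeled B
          ["A"] * 1 + ["B"] * 1 + ["C"] * 8)    # observations coder 1 labeled C
print(round(cohens_kappa(coder1, coder2), 2))   # ~0.65, "substantial" agreement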

Communicating the Findings

In this section, we discuss some of the ways in which interview data can be effectively communicated. There are a few different ways that you can present the data. It all depends on the goals of your study and the method you feel best represents your data. In the end, a good report illuminates all the relevant data, provides a coherent story, and tells the stakeholders what to do next (see Table 9.6 for sample recommendations based on interview data).

Table 9.6

Sample table of recommendations

Issue: The most frequently stated difficulty participants had with our site was finding information using our search facility.
Recommendations:
- Search team should work with the UI group to identify methods for improving the search engine and display of search results.
- Conduct a usability evaluation of the site to improve overall usability.

Issue: Nine out of ten participants asked for an on-call agent to help when they had a difficulty.
Recommendations:
- Provide a FAQ based on the results of the interviews and questions we have received via e-mail to date.
- Investigate the feasibility of providing on-call agents.

Issue: When asked for additional services we can provide, half the users suggested being able to arrange transportation to and from the airport (taxi, hotel shuttle, etc.).
Recommendations:
- Investigate the feasibility of purchasing shuttle tickets from our site.
- Ask associated hotels that provide shuttles to and from airports whether we can arrange this service online.

Over Time

Your interview may cover a period of time, such as asking a travel agent to describe his or her day from the first cup of coffee in the morning through turning off the computer monitor at the end of the day. Or you may ask someone a series of questions that cover the first six months of training on the job.

If your questions cover a period of time, it makes sense to analyze the data along a timeline. Use the results from your initial categorization to start filling in the timeline. In our travel agent example, what is the first thing that the majority of travel agents do in the morning? What are the activities that fill their days? Then, use individual anecdotes or details to fill in gaps and provide interesting information that does not fit neatly into a category. It can be those additional details that provide the most value.

By Topic

Your questions may not follow a timeline but simply address different topics related to a domain or user type. In these types of interviews, you may wish to analyze and present the data by question. Provide the range of answers for each question and the average response. Alternatively, you can group the data from multiple questions into larger categories and then discuss the results for each category. An affinity diagram is helpful in these types of interviews to identify higher-level categories, if they exist (see “Affinity Diagram” section, page 253).

By Participant

If each participant and his or her results are widely different, it can be difficult to categorize the data. This may be the result of difficulties with your user profile and recruiting, or it may be intentional in the case where you want to examine a variety of user types. It may make more sense to summarize the results per participant to illustrate those differences. Similarly, you may choose to analyze the data per company (customer), industry, or some other category membership.

Pulling It All Together

We began by discussing the best uses for interviews and the different types of interviews (structured, unstructured, and semi-structured). Proper wording of questions in interviews was then discussed in detail. If the questions in an interview are constructed poorly, the results of your study will be biased or will not provide the information you need. We also discussed how to conduct an interview, including the five phases of an interview, your role as an interviewer, and recording the data from the interviews. Finally, methods for analyzing the data and presenting your results were discussed.

The results from your interviews can be incorporated into documentation such as the Detailed Design Document. Ideally, additional user research techniques should be used along the way to capture new requirements and verify your current requirements. To summarize, below we list a few tips for conducting your next interview.

Interview Tips

 Begin with general questions and follow up with specific questions. Asking specific questions first may lead the participant into talking about only what he or she thinks you want to hear about.

 Unless using a structured interview, do not feel you need to ask every question on your list. Let the participant talk, and then, follow up with those that are most interesting for the purposes of your study.

 Videotape everything, if possible. Audiotapes are insufficient in situations where participants constantly refer to things with vague references. Be sure to get permission from the participant before any recording (audio or video).

 Involve your product team early and often. Encourage team members to accompany you and debrief after each visit with those who do. This will increase their buy-in to the need for research. Also, understanding what they are taking away from the interview will help you understand their priorities better.

 Understand your own notetaking skills when deciding whether to type or write your notes. Taking notes on a laptop may save a step, but slow or inaccurate typing drags down the interview. Additionally, the second step of typing up handwritten notes is a great double-check.

 At the beginning of each recording, say the date, time, and participant number and remind the participant that you are recording the interview. For example, say, “Today is March 14th, 2015, and this is participant number 9. The red light is now on and the voice recorder is now recording.” Giving the date and participant number will help you keep track of your data and participants and help anyone who transcribes your audio data for you as well. Showing the participant a visual indicator of recording and reminding them that you are recording will provide clear information to your participant about when you are and are not recording the interview.

 Check and recheck your equipment. For voice recorders, is the light on, indicating that it is recording? Replay sections of the recordings during breaks. If you find that audio or video was not recorded, the conversation should be fresh enough in your mind that you can jot down a few notes to fill in the blanks.

Suggested Resources for Additional Reading

The books below are detailed resources for anyone conducting interviews. The authors provide many sample interviews and indicate good and bad points in the interviewers’ techniques. We recommend them to anyone wanting a stand-alone resource on interviewing.

 Wilson, C. (2013). Interview techniques for UX practitioners: A user-centered design method. Morgan Kaufmann.

 Portigal, S. (2013). Interviewing users: How to uncover compelling insights. Rosenfeld Media.

Case Study: Connecting Families: Importance of Including Children as Stakeholders

Lana Yarosh, Assistant Professor, University of Minnesota

Children and parents in modern families may spend significant periods of time living apart. According to the US Census, 30% of the children in the United States do not live in the same household with both of their parents, primarily due to divorce or separation of parents (Census, 2008). In another 15% of families, a child may spend temporary periods of a month or more living away from his or her parent (Census, 2006). Reasons for temporary separation include military deployment, travel for work, incarceration, and visitation travel after divorce. Continued meaningful contact with parents correlates strongly with positive outcomes for the child on measures of emotional, social, academic, and even physical well-being (Amato, 2000). Technology may be able to help parents achieve this meaningful contact even if they live apart from the child (Shefts, 2002). As a researcher in the domain of human-centered computing, I was interested in investigating the opportunities for novel communication technology to help separated families stay connected. To design for this context, I first sought to understand the needs of separated families. In this case study, I describe two interview studies with families, providing concrete examples of why it is important to include children in the interview process and providing strategies for interviewing young participants.

I conducted two in-depth semi-structured interview studies to understand the needs of separated families. In the first investigation, I interviewed five children (aged 7-14) and five residential and five nonresidential parents from divorced families (Yarosh, Chew, & Abowd, 2009). In the second study, I interviewed 14 pairs of parents and children (aged 7-13) from families separated by military deployment, business travel, or academic travel (Yarosh & Abowd, 2011). In this case study, I limit myself to providing evidence for why it is important to include children as stakeholders and specific advice for doing so successfully.

When seeking to understand families and children, it is critically important to include children in any user research activity. While it may be tempting to use parents as proxies for gauging the family’s needs, I have found that this is insufficient for understanding the complexities of the family dynamic. As I conducted interviews with parents and children, I was frequently surprised by their divergent motivations, goals, and descriptions of the same situation. Here, I highlight the importance of including children directly by providing three examples of nonconsensus from my interview studies, that is, instances where parents thought that their children thought one thing and the children reported something different. In these cases, using the parent as a proxy would have led to an incomplete understanding of the context:

1. In the study of divorced families, the parents’ estimates of the child’s awareness of conflict frequently diverged significantly from the child’s account. The majority of the parents thought that their child was probably not aware of the competition over their time and affection between the parents. However, when asked about what was most difficult about staying close to both parents, children said things like the following:
My mom has a way to make her voice sound like she doesn’t care, but at the same time, you know that it’s not true, and it really always hurts to hear that voice. And whenever I want to call my dad she always uses it saying, “Oh, so you’re calling him?”

Clearly, children’s perceptions of the situation were different than the adults’ models of their perceptions.

2. In work-separated families, parents and children use conflicting strategies to manage temporary separation, particularly disagreeing on what constitutes an appropriate amount of communication with the remote parent. The need to increase contact is a characteristic common to traveling parents. However, spending more time communicating with the remote parent interfered with the strategies used by children to cope with the separation, which frequently focused more on contact with collocated family members rather than on seeking remote contact. While this does not mean that designers should not build systems to encourage parent-child communication, it is important for designers to consider the obligation to communicate that their system may impose on the child and what may happen if the parent’s expectations for communication are not met. Again, excluding children from interviews would have led to an incomplete understanding of this context.

3. Lastly, in both studies, “the family” was too ambiguous to serve as a unit of analysis. When asked to list the people that they consider to be part of their family, many participants from the same household gave divergent responses. For example, younger children in divorced families typically included biological and stepparents; older children sometimes did not list stepparents; the parents themselves did not include the child’s other parent as part of their family. This “reflexive”—always referring back to the individual who is listing its members—definition of family is amplified in divorced families but is not unique to them. In intact families, in-laws are the classic example of persons that may be included in one member’s family but not in another’s. There are consequences for design when two individuals’ definitions of family do not overlap, such as privacy considerations. Without interviewing all the stakeholders, it can be easy to fail to consider these issues in design.

These three examples of nonconsensus in families show that interviewing all stakeholders, including children, is necessary to achieve a more nuanced understanding of the context.

While it is easy to agree that including children as stakeholders is important, actually doing so can be a challenge. Even school-age children are still developing the communication competencies to understand the finer points of language, such as abstraction or metaphor (Stafford, 2004). Many children may also feel uncomfortable speaking with an unfamiliar adult. However, I offer six specific strategies to help manage these challenges:

1. Working with children requires special considerations while preparing the protocol and assent documents. Children are not able to give informed consent, which will have to be obtained through their parents, but the assent procedure gives them the opportunity to understand their rights and what will happen in the interview. It is important to emphasize to the child that he or she can withdraw from the study, decline to answer any question, or take a break at any time (for more on informed consent and research ethics, refer to Chapter 3). In designing both the assent document and interview protocol, it is important to keep specific developmental milestones associated with age in mind. In my experience, while I have tried interviewing children as young as three, I have only started getting useful data with children around age six. However, you should take care to check the comprehension level of your protocol by piloting the protocol with friendly participants in the target age group and by using the Flesch-Kincaid Grade Level test (available online and built into many word processors, including Microsoft Word). While this test has its limitations and should not be substituted for piloting, it can serve as the first step to point out questions where you may want to simplify wording or take extra care to ensure that the child understands the question. For more on working with special populations like children, see Chapter 2.

2. To encourage children to be open and honest, the researcher should actively work to equalize power between the child and the researcher. Children spend their lives in situations where adults expect the “right” answer from them. To encourage the child to share honest opinions and stories, the researcher needs to break through this power differential. There are a number of details to consider here: choose an interview setting where the child has power (e.g., playroom), dress like an older sibling rather than as a teacher, encourage use of first names, and let the child play with any technology that will be used (e.g., audio recorder) before starting the interview. Above all, emphasize that you are asking these questions because you do not yet know the answers, that the child is “the best at being a kid,” and that there is no wrong way to answer any of the questions.

3. Parents make the decisions in a study that concerns their children, which introduces a unique constraint in situations of nonconsensus between the parent and child. As a researcher, you will have to respect the parents’ decisions: you may not be able to interview the child separately and you cannot promise the child that anything will be kept private from the parents. However, you can explain to the parents why it is important for the child to have a chance to state his or her perspective in private. In my experience, most parents are willing to provide that private space, especially when the study is being conducted in their home where they worry less about the child’s comfort. Nonconsensus may introduce other ethical and methodological factors to consider, and I address these issues in more depth in Yarosh (2014).

4. Children may struggle with abstraction, so ask for stories about specific situations. For example, instead of asking “How do you and your dad talk on the phone when he’s traveling?” ask “What did you tell your dad last time he called you?” It may take more questions to get at all the aspects you want to discuss, but it is much easier for children to discuss things they recently did rather than provide an overall reflection. This is most important with younger children but is a good place to start with any participant.

5. Additional effort may be necessary to engage a shy child, and one way to do so is to encourage the child to show-and-tell. For example, “Show me where you usually are when you think about your mom.” or “Show me some apps that you use with your dad on your phone.” Use the places and objects shared as stepping-stones to ask more nuanced questions about feelings, strategies, and preferences.

6. Lastly, incorporating drawing and design activities into the interview may help the child get into the “open-ended” nature of the study, be willing to be a little silly, and reveal what may be most important to him or her. For example, I asked children, “What might future kids have to help them stay in touch with their parents?” These drawings are not meant to produce actionable designs, but will reveal important issues through their presentation. Listen for key words (e.g., “secret” = importance of privacy), look for underlying concepts (e.g., “trampoline” or “swimming pool” = importance of physical activity), and attend to common themes such as who would be interacting with their future device, where, and how often. See Figure 9.5 for example drawings.

f09-05-9780128002322
Figure 9.5 Children’s drawings of magical objects to help them stay in touch with their nonresidential parents: (a) a magical door that lets the dad enter the child’s room to say good night; (b) a robot for carrying secret messages between a boy and his father; and (c) a system with speakers and a holographic projector that lets the parent and child speak whenever they want.

There are many different ways to successfully include children as participants; however, the strategies above have worked well for me in encouraging and empowering children. They may be a good starting point for researchers who are new to this space.

The main takeaway of this case study goes beyond work with separated families: when designing sociotechnical systems, make sure important stakeholders (such as children) are not left out of the process. It can be a challenge to interview children, but they frequently provide a different view than their parents. Speaking with the parent is not a replacement for speaking with the child. The two biggest tips for interviewing children to take away from this case study are (1) work to equalize power between children and adults and (2) keep questions as concrete as possible by asking the child to tell stories, show-and-tell, and draw. Keeping these strategies in mind will go a long way toward creating an atmosphere where children can make their point of view heard.

References

Alreck PL, Settle RB. The survey research handbook. 2nd ed. Burr Ridge, IL: Irwin Professional Publishing; 1995.

Amato PR. The consequences of divorce for adults and children. Journal of Marriage and the Family. 2000;62(4):1269–1287.

Boren MT, Ramey J. Thinking aloud: Reconciling theory and practice. IEEE Transactions on Professional Communication. 2000;43:261–278.

Census US. Children living apart from parents—Characteristics of children under 18 and designated parents. 2006.

Census US. Household relationship and living arrangements of children under 18 years, by age and sex. 2008.

De Swert K. Calculating inter-coder reliability in media content analysis using Krippendorff’s Alpha. 2012. Available online: http://www.polcomm.org/wp-content/uploads/ICR01022012.pdf.

Dumas JS, Redish JC. A practical guide to usability testing. 2nd ed. Exeter, England: Intellect Books; 1999.

Green J, Thorogood N. Qualitative methods for health research. 2nd ed. Thousand Oaks, CA: Sage; 2009.

Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods. 2006;18:59–82.

Johnson T, Hougland J, Clayton R. Obtaining reports of sensitive behavior: A comparison of substance use reports from telephone and face-to-face interviews. Social Science Quarterly. 1989;70(1):173–183.

Krosnick JA. Survey research. Annual Review of Psychology. 1999;50:537–567.

Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174.

Shefts KR. Virtual visitation: The next generation of options for parent-child communication. Family Law Quarterly. 2002;36(2):303–327.

Stafford M. Communication competencies and sociocultural priorities of middle childhood. In: Handbook of family communication. Mahwah, NJ: Lawrence Erlbaum Associates; 2004:311–332.

Yarosh S, Chew YC, Abowd GD. Supporting parent-child communication in divorced families. International Journal of Human-Computer Studies. 2009;67(2):192–203.

Yarosh S, Abowd GD. Mediated parent-child contact in work-separated families. In: Proceedings of CHI. ACM; 2011:1185–1194.

Yarosh S. Conflict in families as an ethical and methodological consideration. In: Judge TK, Neustaedter C, eds. Evaluating and designing for domestic life: research methods for human-computer interaction. Springer Publishers; 2014.

