Index
Note: Page numbers followed by f indicate figures, b indicate boxes, and t indicate tables.
A
acknowledgment tokens 241
activity materials, card sort
  primary data collected 310
  in building personas 40–44
American Marketing Association 140–141
The Animal Picture Board: Kenya 57–59
  and coffee evaluation scale 58, 58f
  financial institutions/mechanisms 58
  mobile transaction service 59
anonymity and confidentiality 73
  product team/observer issues 187–189
B
behavioral versus attitudinal 103
in interviewer body language 246–247
Bizarre Bazaar: South Africa 59–60
  “informance” or informative performance method 59
market research studies 361
of new recommendations 457
user research studies 361
budget/time constraints 99
buy-in for your activity 16–19
C
Cardiac Rhythm and Disease Management (CRDM) 443
card sort
  welcoming participants 314
  analysis with card sorting program 322
  analysis with spreadsheet package 325
  analysis with statistics package 324
  data that computer programs cannot handle 325–326
  duplicate card handling 325
  individual/simultaneous card sort 306
  information architecture 304
  objects and definitions identification/creation 307–309
  open versus closed sort 305
  physical cards/computerized sort 305
  additional data collected in card sort 310–311
  identifying objects and definitions sorting 307–309
  players involved in activity 311–313
  remote (online)/in person sort 305–306
  sample recruitment screener 132, 132f
  things to be aware of when conducting 304–306
click stream, recorded 29–30
closed-ended questions 222
furthest neighbor method 322
nearest neighbor method 322
cognitive walk-throughs (CWs) 435
  formative usability inspection 435
  user group and audience 435
communication paths, from user to product team 15
community bulletin boards, participant recruitment advertisements on 139–140, 139f, 479
competitive analysis 32–34
  competitor product evaluation 33
ensuring incorporation of findings 469–472
  being virtual team member 470
  ensuring product team documents findings 471
  obtaining status for each recommendation 470
merging usability and development priorities 453–455
condensed ethnographic interview 385t, 397
handing out questionnaires at 479
confidential disclosure agreements (CDA)
  participant refusal to sign 186–187
conflict of interest 75, 128
participant refusal to sign 186–187
master–apprentice relationship 393, 394
  number of participants 394
  relationships to avoid 394
  interviewer–interviewee 395
qualitatively better insights 62
reduction in the time needed to get insights 62
account manager involvement 144
extra caution in dealing with 179b
recruitment of right people 145–146
customer support, logs of
D
daily information needs (DIN) studies 211–213
  consistent ingredient of innovation 211
  data-driven innovation process 217
  technological limitations 215
datalogging software 92–93
data retention, documentation, and security 75–76
data, valid and reliable 74–75
“day-in-the-life” videos, creation 46
design concepts, focus groups 371b
  discussion guide development 372
  early research/brainstorming 375
  usability testing methods 372
  visual representation 373
diary studies
  monitor data collection 205
  data analysis and interpretation
  online diary study services/mobile apps 199–200
  identifying type of study to conduct 202
  recruiting participants 203
  random/experiential sampling 202
  structured and unstructured 194
double-barreled questions 228
dress code, for field studies 407
E
early adopters, feedback 35
educational materials 464, 469
e-mail addresses, purchasing 475
e-mail-based diary studies 196
ethical considerations 67–76
  acknowledging true capabilities 75
  anonymity and confidentiality 73
  appropriate incentives 73–74
  compensation for longitudinal or multipart studies 74
  legal department, advice from 74
  appropriate language 72–73
  creating comfortable experience 72
  data retention, documentation, and security 75–76
  de-identifying data 75–76
  beneficence and nonmaleficence 70–71
  informed consent form 66, 68f
  confidentiality of products 76–77
  needs, remote studies 77, 78
  policies versus laws versus ethics 66
  informed consent form 66, 68f
  Internal Revenue Service (IRS) 66
  nondisclosure/confidentiality agreements (NDAs/CDAs) 66, 67f
  protection of customer relationships 67
  right to be informed 70–71
  purpose of the activity 70
  sponsor-blind study 70–71
  valid and reliable data 74–75
  misusing/misrepresenting work 75
user experience research community 383
ideal user experience 432
experiential sampling methodology (ESM) 202
retrospective think-aloud 438
F
face-to-face/mediated interview 224f
field studies
  methods for presenting/organizing data 419, 420t
  beginning data collection 413
  recommendations for family/home visits 416b
  summary of recommendations 415
  data analysis and interpretation 415–418
  analyzing Contextual Inquiry/Design data 418
  analyzing Deep Hanging-Out data 417–418
  key points to keep in mind 415
  qualitative analysis tools 418
  materials and equipment checklist 407–410
  condensed ethnographic interview 385t, 397
  observing while not present 385t, 399
  observations versus inferences 413
  diverse range of users and sites 402
  setting company expectations up-front 402
  developing your protocol 405
  identifying type of study to conduct 400
  players involved in activity 400–404
  things to be aware of when conducting 381–382
  gaining stakeholder support 381
  tools for observation 413
  videographer/photographer 404
  where products could have been saved by 17t
  action if team changes product mid-test 187–188
field study investigators 365
focus groups
  communicating findings 368
  introducing activity and discussion rule 352
  welcoming participants 352
  data analysis and interpretation 363–368
  analyzing qualitative data 363
  removing difficult participant 370
  management and planning tools 363
  online community boards 347
  sufficient domain knowledge 347
  “day-in-the-life” video 355
  online/phone-based focus groups 356–357
  number and size of groups 346
  overbearing participants 353
  avoiding sensitive/personal topics 345
  identifying questions wishing to answer 342
  players involved in activity 346–349
  transcription of session 349
formative evaluations 432
formative versus summative 104
G
guiding principles/antiprinciples
  define it to measure it 45
H
discount usability engineering 434
high-fidelity software prototypes 446
holistic mobile insights 422b
  data collection and analysis 423
  early customer segments 427
  multiple feedback channels 427
  mobile usage environments 422
I
idealized interview flow 226t
for survey completion 268
information architecture 304
participant refusal to sign 186–187
intellectual property rights 32
recruitment to usability activities 145
international research 49
international user studies 48–49
  localization/globalization 48–49
  recruitment of participants 147–148
Internet Protocol (IP) address
Internet Service Provider (ISP) 29
inter-rater reliability (IRR) 207, 403
select right types of probe 246
watch for generalities 243
watch participant’s body language 249–250
interviewer prestige bias 229
table of recommendations 256t
monitoring relationship with interviewee 249–251
interviews
  data analysis and interpretation 252–254
  participants, determining number of 234
  choosing between telephone and in-person 223–224
  identifying objectives of study 221
  players involved in activity 234–235
  selecting type of interview 222–224
  things to be aware of when conducting 237–238
  transcripts, of bad interviews 252b
K
knowledge-based screening 130
L
language, appropriate 72–73
capture of information in 28
M
mailing addresses, purchasing 475
marketing department, questions to ask 31
marketing research vendors, renting rooms from 85
do not act as participant 168
keep participants motivated and encouraged 168
no one should dominate 167
practice makes perfect 169
N
networking, to learn about product 26–27
nondisclosure agreement (NDA) 132, 145
nondisclosure/confidentiality agreements (NDAs/CDAs) 66
notetaking for your activity 173f
O
objects and definitions identification/creation
inviting to user requirements activity 161–162
participants’ managers as 162
cognitive and physical capabilities 52
online diary study services 199–200
online/phone-based focus groups 356–357
open-ended questions (OEQs) 222
  things to keep in mind 276
multiple groups, object 311
P
link from recruitment advertisement to 137
maintenance requirements 481
more extensive and expensive 480
recruiting agency issues 143
developing questionnaire for potential participants 475–476
sample questionnaire 476f
Personal Analytics Companion (PACO) 212–213
personally identifying information (PII) 282
sample for travel agent 42
things to be aware of when creating 41
avoid screening via e-mail 129
eliminate competitors 131
prepare response for people who do not match profile 132
request demographic information 130–131
work with the product team 129
for recruiting agency use 142
picture-in-picture (PIP) display 92
pilot
  audio-visual equipment check 155
  finding bugs or glitches 155
  instructions and questions clarity check 155
quantitative analysis 106
for findings/recommendations presentation 460
preparing for user requirements activity 113–156
  deciding duration and timing of session 125–126
  individual activities 125
  piloting your activity. See pilot
  professional participants 151
presentation of findings/recommendations 455–463
  ingredients of successful presentation 457–461
  avoiding discussion of implementation or acceptance 463
  keeping presentation focused 461–462
  prioritizing and starting at top 462
  starting with good stuff 462
  presentation attendees 457
  why verbal presentation essential 456–457
  dealing with issues sooner rather than later 456
  dealing with recommendation modifications 457
  ensuring correct interpretation of findings and recommendations 457
merging usability and development priorities 453–455
customer support comments 27
early adopter feedback 35
with UCD processes incorporated 9–11
cushion deliverable dates 125
objectives, measures, and scope of study 117
separate, for each activity 117
Q
qualitative versus quantitative 104
quantitative/behavioral data 342
dos and don’ts in wording 233f
predicting the future 231
types of wording to avoid 232
asking sensitive questions 274
reducing initial question set 290
R
Rapid Iterative Testing and Evaluation (RITE) 438–439
recommendations report 468
recommendations, status of 470
recruitment of participants 126–150
  creating a recruitment advertisement 135–138
  cover key characteristics 136
  don’t stress the incentive 136
  include link to in-house participant database 137
  set up automatic response 137
  determining participant incentives 127–128
  own company employees 128
  of international participants 147–148
  providing contact information 146
  product team approval of participants 138
  avoiding professional participants 143
  and participant database 143
  provision of screener to 142
  recruiting technical users 141
  reminding participants 143
  supervisors and own employees 126
reflections, in interviews 248
executive summary section 466
comparison of methods 464
recommendations report 468
educational materials 469
card sorting initial grouping 486
card sorting read/rename 486
card sorting secondary grouping 486–487
travel card sort table, recommendations 487–488
product team’s perspective 12–15
research and participant roles 103, 103t
retrospective think-aloud 399
S
method to address situation 48
things to be aware of when creating 46
scheduling, holiday seasons 148
screen-capture software 91
semantic differential scale 57
semi-structured interview 253, 397
setting up research facilities 80–95
  building a permanent facility 85–95
  benefits of dedicated lab 85
  datalogging software 92–93
  screen-capture software 91
  space for refreshments 94
  surface to tape paper to 93–94
  television/computer monitor 93
  renting marketing or hotel facility 85
  using existing facilities 82–84
  for individual activities 83
  separate room for observers 83–84
software life-cycle costs, most during maintenance phase 17t
by becoming virtual team member 470
arguments and counter arguments 16–19
by inviting to observe activities 161–162
by meeting to review proposal 124
by getting stakeholder involvement 469–470
by becoming virtual team member 19
by getting stakeholder involvement 19
stream-of-behavior chronicles 399
summative evaluations 432
surveys
  logical sequence and groupings 283
  communicating findings 295
  confidentiality and anonymity 282
  checklist for creation 269b
  determining how data will be analyzed 280
  identifying objectives of study 270
  players involved in activity 270
  probability versus nonprobability sampling 271–272
  data analysis and interpretation 290–295
  thinking about when formatting questions 272
  incentives for completion 268
  paper-based, creating 288, 289
  purpose, informing participants of 281
  response numbers per user type 268
  things to be aware of when using 267–269
T
tax implications, of payment of participants 151
television/computer monitor 93
time line, one-hour interview 238t
professional participant 151
tradeshows, handing out questionnaires at 479
transcription, of video/audio recording 174–175
TravelMyWay mobile app 165
U
unipolar/bipolar constructs 278
usability inspection methods 433–435
usability participation questionnaire 476f
versus user requirements gathering 11
user-centered design (UCD) 7–11
  incorporation into product lifecycle 9–11
  early focus on users and tasks 7–8
  empirical measurement of product usage
user experience research method 96–112
  behavioral versus attitudinal 103
  budget/time constraints 99
  business decisions and marketing 98
  formative versus summative 104
  graphic representation, participants 109, 109f
  qualitative versus quantitative 104
  representative sample 108
  research and participant 103
  social desirability bias 99
  thesis/dissertation project 98
  weighing resources 99, 99f
user interface designs 443
  Attain Performa quadripolar 445
  human factors scientist 446
  interactive prototypes 445
  medical device usability 444
  optimal pacing configuration 444
  product development life cycle 444
finding information to build 37
understanding types of user 38
benefits of addressing 16
impact on users criterion 451
merging usability and product development priorities 453–455
number of users criterion 451
user requirements activity conduct 158–189
  dealing with awkward situations 179–189
  extra caution required with customers 179b
  participant confrontational with other participants 185
  participant not truthful about her identity 186
  participant refuses to be videotaped 185
  participant refuses to sign CDA and consent forms 186–187
  participant’s cell-phone rings continuously 181
  participant thinks he is on job interview 183
  product team/observers talk loudly 187–189
  team changes product mid-test 187–188
  wrong participant is recruited 182
  dealing with late and absent participants 176–179
  cushions to allow for late participants 177, 178
  incentive policy for latecomers 178
  including late participant 178
  letter for latecomers 178f
  introducing think-aloud protocol 169–171
  combined video/audio and notetaking 176
  domain knowledge of notetaker 173b
categorization into groupings 39
comparison of documents 36, 36t
V
vendors, marketing research, renting rooms from 85
W
wants and needs (W&N) analysis 357–358
link to questionnaire on 479