Index

Note: Page numbers followed by f indicate figures, b indicate boxes, and t indicate tables.

A

accessibility 50–51
account managers 144, 404
acknowledgment tokens 241
activity materials, card sort 
description 309
exercise, example cards 309, 310f
primary data collected 310
affinity diagrams 253, 417
in building personas 40–44
in field studies 417
in interviews 253
American Marketing Association 140–141
The Animal Picture Board: Kenya 57–59
and coffee evaluation scale 58, 58f
financial institutions/mechanisms 58
interesting patterns 59
mobile transaction service 59
rural Kenya 57, 58f
anonymity and confidentiality 73
anti-user 42
artifact analysis 419, 420t
artifact notebook 419
artifacts 397, 398–399
artifact walkthroughs 387, 398–399
awkward situations 
participant issues 181–187
product team/observer issues 187–189

B

behavioral versus attitudinal 103
benchmarking study 442
bias 381–382
in examples 242–243
interviewer prestige 229
in interviews 237
in interviewer body language 246–247
in wording of questions 228–229
non-responder 137
prestige response 237–238
response 267
self-selection 138
simplification 381–382
telescoping 231
translation 382
types 137–138
and unbiased probes 247t
biobserve 413
Bizarre Bazaar: South Africa 59–60
“bargaining” 60
“informance” or informative performance method 59
mobile banking 59
product/service 60
body language 
interviewer 246–247
learning about 250b
participant 249–250
brainstorming 357–358
in focus groups 357
ideal system 359
for interview questions 225–232
market research studies 361
moderator 358, 359
momentum 361
of new recommendations 457
prioritization 357, 359
stakeholders 342
for survey questions 279
user research studies 361
warm-up 361
W&N analysis 357
budget/time constraints 99
burnout 407
buy-in for your activity 16–19

C

caching, browser 29–30
Cardiac Rhythm and Disease Management (CRDM) 443
card sort 101, 302–337
activity materials 309
in activity proposal 120
conducting 313–318
activity timeline 313–314
card review and sorting 315–316
computerized card sort 317–318
labeling groups 317
practice 315
welcoming participants 314
data analysis and interpretation 318–327 See also cluster analysis
analysis with card sorting program 322
analysis with spreadsheet package 325
analysis with statistics package 324
cluster analysis 321–322
data that computer programs cannot handle 325–326
similarity matrix 320, 320f
simple summary 318–319
definition changes 327
deleting objects 326
duplicate card handling 325
group name handling 325
individual/simultaneous card sort 306
information architecture 304
interpreting the results 326–327
inviting observers 313
objects and definitions identification/creation 307–309
open versus closed sort 305
physical cards/computerized sort 305
players, activity 311–313
preparing to conduct 307–313
activity materials 309
additional data collected in card sort 310–311
identifying objects and definitions sorting 307–309
inviting observers 313
players involved in activity 311–313
remote (online)/in person sort 305–306
renaming objects 325
sample protocol 153, 155
sample recruitment screener 132, 132f
things to be aware of when conducting 304–306
writing objects 304
chairs 88
children 51–52
click stream, recorded 29–30
closed-ended questions 222
closed sort  See open sort
cluster analysis 321–322
amalgamation method 321–322
average linkage 322
complete linkage 322
furthest neighbor method 322
nearest neighbor method 322
single linkage 322
SynCaps 321, 323f
cognitive walk-throughs (CWs) 435
formative usability inspection 435
heuristic evaluation 435
issue ranges 435
notetaker 435
user group and audience 435
user interface design  See user interface designs
Cohen’s kappa 254b
communication paths, from user to product team 15
community bulletin boards, participant recruitment advertisements on 139–140, 139f, 479
competitive analysis 32–34
comparison grid 34
competitor product evaluation 33
traditional 32
competitors 
identification 32
learning from 32–34
primary 32–33
secondary 32–33
computer and monitors 91
concluding your activity 447–472
ensuring incorporation of findings 469–472
being virtual team member 470
ensuring product team documents findings 471
keeping scorecard 471–472
obtaining status for each recommendation 470
stakeholder involvement 469–470
prioritization of findings 450–455 See also user requirements, prioritization
merging usability and development priorities 453–455
usability perspective 451–452
reporting findings  See report of findings
condensed ethnographic interview 385t, 397
conferences 
handing out questionnaires at 479
confidential disclosure agreements (CDA) 
participant refusal to sign 186–187
signature by guardian 148–149
conflict of interest 75, 128
confound 103–104
consent form 412
participant refusal to sign 186–187
content writers 26
contextual design 418
contextual inquiry (CI) 385t, 393–395
analyzing data 418
context 393, 394
focus 393, 395
interpretation 393, 395
master–apprentice relationship 393, 394
number of participants 394
partnership 393, 394–395
relationships to avoid 394
expert–novice 394
guest–host 395
interviewer–interviewee 395
continuers 241
control room 89, 92
convenience sample 108, 110t
conventional methods 
actionable insights 62
qualitatively better insights 62
reduction in the time needed to get insights 62
Cooper, Alan 41
copyright laws 32
cost–benefit chart 455
quadrants 455
high value 455
luxuries 455
strategic 455
targeted 455
cost/feasibility analysis 106–107
cost/ROI analysis 107
Craig’s List 139–140
crowdsource 207
customer contacts 144–146
in usability activities 144–146
account manager involvement 144
angry customers 144–145
extra caution in dealing with 179b
internal employees 145
recruitment of right people 145–146
time to recruit 145
unique customers 145
customer support, logs of 
questions/problems 27

D

daily information needs (DIN) studies 211–213
analysis 216
consistent ingredient of innovation 211
data-driven innovation process 217
experiencing self 213
findings 216
at Google 210–211
incentive structure 215
limitations 215
overview 211–212
participants 212
pilot studies 216
remembering self 213
technological limitations 215
tools 212–213
datalogging software 92–93
products available 92
data retention, documentation, and security 75–76
data, valid and reliable 74–75
“day-in-the-life” videos, creation 46
Deep Hanging-Out 385t, 389–392
analyzing data 417–418
focal points 389, 390t
tips 392b
dendrogram 321, 321f, 322
design concepts, focus groups 371b
discussion guide development 372
early research/brainstorming 375
facilitation 375–376
housekeeping 373
recruitment 372
retirement planning 372
trends and patterns 374
usability testing methods 372
user interface 373–374
visual representation 373
design thinking 11
desirability testing 439–440
diary studies 190–217
aware of when conducting 194–195
communicating findings 208–209
conducting 
monitor data collection 205
training participants 204–205
data analysis and interpretation 
affinity diagram 206
crowdsourcing 207
data cleaning 205–206
qualitative analysis tools 206–207
quantitative analysis 207–208
data and flexibility 195
description 194
e-mail 196
end of day 200
and field studies 194
incident diaries 200–201
intervals 201–202
online diary study services/mobile apps 199–200
paper 195–196
preparing to conduct 
diary materials 203
identifying type of study to conduct 202
incentives 204
length and frequency 203–204
recruiting participants 203
random/experiential sampling 202
SMS (text-message) 198–199
social media 199
structured and unstructured 194
video 197–198
voice 197
dispersion 293
double-barreled questions 228
double negatives 228
dress code, for field studies 407
duration of your activity 125–126

E

early adopters, feedback 35
educational materials 464, 469
e-mail addresses, purchasing 475
e-mail-based diary studies 196
equipment cart 93
ethical considerations 67–76
acknowledging true capabilities 75
within boundaries 75
delegating work 75
training and tools 75
anonymity and confidentiality 73
appropriate incentives 73–74
accountable 74
compensation for longitudinal or multipart studies 74
legal department, advice from 74
appropriate language 72–73
creating comfortable experience 72
data retention, documentation, and security 75–76
de-identifying data 75–76
original data 75–76
reports 76
debrief 76
do no harm 70
beneficence and nonmaleficence 70–71
risks 70–71
informed consent form 66, 68f
legal considerations 72
basic rules of ethics 76
confidentiality of products 76–77
needs, remote studies 77, 78
sample NDA 77
permission to record 71
consent form 71
“opt in” to recording 71
policies versus laws versus ethics 66
informed consent form 66, 68f
Internal Revenue Service (IRS) 66
nondisclosure/confidentiality agreements (NDAs/CDAs) 66, 67f
policies 66
protection of customer relationships 67
right to be informed 70–71
brand-blind study 70–71
deception 70–71
misperception 71
purpose of the activity 70
sponsor-blind study 70–71
right to withdraw 73
triad of 69f
valid and reliable data 74–75
conflict of interest 75
limitations 74
misusing/misrepresenting work 75
ethnographic study 380
anthropology 384
qualitative research 383
software industry 383
user experience research community 383
evaluation apprehension 353–354
evaluation methods 430–446
bias 433
communication 443
completion rate 442
ideal user experience 432
live experiments 441
product development 433
representative tasks 433
summative evaluation 442
task time 442
usability testing 436–440
executive summary report 466, 468–469
experiential sampling methodology (ESM) 202
expert reviews 103
eye tracking 437–438
desktop tracks 437, 437f
heat map 437, 438f
mobile tracks 437f
retrospective think-aloud 438
usability issues 438
EZCalc 487

F

face-to-face/mediated interview 224f
feasibility analysis 107
field research 380
field studies 101, 378–428
account managers 404
advantages 384, 385t
communicating findings 418–419
artifact notebook 419
methods for presenting/organizing data 419, 420t
storyboards 419
conducting 411–415
beginning data collection 413
getting organized 412
meeting participant 412–413
organizing data 414
recommendations for family/home visits 416b
summary of recommendations 415
wrap-up 414
data analysis and interpretation 415–418
affinity diagram 417
analyzing Contextual Inquiry/Design data 418
analyzing Deep Hanging-Out data 417–418
key points to keep in mind 415
qualitative analysis tools 418
dress code for 407
home/family visits 416b
investigators 402–403
lessons learned 420–421
missing users 421
surprise guests 420–421
level of effort 384, 385t
logistics 382
materials and equipment checklist 407–410
artifact walkthroughs 385t, 398–399
comparison chart 384, 385t
condensed ethnographic interview 385t, 397
ethnography 383b
incident diaries 385t, 398f
interacting with user 385t, 392–397
method supplements 385t, 397–399
observation only 387–392
observing while not present 385t, 399
process analysis 385t
pure observation 387–388, 392
notetaker 403
observations versus inferences 413
participants 401–402
diverse range of users and sites 402
meeting 412–413
number of 402
setting company expectations up-front 402
preparing for 400–411
activity materials 407–410
checklist 407–410
developing your protocol 405
identifying type of study to conduct 400
players involved in activity 400–404
scheduling visits 405–407
thank-you notes 414, 416
training players 404–405
things to be aware of when conducting 381–382
gaining stakeholder support 381
Hawthorne effect 382
logistics challenge 382
types of bias 381–382
tools for observation 413
videographer/photographer 404
where products could have been saved by 17t
firewall 33–34
focus groups 17t, 101, 102b, 338–376
action if team changes product mid-test 187–188
affinity diagram 364–365
cards 366
design/product ideas 365
field study investigators 365
recommendation 365
space finding 365
stakeholders 364
team assembling 365
TravelMyWay app 367–368, 368f
communicating findings 368
conducting 351–353
activity timeline 351
focus group discussion 352–353
introducing activity and discussion rule 352
round-robin format 353
welcoming participants 352
data analysis and interpretation 363–368
affinity diagramming 363–368
analyzing qualitative data 363
debriefing 363
researchers 364
design concepts 371b
group interactions 352–353
versus interviews 340
iterative 355–356
lessons learned 368–370
mixing user types 369–370
moderator 368
removing difficult participant 370
management and planning tools 363
moderator 340, 347
checklist for 347, 348f
online community boards 347
planned path 347
sufficient domain knowledge 347
modifications 353–360
brainstorming 357–358
data analysis phase 360
“day-in-the-life” video 355
focus troupe 356
individual activities 353–354
iterative focus groups 355–356
online/phone-based focus groups 356–357
prioritization phase 359–360
task-based focus groups 354–355
notetaker 348–349
domain knowledge 348
stakeholders 348–349
transcription 349
participants 346
group dynamics 346
mix of 347, 369–370
number and size of groups 346
overbearing participants 353
preparing to conduct 340–351
activity materials 350–351
avoiding predictions 345
avoiding sensitive/personal topics 345
behavioral questions 342
identifying questions wishing to answer 342
inviting observers 350
moderator 341
players involved in activity 346–349
topic/discussion guides 341–346
task-based 354–355
transcription of session 349
videographer 349
formative evaluations 432
formative versus summative 104
free-listing 308
number of objects 308
participants 308
pilot session 309
users’ terminology 308

G

groupthink 353–354
guiding principles/antiprinciples 
brainstorm 45
define it to measure it 45
evaluations 45
repeat 45

H

Hawthorne effect 382
Help Desk 27
heuristic evaluations 9, 434–435
discount usability engineering 434
good user experience 434
help/documentation 435
heuristics 107, 108t
high-fidelity software prototypes 446
holistic mobile insights 422b
convenience 424
data collection and analysis 423
diary phase 427
early customer segments 427
financial activities 426
financial complexity 424
in-home interviews 423
lessons learned 427–428
diary feedback 427–428
multiple feedback channels 427
mobile experiences 427
mobile roadmap 426
mobile usage environments 422
reinvigorate 425
stakeholders 425
home/family visits 416b
honesty, of interviewees 237–238

I

ID checks 151, 186
idealized interview flow 226t
incentives 118
appropriate 73–74
for survey completion 268
incident diaries 200–201, 385t, 398f
incorporation of findings 469–472
inferential statistics 294–295
information architecture 304
card sort to inform 304
informed consent form 66, 68f, 412
participant refusal to sign 186–187
insight sheets 419, 420t
inspection methods 102
intellectual property rights 32
internal employees 145
recruitment to usability activities 145
international research 49
international user studies 48–49
cultural issues 49
localization/globalization 48–49
recruitment of participants 147–148
resources on preparing for 148–149
Internet Protocol (IP) address 
temporary assignment 29
Internet Service Provider (ISP) 29
inter-rater reliability (IRR) 207, 403
interviewer 234–235
requirements 234–235
rules for 240–249
asking the tough questions 242–243
do not force choices 244
empathy and not antagonism 248–249
fighting for control 250
hold your opinions 250–251
keep on track 240–241
know when to move on 247–248
reflecting 248
remain attentive 241–242
select right types of probe 246
silence is golden 241
transitions 249
using examples 243
watch for generalities 243
watch for markers 245–246
watch participant’s body language 249–250
watch your body language 246–247
interviewer prestige bias 229
interviews 100, 102b, 218–262, 346, 352–353, 367, 397
advantages 223
body of session 240
communicating findings 256–257
over time 256
by participant 257
table of recommendations 256t
by topic 256
conducting 236–251
dos and don’ts 251
monitoring relationship with interviewee 249–251
multiple 242b
phases 238–240
role of interviewer  See Interviewer, rules for
cooling-off phase 240
data analysis and interpretation 252–254
affinity diagram 253
categorizing 253
qualitative analysis tools 253–254
versus focus groups 340
introduction phase 238–240
sample introduction 239
maximum session time 240
notetaker 235
participants, determining number of 234
preparing to conduct 220–236
activity materials 236
choosing between telephone and in-person 223–224
identifying objectives of study 221
inviting observers 235–236
players involved in activity 234–235
selecting type of interview 222–224
testing questions 233
writing questions  See Questions
pros and cons 223t
things to be aware of when conducting 237–238
bias 237
honesty 237–238
transcripts, of bad interviews 252b
types 397
semi-structured 397
warm-up phase 239–240
wrap-up phase 240 See also Interviewer

K

knowledge-based screening 130

L

lab versus contextual 103–104
language, appropriate 72–73
leading questions 228
lighting, adjustable 90
Likert scale 253, 278
listening, active 167
loaded questions 229
log files 
analysis of time data 30
capture of information in 28
limitations 29

M

mailing addresses, purchasing 475
markers 
watching for 245–246
marketing department, questions to ask 31
marketing research vendors, renting rooms from 85
memory biases 110t
mental models 8
microphones 90
mirror, one-way 88–89
mixer 92
moderating activities 
domain knowledge 166b
importance 188
rules 
ask questions 167
do not act as participant 168
have personality 169
keep activity moving 168
keep participants motivated and encouraged 168
no critiquing 168–169
no one should dominate 167
practice makes perfect 169
stay focused 167–168

N

negative users 42
networking, to learn about product 26–27
nondisclosure agreement (NDA) 132, 145
nondisclosure/confidentiality agreements (NDAs/CDAs) 66
nonprobability sampling 271–272
non-responder bias 137
notetaking for your activity 173f
number of users 104–109

O

objects and definitions identification/creation 
concept 308
existing product 307
free-listing 308
observation guide 393, 393f, 395, 409
observations 103, 103t
observers 
food for 162b
in interviews 235–236
inviting to user requirements activity 161–162
participants’ managers as 162
separate room for 175
older users 52–53
cognitive and physical capabilities 52
cognitive status 53
health status 53
one-way mirror 88–89
online diary study services 199–200
online/phone-based focus groups 356–357
open-ended questions (OEQs) 222
things to keep in mind 276
use 276
open sort 
definition 311
multiple groups, object 311
new object 311
object deletion 310–311
objects rename 311
outlier 30

P

page views 30
paper-based diary studies 195–196
paradata analysis 295
participant database 
on the cheap 480
link from recruitment advertisement to 137
maintenance requirements 481
more extensive and expensive 480
recruiting agency issues 143
in recruitment 140–141
requirements for creating 473–481
developing questionnaire for potential participants 475–476
distributing questionnaire 476–479
sample questionnaire 476f
technical 479–481
in tracking participants 150–152
Personal Analytics Companion (PACO) 212–213
personally identifying information (PII) 282
personas 40–44
anti-user 42
benefits 41
components 42
content 36t
creating 42
definition 36t
multiple 41
primary 42
purpose 36t
sample for travel agent 42
secondary 42
tertiary 42
things to be aware of when creating 41
phone screener 129–132, 142
development tips 129–132
avoid screening via e-mail 129
eliminate competitors 131
keep it short 129
prepare response for people who do not match profile 132
provide important details 131–132
request demographic information 130–131
use test questions 130
work with the product team 129
for recruiting agency use 142
sample 132, 132f
picture-in-picture (PIP) display 92
pilot 155–156
attendees 156
benefits 
audio-visual equipment check 155
finding bugs or glitches 155
instructions and questions clarity check 155
practice 155
timing check 155
importance 155
players, activity 
facilitator 312–313
in-person card sort 311
participants 312
user mental models 312
user profile 312
videographer 313
posters 460, 464, 469
power analysis 105–106
calculation 105
quantitative analysis 106
quantitative studies 105
sample size 105
PowerPoint 
for findings/recommendations presentation 460
preparing for user requirements activity 113–156
creating a proposal  See proposal, user requirements activity
creating a protocol 153
deciding duration and timing of session 125–126
group sessions 125
individual activities 125
piloting your activity  See pilot
recruiting participants  See recruitment of participants
tracking participants 150–152
professional participants 151
tax implications 151
watch list 151–152
presentation of findings/recommendations 455–463
ingredients of successful presentation 457–461
avoiding discussion of implementation or acceptance 463
avoiding jargon 463
delivery medium 459
keeping presentation focused 461–462
prioritizing and starting at top 462
starting with good stuff 462
using visuals 459
presentation attendees 457
why verbal presentation essential 456–457
dealing with issues sooner rather than later 456
dealing with recommendation modifications 457
ensuring correct interpretation of findings and recommendations 457
prestige response bias 237–238
primary users 38, 42
prioritization of findings 450–455
merging usability and development priorities 453–455
two-stage process 450, 451f
usability perspective 451–452
probes 246
close-ended 246
open-ended 246
biased and unbiased 246, 247t
process analysis 385t, 396–397
product 
learning about 25–35
competitors 32–34
customer support comments 27
early adopter feedback 35
importance 25
log files 28–30
marketing department 31
networking 26–27
using product 26
product lifecycle 
with UCD processes incorporated 9–11
concept 9
design 9
develop 10
release 10–11
proposal, user requirements activity 117–119, 120, 120f, 124–125
cushion deliverable dates 125
getting commitment 124–125
meeting to review 124
need to create 117
sample 120, 120f, 124–125
sections 117–119
history 117
incentives 118
method 117
objectives, measures, and scope of study 117
preparation timeline 119, 119t
proposed schedule 118–119
recruitment 118
responsibilities 118–119
user profile 118
separate, for each activity 117
protocol 153, 405
creation 153
for field study 418–419
importance 153
sample 153
proxy server 29
punctuality 148
purposive sampling 109

Q

qualitative analysis tools 206–207, 254, 418
qualitative content 253–254
qualitative data 363
analyzing 363
qualitative versus quantitative 104
quantitative data 342
questionnaires 
for potential participants 476–479
questions 342
closed-ended 354
data analysis 222, 225
focus group 342–343
ordering 345–346
participants 343
quantitative/behavioral data 342
ranking/polling 354
testing 345
types 345
wording 341
interviews 
bias avoidance 236
brevity 227
clarity 228
depending on memory 231–232
dos and don’ts in wording 233f
double-barreled 228
double negatives in 228
inaccessible topics 231
leading 228
loaded 229
predicting the future 231
testing 233
threatening 232f
types of wording to avoid 232
vague 228
open-ended 342
survey 
asking sensitive questions 274
format and wording 275–279
keeping survey short 273–274
reducing initial question set 290

R

random digit dialing (RDD) 271–272, 284
Rapid Iterative Testing and Evaluation (RITE) 438–439
Rasas (India) 54–56
emotion map 56, 56f
“emotion tickets” 54–56
recommendations report 468
recommendations, status of 470
recording your activity 171–176
record, permission to 71
consent form 71
“opt in” to recording 71
recruitment of participants 126–150
in activity proposal 118
creating a recruitment advertisement 135–138
be aware of types of bias 137–138
cover key characteristics 136
don’t stress the incentive 136
include link to in-house participant database 137
include logistics 136
provide details 136
sample posting 139–146
set up automatic response 137
state how to respond 137
crowd sourcing 149–150
determining participant incentives 127–128
customers 128
generic users 127–128
own company employees 128
students 128
developing a recruiting screener  See phone screener
of international participants 147–148
methods 139–146
community bulletin boards 139–140
in-house database 140
recruiting agencies use of customer contacts  See customer contacts
online services 149
preventing no-shows 146–147
over-recruiting 147
providing contact information 146
reminding participants 146–147
product team approval of participants 138
recruiting agencies 140–143
avoiding professional participants 143
benefits of using 140–143
charges 141
completed screeners 143
no-show rates 141
notice to recruit 141
and participant database 143
provision of screener to 142
recruiting technical users 141
reminding participants 143
of special populations 148–149
escorts 148–149
facilities 149
transportation 148
supervisors and own employees 126
reflections, in interviews 248
remote testing 440
report of findings 463–469
complete report 464–468
appendices 468
archival value 465
background section 466
conclusion section 468
executive summary section 466
key sections 466–468
method section 466
results section 467
template  See report template
value 465
executive summary report 468–469
format 464
comparison of methods 464
recommendations report 468
report supplements 469
educational materials 469
posters 460, 464, 469
usability test 467
report template 465, 482–490
card sort 483
card sorting initial grouping 486
card sorting read/rename 486
card sorting secondary grouping 486–487
executive summary 484
materials 485–487
participants 485
procedure 486–487
travel card sort table, recommendations 487–488
requirements  See also user requirements
business 14
marketing 14–15
product team’s perspective 12–15
sales 14–15
research and participant roles 103, 103t
response bias 267
retrospective think-aloud 399

S

sample report template 485, 485f
sample size 103t, 105b, 106b, 108
sampling plans 388
saturation 106
scenarios 46–48
benefits 46
components 42
content 36t
creating 46–48
definition 36t
examples 
simple 47
purpose 36t
template 42, 48
execution path 48
method to address situation 48
situation/task 47
things to be aware of when creating 46
title 47
scheduling, holiday seasons 148
scorecard, usability 471–472
screen-capture software 91
secondary users 38, 42
self-report 103
self-selection bias 138
semantic differential scale 57
semi-structured interview 253, 397
setting up research facilities 80–95
building a permanent facility 85–95
adjustable lighting 90
benefits of dedicated lab 85
chairs 88
computer and monitors 91
couch 88
datalogging software 92–93
equipment cart 93
lab layout 87–95
microphones 90
mixer 92
one-way mirror 88–89
screen-capture software 91
sound-proofing 91–92
space for refreshments 94
storage space 94–95
surface to tape paper to 93–94
tables 88
technology changes 87
television/computer monitor 93
video cameras 90
whiteboards 93–94
disruption prevention 82
renting marketing or hotel facility 85
using existing facilities 82–84
arrangement of room 83
for individual activities 83
separate room for observers 83–84
videographer 84
video-conferencing 83–84
significant event 243
similarity matrix 320
simplification bias 381–382
site visits 17t, 380
SMS (text-message) 198–199
snowball sample 109
social desirability 99, 110t, 237, 288
social loafing 356
social media 140, 199
software life-cycle costs, most during maintenance phase 17t
sound-proofing 91–92
Spectator 413
Spectator Go! 413
stakeholder input 98
stakeholders 381, 410
by becoming virtual team member 470
field studies 381, 410
getting buy-in for activity 16–19, 124–125
arguments and counter arguments 16–19
by inviting to observe activities 161–162
by meeting to review proposal 124
by getting stakeholder involvement 469–470
preventing resistance 19
by becoming virtual team member 19
by getting stakeholder involvement 19
stimulated recall 399
storyboards 419
stream-of-behavior chronicles 399
summative evaluations 432
surrogate products 32
surveys 100, 102b
acquiescence bias 269
branching in 286, 286f
common structure elements 280–283
clutter reduction 282
consistency 288
font selection 283
logical sequence and groupings 283
communicating findings 295
visual presentation 295
complex 266–267
confidentiality and anonymity 282
contact information 281
creating 269–290
building survey 269, 280–283
checklist for creation 269b
composing questions  See Questions, survey
determining how data will be analyzed 280
e-mail surveys 285–288
identifying objectives of study 270
number of respondents 270–271
paper-based surveys 288, 289
players involved in activity 270
preparation timeline 270
probability versus nonprobability sampling 271–272
web-based surveys 285–288
data analysis and interpretation 290–295
initial assessment 290–291
thinking about when formatting questions 272
types of calculation 291–295
e-mail, creating 285–288
exit 287–288
versus focus groups 343–344, 353–354
incentives for completion 268
instructions 281
intercept 267
versus interviews 
lessons learned 295
nonresponse bias 268
paper-based, creating 288, 289
piloting 289–290
progress indicators 287–288
proposal 270
purpose, informing participants of 281
response numbers per user type 268
response rates 285
satisficing 268–269
selection bias 267
things to be aware of when using 267–269
time to complete 281–282
title 281
vendors 271–272
web-based, creating 285–288
when to use 266–267
synergy 362

T

tables 88
table set-up for 84f
task flowcharts 419, 420t
task hierarchies 419, 420t
tax implications, of payment of participants 151
telescoping 231
television/computer monitor 93
tertiary users 38, 42
thematic analysis 253–254
think-aloud 403
data 403
protocol 169–171
retrospective 399
time line, one-hour interview 238t
timing of your activity 125–126
tracking participants 150–152
professional participant 151
tax implications 151
watch list 151–152
tradeshows, handing out questionnaires at 479
training 
courses 
on your product 27
transfer of 17t
transcription, of video/audio recording 174–175
transfer of training 17t
translation bias 382
TravelMyWay mobile app 165
triangulation 100

U

unipolar/bipolar constructs 278
usability inspection methods 433–435
CWs 435
leverage expertness 433–434
product development cycle 433–434
usability participation questionnaire 476f
usability scorecard 471–472
usability testing 354, 363, 372, 436–440
Café study 439
ecological validity 439
end users attempts 436
eye tracking 437–438
investments 436
lab testing 436
remote testing 440
RITE 438–439
user performance 436
versus user requirements gathering 11
user-centered design (UCD) 7–11
incorporation into product lifecycle 9–11
philosophy 7
principles of 7–9
early focus on users and tasks 7–8
empirical measurement of product usage 8
iterative design 8–9
User Experience (UX) 4
user experience research method 96–112
advantages and disadvantages 109–112, 110t
behavioral versus attitudinal 103
budget/time constraints 99
business decisions and marketing 98
card sort 101
convenience sample 108
cost/feasibility analysis 106–107
diary studies 100
differences 102–109, 103t
evaluation methods 102
field study 101
focus group 101, 102b
formative versus summative 104
graphic representation, participants 109, 109f
guidelines 109
heuristics 107, 108t
interviews 100, 102b
lab versus contextual 103–104
number of users 104–109
power analysis  See Power analysis
purposive sample 109
qualitative versus quantitative 104
representative sample 108
research and participant 103
sample 108
saturation 106
snowball sample 109
social desirability bias 99
surveys 100, 102b
thesis/dissertation project 98
triangulate 100
weighing resources 99, 99f
user interface designs 443
Attain Performa quadripolar 445
CRDM 443
human factors scientist 446
implants 445
interactive prototypes 445
medical device usability 444
optimal pacing configuration 444
product development life cycle 444
resynchronization 444
VectorExpress™ 444
user profiles 37–40, 312
in activity proposal 118
content 36t
creating 38–40
definition 36t
finding information to build 37
purpose 36t
understanding types of user 38
user requirements 15–16
benefits of addressing 16
definition 7
prioritization 451–452 See also cost–benefit chart
high priority 452
impact on users criterion 451
low priority 452
medium priority 452
merging usability and product development priorities 453–455
number of users criterion 451
questions to help with 453, 454f
usability perspective 453, 454f
user requirements activity conduct 158–189
dealing with awkward situations 179–189
extra caution required with customers 179b
participant confrontational with other participants 185
participant not truthful about her identity 186
participant refuses to be videotaped 185
participant refuses to sign CDA and consent forms 186–187
participant’s cell-phone rings continuously 181
participant thinks he is on job interview 183
product team/observers talk loudly 187–189
team changes product mid-test 187–188
wrong participant is recruited 182
dealing with late and absent participants 176–179
cushions to allow for late participants 177, 178
incentive policy for latecomers 178
including late participant 178
letter for latecomers 178f
no-shows 179
introducing think-aloud protocol 169–171
inviting observers 161–162
moderating activity  See Moderating activities
recording and notetaking 171–176
combined video/audio and notetaking 176
domain knowledge of notetaker 173b
sample of notes 174f
sample shorthand 172f
taking notes 176
video/audio recording 174–175
video tips 175b
warm-up exercises 165
welcoming participants 163–165
user research techniques 109–112, 110t
users 
anti-user 42
categorization into groupings 39
characteristics 38
learning about 25–35 See also personas; scenarios; user profiles
comparison of documents 36, 36t
importance 35–48
iterative process 36f
primary 38, 42
secondary 38
tertiary 38, 42
user/task matrix 419, 420t

V

vague questions 228
vendors 
marketing research, renting rooms from 85
video cameras 90
video diary study 197–198
videographer 235
video recording 
tips 175b
visit summary template 409, 409f

W

wants and needs (W&N) analysis 357–358
ideal system 357
information 357
task 357
warm-up exercises 165
watch list 151–152, 183, 184
websites 
link to questionnaire on 479
WebTrends 30
welcoming participants 163–165, 314, 352
whiteboards 93–94
withdraw, right to 73