Bias RG, Mayhew DJ. Cost-justifying usability: An update for the Internet age. 2nd ed. San Francisco, CA: Morgan Kaufmann Publishers; 2005.
Burns M, Manning H, Petersen J. The business impact of customer experience, 2012. Business case: The experience-driven organization playbook. Cambridge, MA: Forrester; 2012. http://www.forrester.com/The+Business+Impact+Of+Customer+Experience+2012/fulltext/-/E-RES61251.
Farrell S, Nielsen J. User experience career advice: How to learn UX and get a job. 2014. Retrieved from http://www.nngroup.com/reports/user-experience-careers/.
Forrester’s North American Technographics Customer Experience Online Survey, Q4 2011 (US).
Gould JD, Lewis C. Designing for usability: key principles and what designers think. Communications of the ACM. 1985;28(3):300–311.
Hackos JT, Redish JC. User and Task Analysis for Interface Design. New York: John Wiley & Sons; 1998.
IBM. Cost justifying ease of use: Complex solutions are problems. October 9, 2001. Available at www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/23.
Johnson J. GUI bloopers 2.0: Common user interface design don’ts and dos. Morgan Kaufmann; 2008.
Keeley L, Walters H, Pikkel R, Quinn B. Ten types of innovation: The discipline of building breakthroughs. Hoboken, NJ: John Wiley & Sons; 2013.
Lederer AL, Prasad J. Nine management guidelines for better cost estimating. Communications of the ACM. 1992;35(2):51–59.
Manning H, Bodine K. Outside in: The power of putting customers at the center of your business. New York: Houghton Mifflin Harcourt; 2012.
Marcus A. Return on investment for usable UI design. User Experience, Winter 2002:25–31. Bloomingdale, IL: Usability Professionals’ Association; 2002.
Nielsen J. Why you only need to test with 5 users. 2000. Retrieved from http://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/.
Norman DA. Words matter. Talk about people: not customers, not consumers, not users. Interactions. 2006;13(5):49–63.
Pressman RS. Software engineering: A practitioner’s approach. New York: McGraw-Hill; 1992.
Rhodes J. Usability can save your company. 2000. [Webpage] Retrieved from http://webword.com/moving/savecompany.html.
Sharon T. It’s Our Research. Morgan Kaufmann; 2012.
Stone M. Back to basics. 2013. [Blog post] Retrieved from http://mariastonemashka123.wordpress.com/.
Wiegers KE. Software requirements. Redmond, WA: Microsoft Press; 1999.
Weinberg J. Quality Software Management. Vol. 4: Anticipating change. New York: Dorset House; 1997.
Chapter 2: Before You Choose an Activity: Learning About Your Product Users
AARP. Beyond 50.09 chronic care: A call to action for health reform. 2009. Retrieved from http://www.aarp.org/health/medicare-insurance/info-03-2009/beyond_50_hcr.html.
Benedek J, Miner T. Measuring desirability: New methods for evaluating desirability in a usability lab setting. In: Proceedings of UPA 2002 Conference, Orlando, FL; 2002.
Chavan AL, Munshi S. Emotion in a ticket. In: CHI’04 extended abstracts on human factors in computing systems. New York, NY, USA: ACM; 2004:1544.
Chavan AL, Prabhu GV, eds. Innovative solutions: What designers need to know for today’s emerging markets. CRC Press; 2010.
Costa T, Dalton J, Gillett FE, Gill M, Campbell C, Silk D. Build seamless experiences now: Experience persistence transforms fragmented interactions into a unified system of engagement. Forrester; 2013. Retrieved from http://www.forrester.com/Build+Seamless+Experiences+Now/fulltext/-/E-RES97021.
Department of Health and Human Services, Administration on Aging. Trends in the older population using Census 2000, Estimates 2001–2009, Census 2010. 2010. Retrieved from: http://www.aoa.gov/AoAroot/Aging_Statistics/Census_Population/census2010/docs/Trends_Older_Pop.xls.
Fisk AD, Rogers WA, Charness N, Czaja SJ, Sharit J. Designing for older adults: Principles and creative human factors approaches. 2nd ed. Boca Raton, FL: CRC; 2009.
Folstein MF, Folstein SE, McHugh PR. Mini-mental state: A practical method for grading the cognitive state of patients for the clinician. Journal of Psychiatric Research. 1975;12(3):189–198.
Hofstede G, Hofstede GJ. Cultures and organizations: Software of the mind. New York: McGraw-Hill; 1991.
Hall ET. Beyond culture. Random House LLC; 1989.
Mace R. Universal design: Barrier free environments for everyone. Designers West. 1985;33(1):147–152.
McInerney P. Getting more from UCD scenarios. Paper for IBM MITE; 2003. Available at: http://www-306.ibm.com/ibm/easy/eou_ext.nsf/Publish/50?OpenDocument&./Publish/1111/$File/paper1111.pdf.
Pew Internet Research Project. Social networking media fact sheet. Retrieved April 27, 2014, from: http://www.pewinternet.org/fact-sheets/social-networking-fact-sheet/.
Plocher T, Chavan A. User needs research special interest group. In: CHI’02 extended abstracts on human factors in computing systems. New York, NY, USA: ACM; 2002.
Snider JG, Osgood CE, eds. Semantic differential technique; a sourcebook. Hawthorne, NY: Aldine Pub. Co.; 1969.
Story M, Mace R, Mueller J. The universal design file: Designing for people of all ages and abilities. Raleigh, NC: Center for Universal Design, NC State University; 1998.
Chapter 5: Choosing a User Experience Research Activity
Bernard HR. Social research methods. Thousand Oaks, CA: Sage; 2000.
Creswell JW. Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage Publications; 1998.
Fishbein M, Ajzen I. Belief, attitude, intention, and behavior: An introduction to theory and research. Reading, MA: Addison-Wesley; 1975.
Food and Drug Administration. Draft guidance for industry and Food and Drug Administration staff—Applying human factors and usability engineering to optimize medical device design. Silver Spring, MD: U.S. Food and Drug Administration; 2014.
Green J, Thorogood N. Qualitative methods for health research. 2nd ed. Thousand Oaks, CA: Sage; 2009.
Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods. 2006;18(1):59–82. doi:10.1177/1525822X05279903.
Hektner JM, Schmidt JA, Csikszentmihalyi M. Experience sampling method: Measuring the quality of everyday life. Thousand Oaks, CA: Sage; 2007.
Hwang W, Salvendy G. Number of people required for usability evaluation: the 10 ± 2 rule. Communications of the ACM. 2010;53(5):130–133.
Krueger RA, Casey MA. Focus groups: A practical guide for applied research. Thousand Oaks, CA: Sage Publications; 2000.
Morse JM. Designing funded qualitative research. In: Denzin NK, Lincoln YS, eds. Handbook of qualitative research. 2nd ed. Thousand Oaks, CA: Sage; 1994:220–235.
Nielsen J. Estimating the number of subjects needed for a thinking aloud test. International Journal of Human-Computer Studies. 1994;41:385–397.
Nielsen J. Why you only need to test with 5 users. Alertbox. Available at: www.useit.com/alertbox/20000319.html. 2000.
Quesenbery W, Szuc D. Choosing the right usability tool. 2005. Retrieved from: http://www.wqusability.com/handouts/right-tool.pdf.
Sauro J, Lewis JR. Quantifying the user experience: Practical statistics for user research. Burlington: Elsevier; 2012.
Sears A, Jacko J, eds. The human-computer interaction handbook: Fundamentals, evolving technologies, and emerging applications. Boca Raton, FL: CRC Press; 2012.
Tullis T, Wood L. How many users are enough for a card-sorting study? In: Proceedings UPA’2004 (Minneapolis, MN, June 7–11, 2004); 2004.
Chapter 6: Preparing for Your User Research Activity
Bohannon J. Social science for pennies. Science. 2011;334(6054):307.
Buhrmester M, Kwang T, Gosling S. Amazon’s Mechanical Turk: a new source of inexpensive, yet high-quality, data? Perspectives on Psychological Science. 2011;6(1):3–5.
Casler K, Bickel L, Hackett E. Separate but equal? A comparison of participants and data gathered via Amazon’s MTurk, social media, and face-to-face behavioral testing. Computers in Human Behavior. 2013;29(6):2156–2160.
Dray S, Mrazek D. A day in the life of a family: An international ethnographic study. In: Wixon DR, Ramey J, eds. Field methods casebook for software design. New York: John Wiley & Sons; 1996.
Kittur A, Chi E, Suh B. Crowdsourcing user studies with Mechanical Turk. In: Proceedings of the SIGCHI conference on human factors in computing systems; 2008:453–456.
Chapter 7: During Your User Research Activity
Boren MT, Ramey J. Thinking aloud: Reconciling theory and practice. IEEE Transactions on Professional Communication. 2000;43(3):261–278.
Dumas JS, Redish JC. A practical guide to usability testing. 2nd ed. Exeter, England: Intellect Books; 1999.
Nisbett RE, Wilson TD. Telling more than we can know: Verbal reports on mental processes. Psychological Review. 1977;84(3):231–259.
Chapter 8: Diary Studies
Allport GW. The use of personal documents in psychological science. New York: Social Science Research Council; 1942.
Engelberger JF. Robotics in practice: Future capabilities. Electronic Servicing & Technology magazine; 1982.
Hackos JT, Redish JC. User and task analysis for interface design. New York: John Wiley & Sons; 1998.
Kahneman D, Krueger AB, Schkade D, Schwarz N, Stone AA. A survey method for characterizing daily life experience: The day reconstruction method. Science. 2004;306:1776–1780.
Kahneman D. Thinking, fast and slow. New York: Farrar, Straus and Giroux; 2011.
Larson R, Csikszentmihalyi M. The experience sampling method. In: Reis HT, ed. Naturalistic approaches to studying social interaction. New Directions for Methodology of Social and Behavioral Science, Vol. 15. San Francisco: Jossey-Bass; 1983:41–56.
Yue Z, Litt E, Cai CJ, Stern J, Baxter KK, Guan Z, et al. Photographing information needs: the role of photos in experience sampling method-style research. In: Proceedings of the 32nd annual ACM conference on human factors in computing systems. ACM; 2014, April:1545–1554.
Chapter 9: Interviews
Alreck PL, Settle RB. The survey research handbook. 2nd ed. Burr Ridge, IL: Irwin Professional Publishing; 1995.
Amato PR. The consequences of divorce for adults and children. Journal of Marriage and the Family. 2000;62(4):1269–1287.
Boren MT, Ramey J. Thinking aloud: Reconciling theory and practice. IEEE Transactions on Professional Communication. 2000;43:261–278.
U.S. Census Bureau. Children living apart from parents—Characteristics of children under 18 and designated parents. 2006.
U.S. Census Bureau. Household relationship and living arrangements of children under 18 years, by age and sex. 2008.
De Swert K. Calculating inter-coder reliability in media content analysis using Krippendorff’s Alpha. 2012. Available online: http://www.polcomm.org/wp-content/uploads/ICR01022012.pdf.
Dumas JS, Redish JC. A practical guide to usability testing. 2nd ed. Exeter, England: Intellect Books; 1999.
Green J, Thorogood N. Qualitative methods for health research. 2nd ed. Thousand Oaks, CA: Sage; 2009.
Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods. 2006;18:59–82.
Johnson T, Hougland J, Clayton R. Obtaining reports of sensitive behavior: A comparison of substance use reports from telephone and face-to-face interviews. Social Science Quarterly. 1989;70(1):173–183.
Krosnick JA. Survey research. Annual Review of Psychology. 1999;50:537–567.
Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174.
Shefts KR. Virtual visitation: The next generation of options for parent-child communication. Family Law Quarterly. 2002;36(2):303–327.
Stafford M. Communication competencies and sociocultural priorities of middle childhood. In: Handbook of family communication. Mahwah, NJ: Lawrence Erlbaum Associates; 2004:311–332.
Yarosh S, Chew YC, Abowd GD. Supporting parent-child communication in divorced families. International Journal of Human-Computer Studies. 2009;67(2):192–203.
Yarosh S, Abowd GD. Mediated parent-child contact in work-separated families. In: Proceedings of the SIGCHI conference on human factors in computing systems (CHI). ACM; 2011:1185–1194.
Yarosh S. Conflict in families as an ethical and methodological consideration. In: Judge TK, Neustaedter C, eds. Evaluating and designing for domestic life: research methods for human-computer interaction. Springer Publishers; 2014.
Chapter 10: Surveys
Callegaro M, Baker RP, Bethlehem J, Göritz AS, Krosnick JA, Lavrakas PJ, eds. Online panel research: A data quality perspective. John Wiley & Sons; 2014.
Chang L, Krosnick JA. National surveys via RDD telephone interviewing versus the Internet comparing sample representativeness and response quality. Public Opinion Quarterly. 2009;73(4):641–678.
Couper M. Designing effective web surveys. Cambridge: Cambridge University Press; 2008.
Crow D, Johnson M, Hanneman R. Benefits—and costs—of a multi-mode survey of recent college graduates. Survey Practice. 2011;4(5).
Dillman DA, Smyth JD, Christian LM. Internet, mail, and mixed-mode surveys: The tailored design method. 3rd ed. Hoboken, NJ: John Wiley and Sons; 2009.
Greenlaw C, Brown-Welty S. A comparison of web-based and paper-based survey methods testing assumptions of survey mode and response cost. Evaluation Review. 2009;33(5):464–480.
Groves RM, Dillman DA, Eltinge JL, Little RJA. Survey nonresponse in design, data collection, and analysis. In: Groves RM, Dillman DA, Eltinge JL, Little RJA, eds. Survey nonresponse. New York: John Wiley and Sons; 2002.
Holbrook AL, Green MC, Krosnick JA. Telephone versus face-to-face interviewing of national probability samples with long questionnaires: Comparisons of respondent satisficing and social desirability response bias. Public Opinion Quarterly. 2003;67(1):79–125.
Holbrook AL, Krosnick JA, Pfent A. Response rates in surveys by the news media and government contractor survey research firms. In: Lepkowski J, Harris-Kojetin B, Lavrakas PJ, eds. Advances in telephone survey methodology. New York, NY: Wiley; 2007:499–528.
Krosnick JA, Li F, Lehman DR. Conversational conventions, order of information acquisition, and the effect of base rates and individuating information on social judgments. Journal of Personality and Social Psychology. 1990;59(6):1140.
Krosnick J. Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied Cognitive Psychology. 1991;5:213–236.
Krosnick J, Narayan S, Smith W. Satisficing in surveys: Initial evidence. New Directions for Evaluation. 1996;1996(70):29–44.
Krosnick J, Fabrigar L. Designing rating scales for effective measurement in surveys. In: Survey measurement and process quality. 1997:141–164.
Krosnick J. Survey research. Annual Review of Psychology. 1999;50(1):537–567.
Krosnick JA, Tahk AM. The optimal length of rating scales to maximize reliability and validity. California: Stanford University; 2008 Unpublished manuscript.
Krosnick J, Presser S. Question and questionnaire design. In: Handbook of survey research. 2nd ed. Bingley, UK: Emerald; 2010:263–314.
Landon E. Order bias, the ideal rating, and the semantic differential. Journal of Marketing Research. 1971;8(3):375–378.
Müller H, Sedley A, Ferrall-Nunge E. Survey research in HCI. In: Ways of knowing in HCI. New York: Springer; 2014:229–266.
Saris WE, Revilla M, Krosnick JA, Shaeffer E. Comparing questions with agree/disagree response options to questions with item-specific response options. Survey Research Methods. 2010;4(1).
Sedley A, Müller H. Minimizing change aversion for the google drive launch. In: CHI’13 extended abstracts on human factors in computing systems. ACM; 2013:2351–2354.
Schlenker B, Weigold M. Goals and the self-identification process: Constructing desired identities. In: Goal concepts in personality and social psychology. 1989:243–290.
Schonlau M, Zapert K, Simon LP, Sanstad KH, Marcus SM, Adams J, et al. A comparison between responses from a propensity-weighted web survey and an identical RDD survey. Social Science Computer Review. 2004;22(1):128–138.
Smith D. Correcting for social desirability response sets in opinion-attitude survey research. The Public Opinion Quarterly. 1967;31(1):87–94.
Tourangeau R. Cognitive science and survey methods. In: Cognitive aspects of survey methodology: Building a bridge between disciplines. 1984:73–100.
Tourangeau R, Couper MP, Conrad F. Spacing, position, and order interpretive heuristics for visual features of survey questions. Public Opinion Quarterly. 2004;68(3):368–393.
Vannette DL, Krosnick JA. A comparison of survey satisficing and mindlessness. In: The Wiley Blackwell handbook of mindfulness. Wiley-Blackwell; 2014.
Villar A, Callegaro M, Yang Y. Where am I? A meta-analysis of experiments on the effects of progress indicators for web surveys. Social Science Computer Review. 2013;31(6):744–762.
Weisberg HF. The total survey error approach: A guide to the new science of survey research. Chicago: The University of Chicago Press; 2005.
Wildt AR, Mazis MB. Determinants of scale response: Label versus position. Journal of Marketing Research. 1978;15:261–267.
Yeager DS, Krosnick JA, Chang L, Javitz AS, Levendusky MS, Simpser A, et al. Comparing the accuracy of RDD telephone surveys and Internet surveys conducted with probability and non-probability samples. Public Opinion Quarterly. 2011;75(4):709–747.
Chapter 11: Card Sorting
Nielsen J, Sano D. SunWeb: User interface design for Sun Microsystems’ internal web. In: Proceedings of the 2nd world wide web conference ‘94: Mosaic and the web, Chicago, IL, 17-20 October; 1994:547–557. Available at http://archive.ncsa.uiuc.edu/SDG/IT94/Proceedings/HCI/nielsen/sunweb.html.
Nielsen Norman Group. Intranet design annual: 2013. Nielsen Norman Group; 2014. Retrieved March 27, 2014.
Spencer D. Card sorting: Designing usable categories. Brooklyn, NY: Rosenfeld Media; 2009:82.
Spencer D. A practical guide to information architecture. Penarth, UK: Five Simple Steps; 2010.
Tullis TS. Designing a menu-based interface to an operating system. In: CHI ‘85 proceedings, San Francisco, CA; 1985:79–84.
Tullis T, Wood L. How many users are enough for a card-sorting study? In: Proceedings of the Usability Professionals’ Association 2004 conference, Minneapolis, MN, 7-11 June (CD-ROM); 2004.
Zavod MJ, Rickert DE, Brown SH. The automated card-sort as an interface design tool: A comparison of products. In: Proceedings of the Human Factors and Ergonomics Society 46th annual meeting, Baltimore, MD, 30 September-4 October; 2002:646–650.
Chapter 12: Focus Groups
Dolan W, Wiklund M, Logan R, Augaitis S. Participatory design shapes future of telephone handsets. In: Proceedings of the Human Factors and Ergonomics Society 39th annual meeting. San Diego, CA, 9–13 October; 1995:331–335.
Dumas JS, Redish JC. A practical guide to usability testing. 2nd ed. Exeter, England: Intellect Books; 1999.
Gray BG, Barfield W, Haselkorn M, Spyridakis J, Conquest L. The design of a graphics-based traffic information system based on user requirements. In: Proceedings of the Human Factors and Ergonomics Society 34th annual meeting. Orlando, FL, 8–12 October; 1990:603–606.
Hackos JT, Redish JC. User and task analysis for interface design. New York: John Wiley & Sons; 1998.
Karlin JE, Klemmer ET. An interview. In: Klemmer ET, ed. Ergonomics: Harness the power of human factors in your business. Norwood, NJ: Ablex; 1989:197–201.
Kelly T. The art of innovation. New York: Doubleday; 2001.
Krueger R. Developing questions for focus groups. Thousand Oaks, CA: Sage Publications; 1998.
Krueger R, Casey MA. Focus groups: A practical guide for applied research. London: Sage Publications; 2000.
Root RW, Draper S. Questionnaires as a software evaluation tool. In: Proceedings of the ACM CHI conference. Boston, MA, 12–15 December; 1983:83–87.
Sato S, Salvador T. Playacting and focus troupe: Theater techniques for creating quick, intense, immersive, and engaging focus group sessions. Interactions. 1999;6(5):35–41.
Schindler RM. The real lesson of new coke: The value of focus groups for predicting the effects of social influence. Marketing Research: A Magazine of Management and Applications. 1992;4:22–27.
Chapter 13: Field Studies
Beyer H, Holtzblatt K. Contextual design: Defining customer-centered systems. San Francisco: Morgan Kaufmann; 1998.
Brooke T, Burrell J. From ethnography to design in a vineyard. In: DUX 2003 Proceedings, San Francisco, CA; 2003:1–4. http://www.aiga.org/resources/content/9/7/8/documents/brooke.pdf.
Creswell JW. Research design: Qualitative, quantitative and mixed methods approaches. 2nd ed. Thousand Oaks, CA: Sage Publications; 2003.
Dumas JS, Salzman MC. Usability assessment methods. Reviews of Human Factors and Ergonomics. 2006;2:109–140.
Hackos JT, Redish JC. User and task analysis for interface design. New York: John Wiley & Sons; 1998.
Kirah A, Fuson C, Grudin J, Feldman E. Usability assessment methods. In: Bias RG, Mayhew DJ, eds. Cost-justifying usability: An update for the Internet age. San Francisco, CA: Morgan Kaufmann; 2005.
Laakso SA, Laakso K, Page C. DUO: A discount observation method. 2001. [Webpage] Retrieved from www.cs.helsinki.fi/u/salaakso/papers/DUO.pdf.
Landsberger HA. Hawthorne revisited. Ithaca, NY: Cornell University; 1958.
Ramey J, Rowberg AH, Robinson C. Adaptation of an ethnographic method for investigation of the task domain in diagnostic radiology. In: Wixon DR, Ramey J, eds. Field methods casebook for software design. New York: John Wiley & Sons; 1996:1–15.
Strauss AL, Corbin J. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Newbury Park, CA: Sage Publications; 1990.
Teague R, Bell G. Getting Out of the Box. Ethnography meets life: Applying anthropological techniques to experience research. In: Proceedings of the Usability Professionals’ Association 2001 Conference, Las Vegas, NV (Tutorial); 2001.
Chapter 14: Evaluation Methods
Benedek J, Miner T. Measuring desirability: New methods for evaluating desirability in a usability lab setting. In: Proceedings of the Usability Professionals’ Association 2002 conference; 2002:8–12.
Borsci S, Macredie RD, Barnett J, Martin J, Kuljis J, Young T. Reviewing and extending the five-user assumption: A grounded procedure for interaction evaluation. ACM Transactions on Computer-Human Interaction. 2013;20(5):29. doi:10.1145/2506210.
Jacobsen NE, John BE. Two case studies in using cognitive walkthrough for interface evaluation. Pittsburgh, PA: Carnegie Mellon University, School of Computer Science; 2000. Technical Report No. CMU-CS-00-132.
Kim B, Dong Y, Kim S, Lee KP. Development of integrated analysis system and tool of perception, recognition, and behavior for web usability test: With emphasis on eye-tracking, mouse-tracking, and retrospective think aloud. In: Usability and internationalization. HCI and culture. Berlin, Heidelberg: Springer; 2007:113–121.
Lewis C, Polson P, Wharton C, Rieman J. Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. In: CHI ’90 Proceedings; ACM; 1990:235–242.
Medlock MC, Wixon D, Terrano M, Romero R, Fulton B. Using the RITE method to improve products: A definition and a case study. Usability Professionals Association; 2002.
Nielsen J. Usability engineering at a discount. In: Proceedings of the third international conference on human-computer interaction on designing and using human-computer interfaces and knowledge based systems. Elsevier Science; 1989.
Nielsen J. Heuristic evaluation. In: Nielsen J, Mack RL, eds. Usability inspection methods. New York, NY: John Wiley & Sons; 1994.
Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. In: Proceedings of the INTERACT’93 and CHI’93 conference on human factors in computing systems, ACM; 1993:206–213.
Nielsen J, Molich R. Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI conference on human factors in computing systems, ACM; 1990:249–256.
Norman DA. Emotional design: Why we love (or hate) everyday things. New York: Basic Books; 2004.
Polson PG, Lewis C, Rieman J, Wharton C. Cognitive walkthroughs: A method for theory-based evaluation of user interfaces. International Journal of Man-Machine Studies. 1992;36(5):741–773.
Rayner K. Eye movements in reading and information processing: 20 years of research. Psychological Bulletin. 1998;124(3):372.
Russell DM, Chi EH. Looking back: Retrospective study methods for HCI. In: Ways of knowing in HCI. New York: Springer; 2014:373–393.
Sauro J. A brief history of the magic number 5 in usability testing. 2010. Retrieved from https://www.measuringusability.com/blog/five-history.php.
Tobii Technology. Retrospective think aloud and eye tracking: Comparing the value of different cues when using the retrospective think aloud method in web usability testing. 2009. Retrieved from http://www.tobii.com/Global/Analysis/Training/WhitePapers/Tobii_RTA_and_EyeTracking_WhitePaper.pdf.
Chapter 15: Concluding
Heath C, Heath D. Made to stick. Random House; 2007.
McQuaid HL. Developing guidelines for judging the cost and benefit of fixing usability problems. In: Proceedings of the Usability Professionals’ Association 2002 conference (CD-ROM); 2002.