References

1. Adlin T, Pruitt J. The essential persona lifecycle: Your guide to building and using personas. San Francisco, CA: Morgan Kaufmann; 2010.

2. Akers D, Jeffries R, Simpson M, Winograd T. Backtracking events as indicators of usability problems in creation-oriented applications. ACM Transactions on Computer-Human Interaction. 2012;19(2): Article 16.

3. Bailey R. Heuristic evaluation. UI Design Newsletter; May 1999. Retrieved August 30, 2013, from <http://www.humanfactors.com/downloads/may99.asp>.

4. Bailey RW, Allan RW, Raiello P. Usability testing vs. heuristic evaluation: A head-to-head comparison. Proceedings of the 36th annual Human Factors Society meeting. Atlanta, GA; 1992; pp. 409–413.

5. Baker K, Greenberg S, Gutwin C. Empirical development of a heuristic evaluation methodology for shared workspace groupware. Proceedings of the 2002 ACM conference on computer supported cooperative work (CSCW’02). New York, NY: ACM; 2002; pp. 96–105.

6. Basili V, Green S, Laitenberger O, Shull F, Sorumgaard S, Zelkowitz M. The empirical investigation of perspective-based reading. Empirical Software Engineering: An International Journal. 1996;1(2):133–164.

7. Bias R. Interface—walkthroughs: Efficient collaborative testing. IEEE Software. 1991;8(5):58–59.

8. Bias RG. The pluralistic usability walkthrough: Coordinated empathies. In: Nielsen J, Mack R, eds. Usability inspection methods. New York, NY: John Wiley; 1994;63–76.

9. Bias RG, Mayhew DJ. Cost-justifying usability: An update for the internet age. 2nd ed. San Francisco, CA: Morgan Kaufmann; 2005.

10. Boehm B, Basili V. Software defect reduction top 10 list. IEEE Computer. 2001;34(1):135–137.

11. Buley L. The user experience team of one: A research and design survival guide. Brooklyn, NY: Rosenfeld Media; 2013.

12. Chattratichart J, Lindgaard G. A comparative evaluation of heuristic-based usability inspection methods. CHI’08 extended abstracts on human factors in computing systems (CHI EA’08). New York, NY: ACM; 2008; pp. 2213–2220.

13. Chisnell D, Redish G, Lee A. New heuristics for understanding older adults as web users; 2006. Retrieved August 11, 2013, from <http://www.usabilityworks.net/resources/chisnell_redish_lee_heuristics.pdf>.

14. Cockton G, Woolrych A, Hornbæk K, Frøkjær E. Inspection-based methods. In: Sears A, Jacko JA, eds. The human-computer interaction handbook: Fundamentals, evolving technologies and emerging applications. 3rd ed. Boca Raton, FL: CRC Press; 2012;1275–1293.

15. Constantine LL. Collaborative usability inspections for software. Proceedings from software development ’94. San Francisco, CA: Miller Freeman; 1994.

16. Constantine LL, Lockwood LA. Software for use: A practical guide to the models and methods of usage-centered design. New York, NY: Addison-Wesley; 1999.

17. Desurvire H. Faster, cheaper: Are usability inspection methods as effective as empirical testing? In: Nielsen J, Mack RL, eds. Usability inspection methods. New York, NY: John Wiley & Sons; 1994.

18. Desurvire H, Wiberg C. Master of the game: Assessing approachability in future game design. CHI’08 extended abstracts on human factors in computing systems (CHI EA’08). New York, NY: ACM; 2008; pp. 3177–3182.

19. Doubleday A, Ryan M, Springett M, Sutcliffe A. A comparison of usability techniques for evaluating design. In: Coles S, ed. Proceedings of the second conference on designing interactive systems: Processes, practices, methods, and techniques (DIS’97). New York, NY: ACM; 1997;101–110.

20. Dumas J, Redish J. A practical guide to usability testing. Revised ed. Exeter, UK: Intellect; 1999.

21. FAA. Formal usability evaluation; n.d. Retrieved August 13, 2013, from <http://www.hf.faa.gov/workbenchtools/default.aspx?rPage=Tooldetails&subCatId=13&toolID=78>.

22. Fagan M. Design and code inspections to reduce errors in program development. IBM Systems Journal. 1976;15(3):182–211.

23. Freedman DP, Weinberg GM. Handbook of walkthroughs, inspections, and technical reviews: Evaluating programs, projects, and products. 3rd ed. New York, NY: Dorset House Publishing; 1990.

24. Gerhardt-Powals J. Cognitive engineering principles for enhancing human-computer performance. International Journal of Human-Computer Interaction. 1996;8(2):189–221.

25. Gilb T, Graham D. Software inspection. London: Addison-Wesley Longman; 1993.

26. Grigoreanu V, Mohanna M. Informal cognitive walkthroughs (ICW): Paring down and pairing up for an agile world. Proceedings of the 2013 ACM annual conference: Human factors in computing systems (CHI’13). New York, NY: ACM Press; 2013; pp. 3093–3096.

27. Grossman T, Fitzmaurice G, Attar R. A survey of software learnability: Metrics, methodologies and guidelines. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’09). New York, NY: ACM; 2009; pp. 649–658.

28. Grudin J. The case against user interface consistency. Communications of the ACM. 1989;32(10):1164–1173.

29. Gunn C. An example of formal usability inspections in practice at Hewlett-Packard company. Proceedings of the conference on human factors in computing systems (CHI’95). Denver, CO; 1995; pp. 103–104.

30. Hartson HR, Andre TS, Williges RC. Criteria for evaluating usability evaluation methods. International Journal of Human-Computer Interaction. 2003;15(1):145–181.

31. Hartson R, Pyla P. The UX book: Process and guidelines for ensuring a quality user experience. Waltham, MA: Morgan Kaufmann; 2012.

32. Hertzum M, Jacobsen NE. The evaluator effect: A chilling fact about usability evaluation methods. International Journal of Human-Computer Interaction. 2001;13(4):421–443.

33. Hertzum M, Jacobsen NE, Molich R. Usability inspections by groups of specialists: Perceived agreement in spite of disparate observations. Extended abstracts on human factors in computing systems (CHI EA’02). New York, NY: ACM Press; 2002; pp. 662–663.

34. Hornbæk K, Frøkjaer E. Comparing usability problems and redesign proposals as input to practical systems development. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’05). New York, NY: ACM; 2005; pp. 391–400.

35. Hwang W, Salvendy G. Number of people required for usability evaluation: The 10±2 rule. Communications of the ACM. 2010;53(5):130–133.

36. IEEE Std 1028-2008. IEEE standard for software reviews. IEEE standards: Software engineering. New York, NY: The Institute of Electrical and Electronics Engineers; 2008.

37. Jacobsen NE, Hertzum M, John BE. The evaluator effect in usability studies: Problem detection and severity judgments. Proceedings of the Human Factors and Ergonomics Society 1998a;1336–1340.

38. Jacobsen NE, Hertzum M, John BE. The evaluator effect in usability tests. CHI’98 conference summary on human factors in computing systems (CHI’98). New York, NY: ACM; 1998b; pp. 255–256.

39. Jaferian P, Hawkey K, Sotirakopoulos A, Velez-Rojas M, Beznosov K. Heuristics for evaluating IT security management tools. Proceedings of the seventh symposium on usable privacy and security (SOUPS’11). New York, NY: ACM; 2011; Article 7, 20 pages.

40. Jeffries R, Miller JR, Wharton C, Uyeda KM. User interface evaluation in the real world: A comparison of four techniques. Proceedings from ACM CHI’91 conference. New Orleans, LA; 1991; pp. 119–124.

41. John BE, Packer H. Learning and using the cognitive walkthrough method: A case study approach. Proceedings of CHI’95. New York, NY: ACM Press; 1995; pp. 429–436.

42. Kahn MK, Prail A. Formal usability inspections. In: Nielsen J, Mack RL, eds. Usability inspection methods. New York, NY: John Wiley & Sons; 1994;141–171.

43. Kantner L, Shroyer R, Rosenbaum S. Structured heuristic evaluation of online documentation. Proceedings of the IEEE international professional communication conference (IPCC 2002). Portland, OR; 2002.

44. Karoulis A, Pombortsis AS. Heuristic evaluation of web-sites: The evaluators’ expertise and the appropriate criteria list. Informatics in Education. 2004;3(2):55–74.

45. Kirmani S, Rajasekarn S. Heuristic evaluation quality score (HEQS): A measure of heuristic evaluation skills. Journal of Usability Studies. 2007;2(2):61–75.

46. Kotval XP, Coyle CL, Santos PA, Vaughn H, Iden R. Heuristic evaluations at Bell Labs: Analyses of evaluator overlap and group session. Extended abstracts on human factors in computing systems (CHI EA’07). New York, NY: ACM Press; 2007; pp. 1729–1734.

47. Krug S. Rocket surgery made easy: The do-it-yourself guide to finding and fixing usability problems. Berkeley, CA: New Riders; 2009.

48. Kurosu M, Matsuura S, Sugizaki M. Categorical inspection method—structured heuristic evaluation (sHEM). IEEE international conference on systems, man, and cybernetics. Piscataway, NJ: IEEE; 1997; pp. 2613–2618.

49. Laitenberger O, Atkinson C. Generalizing perspective-based inspection to handle object-oriented development artifacts. Proceedings of the 21st international conference on software engineering (ICSE’99). New York, NY: ACM; 1999; pp. 494–503.

50. Lauesen S. User interface design: A software engineering perspective. Harlow, England: Addison-Wesley; 2005.

51. Lewis C, Polson P, Wharton C, Rieman J. Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. In: Carrasco J, Whiteside J, eds. Proceedings of ACM CHI’90: Conference on human factors in computer systems. New York, NY: ACM Press; 1990;235–242.

52. Lewis C, Wharton C. Cognitive walkthroughs. In: Helander M, Landauer TK, Prabhu P, eds. Handbook of human-computer interaction. 2nd ed. Amsterdam, the Netherlands: Elsevier Science; 1997;717–732.

53. Linderman M, Fried J. Defensive design for the web: How to improve error messages, help, forms, and other crisis points. Berkeley, CA: New Riders; 2004.

54. Mahatody T, Sagar M, Kolski C. State of the art on the cognitive walkthrough method, its variants and evolutions. International Journal of Human Computer Interaction. 2010;26(8):741–785.

55. Mankoff J, Dey AK, Hsieh G, Kientz J, Lederer S, Ames M. Heuristic evaluation of ambient displays. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’03). New York, NY: ACM; 2003; pp. 169–176.

56. Markopoulos P, Read JC, MacFarlane S, Hoysniemi J. Evaluating children’s interactive products: Principles and practices for interaction designers. Burlington, MA: Morgan Kaufmann; 2008.

57. Molich R. Comparative usability evaluation reports; 2011. Retrieved August 10, 2013, from <http://www.dialogdesign.dk/CUE-9.htm>.

58. Molich R. Usability testing myths; 2013. Retrieved July 25, 2013, from <http://www.netmagazine.com/features/usability-testing-myths>.

59. Molich R, Dumas JS. Comparative usability evaluation (CUE-4). Behaviour and Information Technology. 2008;27(3).

60. Monk A, Wright P, Haber J, Davenport L. Improving your human-computer interface: A practical technique. Prentice Hall International (UK); 1993.

61. Muller MJ, Matheson L, Page C, Gallup R. Methods & tools: Participatory heuristic evaluation. Interactions. 1998;5(5):13.

62. Nielsen J, Tahir M. Homepage usability: 50 websites deconstructed. Berkeley, CA: New Riders; 2002.

63. Nielsen J. Finding usability problems through heuristic evaluation. In: Bauersfeld P, Bennett J, Lynch G, eds. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’92). New York, NY: ACM Press; 1992;373–380.

64. Nielsen J. Usability engineering. San Francisco, CA: Morgan Kaufmann; 1993.

65. Nielsen J. Heuristic evaluation. In: Nielsen J, Mack RL, eds. Usability inspection methods. New York, NY: Wiley; 1994a.

66. Nielsen J. Enhancing the explanatory power of usability heuristics. In: Adelson B, Dumais S, Olson J, eds. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’94). New York, NY: ACM Press; 1994b;152–158.

67. Nielsen J, Mack R, eds. Usability inspection methods. New York, NY: Wiley; 1994.

68. Nielsen J, Molich R. Heuristic evaluation of user interfaces. In: Carrasco Chew J, Whiteside J, eds. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’90). New York, NY: ACM Press; 1990;249–256.

69. Novick DG. Using the cognitive walkthrough for operating procedures. Interactions. 1999;6(3):31–37.

70. Polson P, Lewis CH. Theory-based design for easily learned interfaces. Human-Computer Interaction. 1990;5:191–220.

71. Pruitt J, Adlin T. The persona lifecycle: Keeping people in mind throughout product design. San Francisco, CA: Morgan Kaufmann Publishers; 2005.

72. Purho V. Heuristic inspections for documentation: 10 recommended documentation heuristics; 2000. Retrieved August 1, 2013, from <http://www.stcsig.org/usability/newsletter/0004-docsheuristics.html>.

73. Rosenbaum S, Rohn J, Humburg J. A toolkit for strategic usability: Results from workshops, panels, and surveys. Proceedings of CHI 2000. New York, NY: ACM Press; 2000; pp. 337–344.

74. Rowley DE, Rhoades DG. The cognitive jogthrough: A fast-paced user interface evaluation procedure. Proceedings of ACM CHI’92 conference on human factors in computing systems. New York, NY: ACM Press; 1992; pp. 389–395.

75. Sawyer P, Flanders A, Wixon D. Making a difference—the impact of inspections. In: Tauber MJ, ed. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’96). New York, NY: ACM; 1996;376–382.

76. Schrage M. Serious play: How the world’s best companies simulate to innovate. Boston, MA: Harvard Business School Press; 2000.

77. Sears A. Heuristic walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interaction. 1997;9(3):213–234.

78. Sears A, Hess DJ. The effect of task description detail on evaluator performance with cognitive walkthroughs. Proceedings of ACM CHI’98: Conference on human factors in computing systems. New York, NY: ACM Press; 1998; pp. 259–260.

79. Shneiderman B. Designing the user interface: Strategies for effective human-computer interaction. Reading, MA: Addison-Wesley; 1987.

80. Shneiderman B, Plaisant C. Designing the user interface: Strategies for effective human-computer interaction. 5th ed. Boston, MA: Addison-Wesley; 2010.

81. Shull F, Rus I, Basili V. How perspective-based reading can improve requirements inspections. IEEE Computer. 2000;33(7):73–79.

82. Slavkovic A, Cross K. Novice heuristic evaluations of a complex interface. CHI’99 extended abstracts on human factors in computing systems (CHI EA’99). New York, NY: ACM; 1999; pp. 304–305.

83. Spencer R. The streamlined cognitive walkthrough method, working around social constraints encountered in a software development company. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’00). New York, NY: ACM Press; 2000; pp. 353–359.

84. Stacy W, MacMillan J. Cognitive bias in software engineering. Communications of the ACM. 1995;38(6):57–63.

85. Stone D, Jarrett C, Woodroffe M, Minocha S. User interface design and evaluation. San Francisco, CA: Morgan Kaufmann; 2005.

86. Tsui KM, Abu-Zahra K, Casipe R, M’Sadoques J, Drury JL. Developing heuristics for assistive robotics. Proceedings of the fifth ACM/IEEE international conference on human-robot interaction (HRI’10). Piscataway, NJ: IEEE Press; 2010; pp. 193–194.

87. Usability Body of Knowledge. Pluralistic usability walkthrough; n.d. Retrieved October 29, 2013, from <http://www.usabilitybok.org/pluralistic-walkthrough>.

88. Usability First Glossary; n.d. Retrieved June 14, 2004, from <http://www.usabilityfirst.com/glossary/term_306.txl>.

89. Virzi R. Usability inspection methods. In: Helander MG, Landauer TK, Prabhu PV, eds. Handbook of human-computer interaction. 2nd ed. Amsterdam, the Netherlands: Elsevier Science; 1997.

90. Weinschenk S, Barker D. Designing effective speech interfaces. New York, NY: Wiley; 2000.

91. Wharton C, Bradford J, Jeffries J, Franzke M. Applying cognitive walkthroughs to more complex user interfaces: Experiences, issues and recommendations. In: Bauersfeld P, Bennett J, Lynch G, eds. Proceedings of the SIGCHI conference on human factors in computing systems (CHI’92). New York, NY: ACM; 1992;381–388.

92. Wharton C, Rieman J, Lewis C, Polson P. The cognitive walkthrough: A practitioner’s guide. In: Nielsen J, Mack RL, eds. Usability inspection methods. New York, NY: Wiley; 1994;105–140.

93. Wiegers KE. Peer reviews in software: A practical guide. Boston, MA: Addison-Wesley; 2002.

94. Wilson C. The consistency conundrum; 2009. Retrieved July 26, 2013, from <http://dux.typepad.com/dux/2009/03/the-consistency-conundrum.html>.

95. Wilson C. Perspective-based inspection; 2011. Retrieved July 26, 2013, from <http://dux.typepad.com/dux/2011/03/method-10-of-100-perspective-based-inspection.html>.

96. Wilson CE. Triangulation: The explicit use of multiple methods, measures, and approaches for determining core issues in product development. Interactions. 2006;13(6):46-ff.

97. Wilson CE. The problem with usability problems: Context is critical. Interactions. 2007;14(5):46-ff.

98. Wixon D, Wilson CE. The usability engineering framework for product design and evaluation. Handbook of human-computer interaction. 2nd ed. Amsterdam, the Netherlands: Elsevier; 1998; pp. 653–688.

99. Yehuda H, McGinn J. Coming to terms: Comparing and combining the results of multiple evaluators performing heuristic evaluation. CHI’07: Extended abstracts on human factors in computing systems. New York, NY: ACM Press; 2007; pp. 1899–1904.

100. Zhang Z, Basili V, Shneiderman B. Perspective-based usability inspection: An empirical validation of efficiency. Empirical Software Engineering. 1999;4(1):43–69.
