Preface

This book grew out of a need for a different kind of textbook. In 1994, Rosson developed an undergraduate course in human-computer interaction at Virginia Tech. The course was intended chiefly for computer science undergraduates, though from the start there was considerable interest from students in many other departments. It was originally created as a technical elective for students with specialized interests. But it has become quite popular; in 2000–2001, about 200 students took the course at Virginia Tech.

The course was designed to be project based. For most students, this course provides their only exposure to HCI, which made it important to integrate concepts and applications in requirements, design, and evaluation of interactive systems. Existing textbooks provide sound coverage of HCI concepts and techniques but offer little guidance for comprehensive semester-long project activities. As we developed and refined the necessary project materials, the specifications became more and more unwieldy and distinct from the text that students were required to read. We needed a single book that integrated key HCI concepts and techniques into an overarching framework for the development of interactive systems.

How This Book Is Different

This book differs in several important ways from existing HCI textbooks (such as Shneiderman’s Designing the User Interface or Preece et al.’s Human-Computer Interaction). Our coverage of traditional HCI content is deliberately minimalist. We provide a broad view, but we do not attempt to be comprehensive. Instead we present material that we believe is either central to a general appreciation of human needs and preferences, or that provides crucial support for the analysis, design, and evaluation of effective interactive systems. For example, the book contains more content concerning requirements analysis, prototyping, and documentation design than is typical of textbooks in this area. But it also contains fewer pages on human perception and cognition.

The concepts and techniques of HCI are organized and presented through a series of tradeoffs. We use this rhetorical device to emphasize that there is never a single answer in the design and development of interactive computer systems. We wish to make it very clear from the start that students must think and reason about user needs. HCI guidelines are useful, but only in the hands of experts who know how to interpret and apply them to many different situations. Introducing HCI material as tradeoffs increases the level of abstraction; students who are hoping for simple answers may find this disconcerting, but it accurately reflects the state of knowledge about humans and their interaction needs.

The HCI content is integrated into a usability engineering framework on a chapter-by-chapter basis. We raise HCI issues and concerns as they normally would be encountered during project development. To some extent, this organization is artificial: it implies a waterfall process (from requirements to design to evaluation) that does not take place in practice. But we feel that it is important for students in a project-based course to see immediately where and how the HCI issues apply to the analysis and design of interactive systems. The segmentation into chapters is necessary for pedagogical reasons.

The usability engineering framework is founded on the use of scenarios as a central representation for the analysis and design of use. A scenario describes an existing or envisioned system from the perspective of one or more users and includes a narration of their goals, plans, and reactions. Other usability engineering frameworks (e.g., Mayhew, 1999) make use of scenarios, but do not use them in the central and systematic way described in this book. The work with scenarios is complemented by many examples of claims analysis, a technique we have developed for documenting and reasoning about the pros and cons of design features.

Almost half of the book is devoted to a single, cumulative design case study. We use this example (a virtual science fair) to introduce and illustrate the scenario-based methods for requirements analysis, design, and evaluation. We have learned from experience that HCI students learn well from examples. We assume that students’ class projects will be cumulative; if so, the case study will be useful as a model. A large set of supporting materials (interview guides, testing materials, etc.) is included, because we have found that students new to HCI need considerable guidance in behavioral methods. This particular design case was selected to be simple and familiar, but also quite open-ended with respect to requirements analysis and the new tasks and interactions it can motivate. It is presented as an example application developed within a larger community network project, the MOOsburg system (moosburg.cs.vt.edu). Additional real-world case studies will be available through the textbook Web site (www.mkp.com/ue-sbd).

Our minimalist presentation of established HCI concepts and techniques is complemented at times with modules reporting on both current and classic research studies, design methods, or other topics of interest. These inserts describe HCI activities or concerns that are interesting and relevant, but optional with respect to the main goals of the chapters. We expect to replace or extend these modules in future editions, with the hope of better matching the rapid pace of information technology development.

How To Use This Book

This book was designed for a one-semester course introducing HCI concepts and methods. We are assuming 14 weeks; the book has 10 chapters. This allows one week for most chapters, with more time spent on chapters of most interest to the instructor, and with time left for exams and reviews. The central material is presented in Chapters 1–8, so some instructors may choose to cover only these chapters. In our own course, the material related to design and evaluation is essential, so we devote extra time to this. A suggested 14-week schedule might be:

Week 1 Course overview and SBD introduction (Chapter 1)
Week 2 Requirements analysis (Chapter 2)
Week 3 Activity design (Chapter 3)
Weeks 4–5 Information design (Chapter 4)
Weeks 6–7 Interaction design (Chapter 5)
Week 8 Review and discussion, midterm exam
Week 9 Prototyping (Chapter 6)
Weeks 10–11 Usability evaluation (Chapter 7)
Week 12 Documentation (Chapter 8)
Week 13 Emerging interaction paradigms (Chapter 9)
Week 14 Usability in the real world, review, and discussion (Chapter 10)

The term project should be organized and initiated as soon as possible. We have found that three phases provide a good organization for the project: requirements analysis, design and prototype, and formative evaluation. Students need 3–4 weeks for each of these segments; for example, the first phase might be due during week 5, the second phase during week 9, and the final phase during week 14. The Project Ideas section at the end of each chapter describes portions of an online shopping project that has worked well in our classes; students have enough familiarity with shopping and with Web applications to make progress on this in the short amount of time available. More details regarding project specifications and evaluation criteria are available on the companion Web site. The Web site also contains additional case study materials developed with the support of NSF’s program in undergraduate education (NSF DUE-0088396).

Occasional homework or in-class exercises provide a more focused complement to the semester-long projects. The exercises at the end of each chapter are designed to get students thinking about the concepts and methods presented in the chapter. They can be used either for in-class group work and discussion or as homework problems. Sample answers will be made available to instructors via the book Web site.

In addition to a complete list of references, the book includes a glossary and an appendix that students and instructors may find useful. The glossary defines key HCI concepts; each glossary term also appears in bold where it is defined in the text. The appendix contains a brief introduction to the inferential statistics used in traditional behavioral experiments. These methods are rare in usability engineering practice, so we elected to leave them out of the main textbook content. However, instructors who require their students to carry out formal usability studies should find this material useful.

Acknowledgements

Many people have contributed to the development of this book. We have been much influenced by the other textbooks we have used in our classes, particularly the books by Preece et al. (1994) and by Shneiderman (1998). We have also benefited from our colleague Rex Hartson, who has spent years developing a usability engineering course at the graduate level. His project-based approach to teaching HCI had a formative effect on Rosson’s undergraduate course and subsequently this book.

Some of our time developing the book was spent on sabbatical at the Xerox Research Centre in Cambridge, England. We thank our colleagues there for their support, particularly Allan MacLean, who hosted our visit. While there, we had the good fortune to be located near a group of researchers working on mobile and wireless computing, and we thank Mik Lamming, Marge Eldridge, and Mike Flynn, for introducing us to many of the current issues in this area of HCI.

Our development of MOOsburg, the virtual science fair case study, and of scenario-based methods in general has been supported by a number of agencies. We are grateful to NSF (REC-9554206), the Hitachi Foundation, and the Office of Naval Research (N00014-00-1-0549). Many research associates and students have contributed to these projects over the last few years: Philip Isenhour developed CORK, the software architecture that is used in MOOsburg, as well as many of the basic MOOsburg tools; Wendy Schafer developed the map navigation tool; Stuart Laughton, George Chin, Jurgen Koenemann, Dennis Neale, and Dan Dunlap helped us refine our participatory and scenario-based methods; and Christina van Metre and Robert Zinger contributed to some of the early analysis and design work on the virtual science fair. Jennifer Thompson, Wes Lloyd, Vinoth Jagannathan, and Jiunwei Chen worked on the Web-based case study library and browsing tools.

We would also like to thank our many reviewers for their extensive and very constructive comments. We are particularly indebted to John Bennett, Andrew Dillon, Sidney Fels, Doug Gillan, Jeff Johnson, John Meads, Kevin Mullet, and Bonnie Nardi. And of course, we have benefited throughout from the guidance and encouragement of our editors at Morgan Kaufmann: Diane Cerra, Belinda Breyer, and Howard Severson. Lastly we thank our daughter Erin, who has stepped up to many new responsibilities while Mom and Dad were busy working, and our dog Kerby, who in classic yellow lab fashion has happily ignored and energetically diverted much of the stress that comes with a writing project such as this.
