Book Description

Human error is cited over and over as a cause of incidents and accidents. The result is a widespread perception of a 'human error problem', and solutions are thought to lie in changing the people or their role in the system: reduce the human role through more automation, or regiment human behavior through stricter monitoring, rules, or procedures. In practice, things have proved not to be this simple. The label 'human error' is prejudicial and hides much more than it reveals about how a system functions or malfunctions.

This book takes you behind the human error label. Divided into five parts, it begins by summarising the most significant research results. Part 2 explores how systems thinking has radically changed our understanding of how accidents occur. Part 3 explains the role of cognitive system factors in operating safely at the sharp end of systems: bringing knowledge to bear, changing mindset as situations and priorities change, and managing goal conflicts. Part 4 studies how the clumsy use of computer technology can increase the potential for erroneous actions and assessments across many fields of practice. And Part 5 tells how hindsight bias always enters into attributions of error, so that what we label 'human error' is actually the result of a social and psychological judgment process in which stakeholders single out one facet of a set of interacting contributors.

If you think you have a human error problem, recognize that the label itself is no explanation and no guide to countermeasures. The potential for constructive change, for progress on safety, lies behind the human error label.

Table of Contents

  1. Cover Page
  2. Title Page
  3. Half Title Page
  4. Copyright Page
  5. Contents
  6. List of Figures
  7. List of Tables
  8. Acknowledgments
  9. Reviews for Behind Human Error, Second Edition
  10. About the Authors
  11. Preface
  12. Part I An Introduction to the Second Story
  13. 1 The Problem with “Human Error”
  14. 2 Basic Premises
  15. Part II Complex Systems Failure
  16. 3 Linear and Latent Failure Models
  17. 4 Complexity, Control and Sociological Models
  18. 5 Resilience Engineering
  19. Part III Operating at the Sharp End
  20. 6 Bringing Knowledge to Bear in Context
  21. 7 Mindset
  22. 8 Goal Conflicts
  23. Part IV How Design Can Induce Error
  24. 9 Clumsy Use of Technology
  25. 10 How Computer-Based Artifacts Shape Cognition and Collaboration
  26. 11 Mode Error in Supervisory Control
  27. 12 How Practitioners Adapt to Clumsy Technology
  28. Part V Reactions to Failure
  29. 13 Hindsight Bias
  30. 14 Error as Information
  31. 15 Balancing Accountability and Learning
  32. 16 Summing Up: How to Go Behind the Label "Human Error"
  33. References
  34. Index