PART IV
HOW DESIGN CAN INDUCE ERROR

This part of the book describes several classic deficiencies in computerized devices and how they negatively influence practitioner cognition and collaboration. Device characteristics that shape cognition and collaboration in ways that increase the potential for error are one type of problem that can contribute to incidents. The presence of these characteristics, in effect, represents a failure of design in terms of operability (i.e., a kind of design “error”). We will show why these device characteristics are deficiencies, and how the failure to design for effective human-computer cooperation increases the risk of bad outcomes.

The first chapter of this part deals with what we have called clumsy automation and one of its results: automation surprises. Automation surprises are situations in which crews are surprised by actions taken (or not taken) by the automated system; we draw examples from both advanced flight decks and the operating theater. Automation surprises begin with miscommunications and misassessments between the automation and its users, which create a gap between what the automated systems are set up to do, what they are doing, and what they are going to do, on the one hand, and the user’s understanding of these things, on the other. The initial trigger for such a mismatch can arise from several sources: erroneous inputs such as mode errors, or indirect mode changes in which the system autonomously changes its status and behavior based on its interpretation of pilot inputs, its internal logic, or sensed environmental conditions. The gap results in people being surprised when the system’s behavior does not match their expectations, which can have detrimental consequences in safety-critical settings.

The second chapter of this part attempts to map in more detail how computer-based artifacts shape people’s cognition and collaboration. How a problem is represented influences the cognitive work needed to solve it, improving or degrading performance. The chapter traces how technology shapes people’s cognition, how that cognition shapes their behavior in operational settings, and how such behavior can contribute to the evolution of an incident. Computers have a profound impact here: a fundamental property of the computer as a medium for representation is its freedom from the physical constraints that act on the real-world objects and systems to which the representation (the things on a computer screen) refers. Such virtuality carries a number of penalties, including users getting lost in display page architectures and the hiding of interesting changes, events, and system behaviors.

The third chapter is dedicated to one of the most vexing problems in human-computer interaction: mode errors. These occur when an intention is executed in a way appropriate for one mode when the system is, in fact, in a different mode. The complexity of modes, interactions across modes, and indirect mode changes create new paths for errors and failures. Modes are no longer selected and activated only through deliberate, explicit actions; rather, they can change as a side effect of other practitioner actions or inputs, depending on the system status at the time. The active mode that results may be inappropriate for the context, and detection and recovery can be very difficult, in part because of long time-constant feedback loops.
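As a deliberately simplified illustration of this mechanism, the sketch below models a hypothetical two-mode device in which the same keypress is interpreted differently depending on the active mode, and in which that mode can change indirectly as a side effect of the system’s own logic. The device, its mode names, and its numbers are invented for illustration; they are not drawn from the incidents discussed in the chapter.

    # Toy sketch of a mode error compounded by an indirect mode change.
    # The device, mode names, and values are hypothetical.

    class TwoModeDevice:
        """A device whose '+' key means different things in different modes."""

        def __init__(self):
            self.mode = "RATE"      # active mode; often not salient to the user
            self.rate = 50          # e.g., an infusion rate
            self.volume = 100       # e.g., a volume to be infused

        def on_internal_event(self):
            # Indirect mode change: the system switches modes based on its own
            # logic, without an explicit operator command.
            self.mode = "VOLUME"

        def press_plus(self):
            # The same keypress is interpreted according to the active mode.
            if self.mode == "RATE":
                self.rate += 10
            else:
                self.volume += 10


    device = TwoModeDevice()
    device.on_internal_event()   # mode changes as a side effect, unnoticed
    device.press_plus()          # intention: raise the rate; effect: raise the volume
    print(device.mode, device.rate, device.volume)   # -> VOLUME 50 110

Running the sketch prints VOLUME 50 110: the operator’s intended increase to the rate has silently been applied to the volume instead, which is the signature of a mode error whose detection depends on noticing a mode change the operator never commanded.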

This chapter also discusses one consequence of mode confusion: the subsequent going-sour scenario that seems basic to a number of incidents in complex, highly computerized systems. In this scenario, minor disturbances, misactions, miscommunications and miscoordinations collectively manage a system into hazard despite multiple opportunities (certainly in hindsight) to detect that the situation is headed for a negative outcome.

The final chapter of this part considers research results on how people adapt to new technology. We identify several types of practitioner adaptation to the impact of new information technology. In system tailoring, practitioners adapt the device and the context of activity to preserve existing strategies for carrying out tasks (e.g., adaptation focuses on device set-up, device configuration, and how the device is situated in the larger context). In task tailoring, practitioners adapt their strategies, especially cognitive and collaborative strategies, for carrying out tasks to accommodate constraints imposed by the new technology. Such user adaptations (or user tailoring) can be brittle: they work well in a narrow range of routine situations but are quite vulnerable when operational conditions push the user off a familiar pathway. The part closes with a discussion of use-centered design.
