“Complacency is the main cause of accidents,” said Tracy during a leadership briefing on how accidents happen. “But what is complacency?” she asked the team.
“Workers accept risks they shouldn’t accept and become numb to hazards,” said the regional manager.
“They are satisfied with their own performance and confident they can do the job, but they lose their chronic unease,” added another person.
“It’s a silent killer. When people are not paying attention, their mind is not on the task, which puts them in the line of fire. You go on autopilot and you get killed,” concluded the operations director.
“That’s right. Can you think of a single situation where you got hurt while you were thinking about what you were doing?” Tracy asked the audience. “Probably not. Paying attention is the best way to prevent accidents,” she summarised, truly believing that maintaining conscious attention to everything in your surroundings is realistic, and that people simply don’t try hard enough.
“The term ‘complacency’ originated in the aviation community when pilots or air traffic controllers did not conduct sufficient checks and assumed ‘all was well’ when in fact a dangerous condition was developing that led to the accident,” says human factors professor Raja Parasuraman of George Mason University [1]. The term was even included in NASA’s accident reporting system in the 1970s.
However, since then, the aviation industry has recognised that using this term is rather problematic. EUROCONTROL [2] – an international organisation working to achieve safe and seamless air traffic management in Europe – which also advises airlines on Human Performance matters, says that the term “complacency”:
- Lacks a common definition and is used mainly by non-experts to label a broad range of issues such as: overconfidence, self-satisfaction, reduced awareness of danger, confidence plus contentment, a low index of suspicion, loss of situation awareness, or unpreparedness to react in a timely manner when a system fails. It is used to refer to individual behaviour as well as to whole organisations.
- Is difficult to prove right or wrong because it’s so vague.
- Easily leads to overgeneralisations describing many different psychological and social mechanisms.
- Is used as a label replacing other labels such as “human error” or “loss of situational awareness”, but still does not explain anything.
- Is judgemental and points to a fault of character rather than to situational contributors, which prevents effective learning.
Despite these challenges, Professor Parasuraman managed to extract some common characteristics of “complacency”:
- A human operator is monitoring something, for example: a control panel on a rig, or hand positioning during a maintenance task.
- The frequency of monitoring is lower than some optimal value.
- There is an undesired outcome.
But there are problems even with this description. To claim somebody was complacent, we need to show this person was monitoring less often than some optimal value, taking into account all other things they had to pay attention to at the time. But what is this optimal value, who should determine it and how?
“Complacency” is also only used when something goes wrong. After an event, it is easy to say in hindsight, “if only you had paid attention to X” (see the article explaining how hindsight bias distorts your memory and judgement LINK: https://bit.ly/3bSH3IG ).
Leaders would like workers to be aware of ‘anything in the environment which might be of importance if it changed unexpectedly’. But how realistic is this wish?
Let’s start with drillers on drilling rigs. They have to deal with hundreds of data points from multiple computer displays, analogue gauges, and other team members who themselves draw on many types of data sources. The number of things that can go wrong, and the combinations in which they can go wrong, exceed human capacity to analyse on the spot.
“What appears to be ‘complacency’ may be a rational strategy,” notes another study [3]. When a driller has to monitor (i.e. look at) and comprehend a large number of data sources, it is rational to reduce monitoring of highly reliable sources and focus on those more likely to require an intervention. In fact, monitoring reliable sources may increase the likelihood of failure: it takes time and attention away from the sources that are more likely to be problematic, making it more probable that the driller will miss a critical piece of information from an unreliable source.
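A toy calculation makes the point. This is a minimal sketch under invented assumptions: the failure rates, the check budget and the crude “half a check interval” estimate are made up for illustration, not taken from any real rig or from the cited study.

```python
# Illustrative only: the failure rates and check budget below are invented,
# not data from any real drilling operation.

def expected_undetected_time(problems_per_hour, checks_per_hour):
    """Rough estimate of undetected problem-time per hour: a problem that
    appears between checks goes unnoticed for about half a check interval."""
    interval = 1.0 / checks_per_hour           # hours between looks at this source
    return problems_per_hour * (interval / 2)  # problem-hours left undetected

BUDGET = 12            # total checks per hour the operator can realistically make
RELIABLE_RATE = 0.01   # assumed problems per hour on the highly reliable source
UNRELIABLE_RATE = 0.5  # assumed problems per hour on the failure-prone source

for reliable_checks in (10, 6, 2):             # checks given to the reliable source
    unreliable_checks = BUDGET - reliable_checks
    total = (expected_undetected_time(RELIABLE_RATE, reliable_checks)
             + expected_undetected_time(UNRELIABLE_RATE, unreliable_checks))
    print(f"reliable: {reliable_checks:>2} checks/h, unreliable: {unreliable_checks:>2} checks/h "
          f"-> expected undetected problem-hours: {total:.3f}")
```

Under these made-up numbers, shifting attention away from the reliable source and towards the unreliable one reduces the time a real problem sits unnoticed, which is exactly the trade-off the study describes.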
We also have to distinguish between monitoring and diagnosis. Imagine a driller has 20 gauges to pay attention to. Let’s say we want them to check the gauges one by one in a loop, looking at each for 30 seconds; a full sweep then takes ten minutes, so each gauge is revisited once every ten minutes. This is monitoring frequency.
However, even if the driller follows this monitoring frequency, it doesn’t mean they will be able to instantly diagnose a problem. If one of the gauges is approaching an upper limit and occasionally exceeds it, should the driller shut in the well? There is always a level of uncertainty, and the driller will try to make sense of the data:
- It may be a sensor issue
- It may be a software issue
- It may be a geological anomaly that is not relevant
- It may indicate a problem only if other parameters change
- The well may actually be flowing and require immediate intervention
The driller will do their best to interpret the data in the context of their situation. For example, if there is a history of faulty sensors, the data cannot be fully trusted and is more likely to be interpreted with caution.
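To make that weighing concrete, here is a minimal sketch of how such a judgement might be structured. The rules, parameter names and wording are invented for illustration; they are not a real well-control procedure.

```python
# Hypothetical example: the rules below are invented to illustrate how one
# reading is weighed against its context, not a real well-control procedure.

def interpret_reading(value, upper_limit, sensor_fault_history, corroborating_signals):
    """Suggest how a driller might read a gauge that is near or above its limit."""
    if value <= upper_limit:
        return "within limits - keep monitoring"
    if corroborating_signals:
        # Other parameters moving in the same direction changes the picture entirely.
        return "possible well flow - escalate and prepare to shut in"
    if sensor_fault_history:
        # A sensor with a history of spurious readings, and nothing else moving:
        # verification is the rational first step, not shutting in the well.
        return "likely sensor or software issue - verify the reading, raise a check"
    return "ambiguous - increase monitoring frequency and cross-check other sources"

# Example: the reading exceeds its limit, the sensor has been flaky before,
# and the other parameters are steady.
print(interpret_reading(value=105, upper_limit=100,
                        sensor_fault_history=True,
                        corroborating_signals=False))
```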
Professor Erik Hollnagel adds the following: “People are greatly helped by the regularity or stability of their work environment and, indeed, the regularity of the world at large. If work environments were continually changing they would lack the predictability that makes it possible to take shortcuts and learn how things can be done in a more efficient manner. Yet it is precisely because work environments – by design or by nature – have some measure of regularity or stability that they are predictable, and therefore allow performance to be optimised.
“Instead of checking every possible condition, efforts can be saved to check conditions that are known to vary across situations, or conditions that are seen as being more important.
“Human performance is efficient because people quickly learn to disregard those aspects or conditions that normally are insignificant and it is the norm rather than the exception” [4].
The problem of people not paying attention is not new at all. The first attempts to understand why people were not paying attention can be traced back to military research during WW2, when researchers were trying to determine why radar operators were missing signals of enemy submarines [5]. This and many subsequent studies showed that the ability to sustain attention declines rapidly during the first 15–30 minutes of a monitoring task. Why?
Mental work requires physical energy. “The human brain is a physical organ and, like all other organs, it requires energy to perform work. On a second-by-second basis, the human brain uses more energy at rest than a human thigh during a marathon [6]. While only accounting for 2% of total body mass, the adult human brain uses roughly 20% of daily caloric intake. That intake comes from only one source: glucose – a simple sugar [7] which must be constantly supplied from the blood” – say researchers from the Universities of Alberta (Canada) and Albany (USA).
Paying conscious attention increases glucose metabolism, and your ability to pay attention gets worse as glucose is depleted. To optimise the use of this limited energy, our attention – which may be compared to a spotlight in the darkness – continually adapts: it focuses on what is important, filters out what is not relevant, and automates frequently repeated behaviour (you may hear people say they were “on autopilot”).
Figure 1. The spotlight metaphor describes the narrow focus of our attention at any given moment
This ability to filter out repetitive stimuli is called habituation, and it is deeply ingrained in our biology. It has been observed in organisms from amoebae to meerkats and chimpanzees, and in human infants and adults [8][9]. You experience it on a regular basis.
Imagine you are staying at a hotel next to a busy train line. Every time a train passes, you hear it loudly and feel a slight vibration in the building. The first time you heard it, you were startled, stopped what you were doing and listened attentively, picking up lots of details (high use of glucose). But as you started unpacking and getting ready for bed, you realised you had stopped noticing the nuisance. In other words, “you got used to it” – a common phrase describing habituation.
What would happen if your body did not adapt, and instead gave you a startle reaction each of the ten times an hour a train passed? What would be the impact on your tiredness, your ability to focus on other things you need to do, and your sleep?
You would not be able to do anything else, as your attention would be constantly drawn to that particular event. It would paralyse your ability to function normally and exhaust you. In fact, an inability to filter out stimuli is a symptom of a number of mental health disorders [10].
Habituation is a well-understood effect. If we see or hear the same signal again and again but nothing happens to us as a result, we stop noticing it [11].
One study looked at brain activity to see what happens when people are repeatedly shown the same computer security warnings: neural activity decreased sharply and very quickly [12]. Habituation makes warnings that were once highly visible virtually unnoticeable.
Let’s consider another example: transferring fuel from a truck to a tank. If it goes wrong, there is potential for a spill and fire. The driver uses a hose with an automatic shut-off valve. The job is to connect the hose, start the flow and wait.
The driver is expected to stand there for an hour and watch the gauge and valve intently. The process is seamless: the gauge shows the same value all the time and the shut-off valve works reliably every time. What do you think is really likely to happen?
If the process is highly reliable, and the driver has multiple other tasks – filling out paperwork, planning the next delivery, checking other things in the vehicle – what is the rational thing for the driver to do to make the best use of their time and energy? When something suddenly goes wrong, it is easy for an external observer to say “you should have done X” – see hindsight again.
As Daniel Kahneman, professor emeritus of psychology at Princeton University and a Nobel Prize winner for his work on behavioural economics, states in his book Thinking, Fast and Slow: “As you become skilled in a task, its demand for energy diminishes. A general ‘law of least effort’ applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. Effort is a cost” [13]. Paying conscious attention is effortful and costs energy (glucose).
Interestingly, experience and practice do not mitigate this effect [1]. If you assign training as a corrective action for habituation, you will not prevent the problem from occurring again.
So what can we do about it?
Habituation takes place when a person is exposed to the same stimulus over time. This is unavoidable; it should be expected and managed accordingly. Reminders to stay vigilant are simply not effective. We need to change the task rather than reprimand the person.
We can disrupt habituation by changing the order and format of the signal. For example, computer users who received varying forms of warnings not to open email attachments (e.g. different sizes, fonts and colours) complied with the warnings more often than a group that received exactly the same warning repeatedly [15].
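A minimal sketch of that idea is below: the same warning text, presented in a different style each time. The styles, message and function name are placeholders chosen for illustration; they are not the dialogs used in the cited study.

```python
# Illustrative sketch: vary the presentation of the same warning each time so
# it does not fade into the background. Styles and wording are placeholders,
# not the dialogs used in the cited study.
import random

STYLES = [
    {"size": "large",  "colour": "red",    "border": "thick"},
    {"size": "medium", "colour": "orange", "border": "dashed"},
    {"size": "large",  "colour": "yellow", "border": "double"},
]

def show_attachment_warning(filename):
    """Render the same warning text with a randomly chosen presentation."""
    style = random.choice(STYLES)
    print(f"({style['size']}, {style['colour']}, {style['border']} border) "
          f"Warning: '{filename}' is an attachment from an unknown sender.")

for attachment in ("invoice.exe", "report.pdf", "photos.zip"):
    show_attachment_warning(attachment)
```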
Studies have shown that the following measures reduce or prevent habituation [16] (a small illustrative check based on the first two points follows the list):
- Double the intensity of the signal compared to background noise (a ten-fold improvement)
- Increase alarm duration to 2-3 seconds (almost perfect detection)
- Reduce the number of signals
- Provide regular feedback on how well the person is focusing on key parameters
- Show examples of the signal to be detected
- Train people to recognise the signals, e.g. through simulations
- Introduce job rotations
- Provide regular breaks of around 5 minutes
- Have a conversation to cover the following topics:
  - What are they paying attention to, how often and in what order?
  - What sources of information are reliable and unreliable?
  - How do they know when something goes wrong?
  - Do they know what to do when something goes wrong?
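As a rough illustration, the first two points on the list can be turned into a simple design check. This is a sketch under assumptions: the function name, parameter names and messages are invented, and the thresholds simply mirror the figures quoted above rather than any specific alarm standard.

```python
# Illustrative check of the first two points above: at least double the signal
# intensity relative to background, and a 2-3 second alarm duration.
# Function name, parameter names and messages are assumptions for this sketch.

def check_alarm_design(signal_intensity, background_intensity, duration_s):
    """Flag alarm settings that habituation is likely to erode.

    Both intensities are expected in the same (linear) units."""
    findings = []
    if signal_intensity < 2 * background_intensity:
        findings.append("signal is less than twice the background intensity")
    if not 2.0 <= duration_s <= 3.0:
        findings.append("alarm duration is outside the 2-3 second range")
    return findings or ["meets the basic intensity and duration guidance"]

print(check_alarm_design(signal_intensity=70, background_intensity=60, duration_s=1.0))
print(check_alarm_design(signal_intensity=130, background_intensity=60, duration_s=2.5))
```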
Detailed engineering guidance on the design of alarms and warnings is provided in industry reports, including those from EEMUA and ANSI/ISA [17], [18].
To summarise, claiming that people were complacent when something went wrong is not only imprecise; it actually prevents us from learning why it made sense for them to do what they did, what they were actually paying attention to, and how that was optimal in their context.
Professor Dekker says: “People will miss things. This happens because there is always more to pay attention to, to look at, to consider, and limited cognitive resources to do so. Using ‘complacency’ is a managerial cop-out. Guess who is being ‘complacent’ when you use the term? It is saying human error was the cause of trouble instead of seeing it as a result of something else.
If you notice behaviour you would like to call ‘complacent’, ask yourself what that is the effect, symptom or result of” [19].
Bibliography
[1] R. Parasuraman and D. H. Manzey, “Complacency and bias in human use of automation: An attentional integration,” Hum. Factors, vol. 52, no. 3, pp. 381–410, 2010.
[2] EUROCONTROL, “Complacency,” 2016. [Online]. Available: https://www.skybrary.aero/index.php/Complacency.
[3] N. Moray, “The role of attention in the detection of errors and the diagnosis of failures in man-machine systems,” in Human Detection and Diagnosis of System Failures, J. Rasmussen and W. Rouse, Eds. Plenum Press, 1981.
[4] E. Hollnagel, The ETTO Principle: Efficiency-Thoroughness Trade-Off: Why Things That Go Right Sometimes Go Wrong. CRC Press, 2017.
[5] N. H. Mackworth, “The breakdown of vigilance during prolonged visual search,” Q. J. Exp. Psychol., vol. 1, no. 1, pp. 6–21, 1948.
[6] P. W. Hochachka, Muscles as Molecular and Metabolic Machines. CRC Press, 2019.
[7] B. C. Ampel, M. Muraven, and E. C. McNay, “Mental work requires physical energy: self-control is neither exception nor exceptional,” Front. Psychol., vol. 9, p. 1005, 2018.
[8] R. F. Thompson, “Habituation: a history,” Neurobiol. Learn. Mem., vol. 92, no. 2, p. 127, 2009.
[9] J. Colombo and D. W. Mitchell, “Infant visual habituation,” Neurobiol. Learn. Mem., vol. 92, no. 2, pp. 225–234, 2009.
[10] T. A. McDiarmid, A. C. Bernardos, and C. H. Rankin, “Habituation is altered in neuropsychiatric disorders—A comprehensive review with recommendations for experimental design and analysis,” Neurosci. Biobehav. Rev., vol. 80, pp. 286–305, 2017.
[11] C. H. Rankin et al., “Habituation revisited: an updated and revised description of the behavioral characteristics of habituation,” Neurobiol. Learn. Mem., vol. 92, no. 2, pp. 135–138, 2009.
[12] B. Anderson, T. Vance, B. Kirwan, D. Eargle, and S. Howard, “Users aren’t (necessarily) lazy: Using NeuroIS to explain habituation to security warnings,” 2014.
[13] D. Kahneman, Thinking, Fast and Slow. Penguin Books, 2012.
[14] N. Bagheri and G. A. Jamieson, “Considering subjective trust and monitoring behavior in assessing automation-induced ‘complacency’,” in Human Performance, Situation Awareness and Automation: Current Research and Trends, 2004, pp. 54–59.
[15] J. C. Brustoloni and R. Villamarín-Salomón, “Improving security decisions with polymorphic and audited dialogs,” in Proceedings of the 3rd symposium on Usable privacy and security, 2007, pp. 76–85.
[16] J. S. Warm, Sustained Attention in Human Performance. Wiley, 1984.
[17] EEMUA, “Alarm systems – a guide to design, management and procurement,” 2013.
[18] ANSI/ISA, “ANSI/ISA-18.2-2016, Management of Alarm Systems for the Process Industries,” 2016.
[19] S. Dekker, The Field Guide to Understanding Human Error: Second Edition. Ashgate Publishing, 2013.