Trying to stop human errors happening in healthcare is futile, but hospitals can successfully design safer systems by adopting a human factors approach, international experts told a HARC roundtable this month.
Dr Terry Fairbanks, an emergency physician and human factors specialist from MedStar Health and Georgetown University in the US, and Dr Ken Catchpole, Director of Surgical Safety and Human Factors Research at Cedars-Sinai Medical Center in Los Angeles, joined local human factors expert Dr Thomas Loveday from the Clinical Excellence Commission (CEC) for the roundtable held at the Sax Institute.
From aviation to healthcare
Dr Fairbanks explained that human factors engineering was an alternative way of addressing safety, adopted broadly by the aviation industry. Human error was as unavoidable in healthcare as in any other industry, he said, so it was important to design a system that took into account human strengths and weaknesses – or human factors.
“We don’t redesign humans. We redesign the system within which humans work,” Dr Fairbanks said. “The goal isn’t to eliminate human error, but to understand why errors occur and to reduce the chance of that happening or to mitigate the effect.”
Strategies such as policies, training, discipline and vigilance had been shown to have little impact on the rate of errors, because no matter how well-intentioned healthcare professionals were, human error was inevitable, he said.
Instead, he suggested the systems surrounding how doctors and nurses work could be made safer.
He gave the example of a US hospital emergency department that sought to change the way nausea was managed, to ensure the two safest drugs were used. Using a human factors approach, the five nausea drugs that carried a higher risk of side effects were taken out of the drug dispensing machine. Doctors could still order those five drugs from the hospital pharmacy, but it took longer to access them.
“There was no policy or guideline, but everyone started using the other two drugs,” he said. “Policy is not always the answer.”
Dr Ken Catchpole, a research psychologist and human factors practitioner, said healthcare was about 30 years behind the aviation industry in adopting the human factors approach to reducing error, but “it’s started”.
There were a number of dimensions to achieving a behavioural change, he said, including simple steps such as looking at whether the order in which people did things was effective. He cited the example of automatic teller machines, which do not dispense cash until the user has retrieved their card, reducing the risk of the card being left behind.
Technology could also help or hinder people in doing their jobs. For example, some pieces of medical equipment had on/off buttons located in a spot where it was easy to accidentally switch them off, risking a patient’s life.
Organisational and environmental factors also needed to be evaluated, often resulting in changes like marking an area on the floor that needs to remain clear to enable safe patient flow, Dr Catchpole suggested.
“Changes to one, or all of those dimensions, can make a big difference,” he said.
Dr Thomas Loveday, a human factors engineer and psychologist, joined the CEC just over a year ago in the wake of research conducted by HARC scholar Bronwyn Shumack into human factors and their impact in healthcare.
He said his focus areas included the design of new e-health systems, user-centred design of medical equipment, and the role of diagnostic error and system factors in sub-optimal decision-making in healthcare.
“We want to support Local Health Districts and help them to work better and safer,” he said.
Dr Loveday said he was also applying a human factors approach to reducing the incidence of worker injury in the health system.