This article was originally published on the Croakey blog.
“Policy is not always the answer.” The Sax Institute recently hosted a roundtable to look at how human error can be reduced in healthcare, including by looking to the aviation industry and bank ATMs for guidance. Megan Howe, Publications Editor at the Sax Institute, reports on the event.
About three decades ago, the aviation industry realised human error was a bigger contributor to air accidents than mechanical failures. The realisation led it to embrace a “human factors” approach to safety, which meant looking at how humans could most safely and efficiently be integrated with the technology, and how the technology, training and procedures could be designed to help all workers, from the pilots to the maintenance crew, perform better.
It’s no different in healthcare, according to international and local human factors experts who shared their insights with members of the Hospital Alliance Research Collaboration (HARC) at a roundtable meeting held at the Sax Institute last month.
Dr Terry Fairbanks, an emergency physician and human factors specialist from MedStar Health and Georgetown University in the United States, said trying to stop human errors happening in healthcare was futile, but hospitals could successfully design safer systems by adopting a human factors approach.
From aviation to healthcare
Dr Fairbanks explained that human factors engineering was an alternative way of addressing safety, adopted broadly by the aviation industry. Human error was as unavoidable in healthcare as in any other industry, he said, so it was important to design a system that took into account human strengths and weaknesses, or human factors.
“We don’t redesign humans. We redesign the system within which humans work,” Dr Fairbanks said. “The goal isn’t to eliminate human error, but to understand why errors occur and to reduce the chance of that happening or to mitigate the effect.”
Strategies such as policies, training, discipline and vigilance had been shown to have little impact on the rate of errors, because no matter how well-intentioned healthcare professionals were, human error was inevitable, he said.
Instead, he suggested the systems surrounding how doctors and nurses work could be made safer.
He gave the example of a US hospital emergency department that sought to change the way nausea was managed, to ensure the two safest drugs were used. Using a human factors approach, the five nausea drugs that carried a higher risk of side effects were taken out of the drug dispensing machine. Doctors could still order those five drugs from the hospital pharmacy, but it took longer to access them.
“There was no policy or guideline, but everyone started using the other two drugs,” he said. “Policy is not always the answer.”
Dr Ken Catchpole, Director of Surgical Safety and Human Factors Research at Cedars-Sinai Medical Centre in Los Angeles, said healthcare was about 30 years behind the aviation industry in adopting the human factors approach to reducing error, but “it’s started”.
There were a number of dimensions to achieving a behavioural change, he said, including simple steps such as looking at whether the order in which people did things was effective. He cited the example of automatic teller machines, which do not dispense cash until the user has retrieved their card, reducing the risk of the card being left behind.
Technology could also enable or disable people’s ability to do their jobs. For example, some pieces of medical equipment had on/off buttons located in a spot where it was easy to accidentally switch them off, risking a patient’s life, said Dr Catchpole, a research psychologist and human factors practitioner.
Organisational and environmental factors also needed to be evaluated, Dr Catchpole suggested, often resulting in changes such as marking an area of floor that must remain clear to enable safe patient flow.
“Changes to one, or all, of those dimensions can make a big difference,” he said.
Local human factors expert Dr Thomas Loveday joined the Clinical Excellence Commission (CEC) just over a year ago in the wake of research conducted by HARC scholar Bronwyn Shumack into human factors and their impact in healthcare.
Dr Loveday, a human factors engineer and psychologist, said the design of new e-health systems was among the areas he was focusing on, along with the user-centred design of medical equipment, and the diagnostic errors and system factors that contribute to sub-optimal decision-making in healthcare.
“We want to support Local Health Districts and help them to work better and safer,” he said.
Dr Loveday said he was also applying a human factors approach to reducing the incidence of worker injury in the health system.