BBC, 26 July 2015
Bosses in all fields can make mistakes. And while junior staff may always feel uncomfortable pointing them out, in some areas failing to do so could cost lives.
Aviation and medicine are two professions where hierarchy can make it particularly difficult for those lower down the pecking order to speak out.
One of the ways airlines are trying to reduce potentially fatal errors is to use psychological techniques to break down that hierarchical structure and encourage people at all levels to flag when something is about to go wrong–and medicine is starting to follow suit.
The aviation industry has embraced what’s known as a “just” culture, where reporting errors is encouraged to prevent mistakes turning into tragedies.
This approach followed disasters like the one in Tenerife on 27 March 1977, when 583 people died after two planes collided on the ground and burst into flames.
There was nothing technically wrong with either plane, and the main reason behind the crash was found to be the “authority gradient” in the cockpit of one of them.
The captain had overruled the co-pilot, who thought they hadn’t been cleared for take-off.
Finding it hard to speak up in front of senior colleagues, even when it’s a matter of life or death, can get in the way of openly pointing out errors.
Even in teams that work very closely together, like the crew of an aeroplane, junior staff have been known to keep quiet in an emergency rather than question the actions of a pilot.
Surgical teams now hope to learn from years of research in aviation psychology, which has helped make crashes a rarity.
Matt Lindley flies jumbo jets and trains doctors in safety. He recalls a case where a surgeon was preparing to operate on a child’s hand.
A junior member of staff noticed they were about to operate on the wrong hand–but her fears were dismissed. She tried again.
He said: “It’s quite unusual, a lot of people just back down after the first time you’re not acknowledged. She was told quite bluntly to be quiet.”
The team finally realised they’d operated on the wrong hand about 10 minutes into the procedure. Afterwards the junior doctor said she felt guilty–but also that she didn’t have the skills to make herself heard.
Mr Lindley says she should have been assertive–and used certain “trigger words”.
“I am concerned. I am uncomfortable. This is unsafe. Or we need to stop. And I think no matter what position you are in the pecking order, to ignore those four trigger words would be very very difficult.”
Most doctors say they’ve had a “light bulb moment” when they finish the course that he runs on these techniques.
“Many say: why am I doing this course when I’ve been a doctor for 25 years–I should have done this on day one!”
In 2012/2013 in England there were nearly 300 “never events”–incidents which can cause serious harm or death and are wholly preventable.
Measures do exist. The WHO’s Surgical Safety Checklist provides prompts at each stage of an operation for staff to carry out important checks–including basic ones like asking the patient to confirm their date of birth.
Rhona Flin, professor of applied psychology at Aberdeen University, has spent years analysing how human error can lead to disaster.
She says: “People often think their own industries are very different. Actually if you’re a psychologist who’s worked in different industrial settings it all looks pretty much the same to me.
“They’re all humans working in these technical environments. They’re affected by the same kind of emotions and social factors.”
Prof Flin says deference to authority can get in the way of open, honest reporting of errors, and that at the time of the Tenerife disaster, psychologists who observed crews training in flight simulators were alarmed by what they saw.
“Captains were briefed in advance to take some bad decisions or feign incapacity–to measure how long it would take co-pilots to speak up. One psychologist monitoring their responses commented: ‘Co-pilots would rather die than contradict a captain’.”
Mr Frank Cross is a vascular surgeon who works in London. He remembers vividly a mistake he made 30 years ago–leaving a swab behind in a patient’s body during an operation on her bowel.
When the patient came back a few months later complaining of a lump in her abdomen, the swab was detected and removed.
He says it’s always better to own up: “You need to be open and honest if you make a mistake, and show that you are sorry.”