Tuesday, October 31, 2006

Pilots, Surgeons & Human Error

Over 100,000 people die per year in hospitals due to human error. People make mistakes. Tired people make mistakes. People who won't listen to valuable input make mistakes. People who don't recheck their work make mistakes.

Mistakes happen in all professions, but when the potential result of a decision is death, mistakes cannot be tolerated. Period.

Pilots know this, obviously. They are responsible for hundreds of lives every day. They have developed ways to minimize mistakes, and they now teach doctors in hospitals what they know.

“The culture in the operating room has always been the surgeon as the captain at the controls with a crew of anesthesiologists, nurses and techs hinting at problems and hoping they will be addressed,” Dr. Smith said. “We need to change the culture so communication is more organized, regimented and collaborative, like what you find now in the cockpit of an airplane.”
This will be easier said than done. While the military churns out future pilots accustomed to a top-down yet collaborative approach, doctors are lone guns in a sea of sycophants. Challenging the doctor, surgeon, or whoever is in charge earns the challenger future scorn. One doctor questioning another doctor's decisions? Hahaha! Doesn't happen, except in the most egregious cases.

Let me give just one tiny example that probably happens multiple times a day in multiple hospitals nationwide. While with my preemie sons at a well-respected children's hospital, I saw a surgeon about to examine a patient whose abdomen was open (he was born with double organs, and his abdomen had never closed). No one was saying anything: not the other doctors (some 3rd-year residents, some fellows), not the nurses, no one. I told the nurse that if she didn't stop him, I would. The lazy doctor was stopped.

There are many things that must change to prevent these episodes. Hospitals must break down the worshipful hierarchy. Doctors must give underlings permission to contribute. Doctors and staff must be held responsible by the system for negligence. Between the happy outcomes and the lawsuits, a lot of bad things happen that get swept under the rug. Transparency must be expected.

Another example: a friend was admitted to the hospital for severe dehydration secondary to an intestinal infection. The admitting physician was nowhere to be found. Her personal physician visited once a day. Who was in charge of her discharge? What was the plan? When do you call for a specialist? The (thankfully) conscious patient didn't know what to do. This unfortunate episode happened on the weekend--when all bad things happen at hospitals. Some ONE must be responsible for the patient's care.

Technology must be embraced. Every time a health care worker interacts with a patient, a device should be scanned. Recommendations and drug choices must be entered electronically so the results are typed rather than hand-written. All prescriptions should be computerized so an adult dose doesn't end up in a child. Or, as happened with my sons, one son gets two doses of a medication and the other gets none. It happened, and it was very likely partially implicated in one son's death. And no, we didn't sue. Most people don't.

A doctor's success and failure statistics must be made public. There are good and bad doctors. There should only be great and good doctors. Anything less means harm is done.

Each of these changes strikes fear into participants at every level of the system (except the patient). Making them will result in more mistakes being revealed. Doctors will fear (and all do) more lawsuits. Tough nuts. Change is always uncomfortable. Once the changes are made, better information will back up the doctors.

Most importantly, patients won't be dying in catastrophic numbers.

2 comments:

David said...

I agree very much with breaking the "worshipful hierarchy"--I can't think right off of another industry where class distinctions are so sharp--but it won't be easy. In an airliner cockpit, the captain has authority over the copilot, but the copilot is also a colleague who will one day probably be a captain himself, and he has had essentially the same training as the captain. The nurse and the pharmacist, however, are not typically ever going to be doctors.

David said...

Also--it's true that information systems technology needs to be used more, and more effectively, in health care, but it's also important to recognize the dangers that can lurk within useful and helpful systems.

In aviation, there have been a lot of accidents and near-accidents caused by autopilots, or more precisely by misuse of autopilots. For example, there was a near-disaster in which the crew set the autopilot to "pitch hold" mode, which means basically "keep the nose at a constant angle above the horizon." As ice began to accumulate on the wings, the autopilot did what it had been told, valiantly maintaining pitch while speed dropped off. Finally, speed fell to the point that the airplane entered an aerodynamic stall, from which the crew recovered only with considerable difficulty. If the pilot had been hand-flying the plane, he almost certainly would have noticed the declining speed in time to avoid the stall.