According to planecrashinfo.com, the largest single cause of airline fatalities is “pilot error,” at 45%.  Studies by Boeing have pegged the number as high as 70%.  All other causes make up slivers of the remaining proverbial pie chart.  The conclusion to be drawn is that the single weakest link in any complex system we rely on daily is the human link.  Dennis Quaid recently experienced human error at the renowned Cedars-Sinai Medical Center, when his newborn twins were accidentally given 1,000 times the intended dose of Heparin, a blood thinner.  The Health Populi blog shows a picture of Heparin vials in varying concentrations — they are color-coded by concentration and clearly marked, yet the medical staff at Cedars-Sinai somehow grabbed the wrong vials.

The Quaid twins are going to be fine (although they could have bled to death)…and the purpose of this post isn’t to point the finger at Cedars-Sinai.  As hospitals go, it’s probably a pretty good place to get sick – it was ranked “best of the best” by U.S. News & World Report, and all those Hollywood types seem to like to die there.  The purpose is to use the Quaid twins’ story to talk about something I’m willing to bet happens all the time.

After all, we’re human.  The person who grabbed the Heparin vial may have misread the label, or perhaps Heparin changed its color-coding system.  More likely, a 10,000 U/ml vial found its way onto the same shelf as a 10 U/ml vial…and the label never even came into play.  Perhaps a new person was stocking the cabinet and didn’t think to sort the Heparin by color — having no idea that the person retrieving the vials would infer the dosage from its location.

Chaos theory states that non-linear systems appear to produce random results but are actually determined entirely by their initial conditions.  Put another way, if you feed the exact same inputs into a system, you will get the exact same result.  But a tiny variation in any one of the inputs can produce dramatically different results.  It might be safe to say that the farther away an input is, temporally, from the end of the system, the more dramatic the change in output.  And it’s not arithmetic, it’s exponential.  The conclusion, if you buy this chaos theory crap, is that you want as little variability in input as possible in any critical system on which lives depend.  And because the least precise input is most assuredly a human one, you probably want to reduce human intervention as much as possible — especially if your job is running a hospital.
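
To make that concrete, here’s a tiny sketch (my own toy example, nothing from any hospital system) using the logistic map, a textbook chaotic system.  Two starting values that differ by one part in a billion track each other for a while and then end up nowhere near each other.

```python
# Toy illustration of sensitivity to initial conditions, using the logistic
# map x -> r*x*(1-x) with r = 4 (its chaotic regime). This is my own sketch,
# not anything from a real hospital or clinical system.

def logistic_map(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.200000000)
b = logistic_map(0.200000001)  # the input differs by one part in a billion

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
# Early steps are nearly identical; by step 50 the two trajectories are
# completely unrelated.
```

Nobody made a “big” mistake at step zero, yet by the end the outputs have nothing in common.  That’s the whole point.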

Sitting here typing this post, my spell-check is telling me I’ve made at least half a dozen spelling errors, and I’m sure I’ve made countless grammatical errors.  I’ll try to proofread it, and maybe I’ll eliminate 80% of the errors, but my fallible eyes will inevitably miss a few things.  If I were a doctor writing consultation notes or a prescription…anything where I could misplace a decimal, I’d be acutely aware of the possibility of making errors (and the certainty of many errors over the course of my career), but unaware of where many of those errors would be.  And maybe my office has a system of checks and balances built in to catch these errors…and for the most part it works, except when my “checker” misses one of my errors or, more likely, catches my error but assumes, à la authority bias (I’m the one in the white coat with the letters “MD” after my name), that I meant to do it.

A close friend of mine was a recent near-casualty of such an error.  He had a prescription filled at a new pharmacy and was given a generic form of (coincidentally enough) an anti-coagulant in a different dose but the same size pill.  Nobody informed him that the new pills, while looking exactly the same, were twice as potent.  Like the Quaid twins, he was fortunate to get medical treatment just as his kidneys started to bleed.  He could have died.  And if he had, a single break in the chain of communication somewhere between doctor, nurse, pharmacist, and patient would’ve been to blame.

We learn new things every day about how to improve human perception and attention, how to minimize human error…but the human mind will always have biases that cause us to make mistakes.  One of the keys to reducing patient deaths is to reduce or eliminate human control…when practical, of course.  A study was done recently (I can’t find a citation) in which ER doctors were able to diagnose heart attack patients more accurately by asking three fixed questions and plugging the answers into a research-developed computer program than by using their own judgment.  Even experienced doctors who relied on their own instincts were significantly less accurate.
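
I obviously can’t reconstruct the actual questions or weights from a study I can’t even cite, so the questions, weights, and threshold below are invented purely to show the flavor of a fixed decision rule: the logic is set in advance and applied identically to every patient, no gut feel involved.

```python
# Hypothetical sketch of a fixed-question triage rule. The questions, weights,
# and threshold are invented for illustration; they are NOT the ones from the
# study mentioned above.

QUESTIONS = [
    ("chest_pain_at_rest", 3),       # points added if the answer is yes
    ("pain_radiates_to_arm", 2),
    ("history_of_heart_disease", 2),
]
THRESHOLD = 4  # made-up cutoff for "treat as a possible heart attack"

def triage(answers):
    """Apply the same fixed rule to every patient -- no clinical gut feel."""
    score = sum(weight for question, weight in QUESTIONS if answers.get(question))
    return score >= THRESHOLD

# The same inputs always produce the same output, regardless of who asks.
print(triage({"chest_pain_at_rest": True, "pain_radiates_to_arm": True}))  # True
print(triage({"history_of_heart_disease": True}))                          # False
```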

We need more innovations in health care administration to reduce “preventable errors.”  With increasingly complex health systems (literally dozens of parties touching every health insurance claim), we need increasingly simple ways for humans to interact with those systems.  Let’s start small – how about Google or Cerner developing a standardized electronic medical record-keeping system EVERYONE can use?
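
As one small, made-up example of what that could look like: a standardized medication record that carries its concentration with it and flags orders that are wildly out of range for the patient — the kind of check that would have caught a 10,000 U/ml Heparin vial headed for a newborn.  The field names and limits below are mine, not anything from Google, Cerner, or Cedars-Sinai.

```python
# Hypothetical sketch of a standardized medication order with a built-in sanity
# check. Field names and dose limits are invented; this is not a real EMR schema.

from dataclasses import dataclass

# Made-up per-drug ceiling on concentration (units per ml) for newborns,
# echoing the 10 U/ml vs. 10,000 U/ml mix-up described above.
NEONATAL_MAX_CONCENTRATION = {
    "heparin": 10,
}

@dataclass
class MedicationOrder:
    patient_id: str
    drug: str
    concentration_u_per_ml: int
    is_neonate: bool

    def validate(self):
        limit = NEONATAL_MAX_CONCENTRATION.get(self.drug)
        if self.is_neonate and limit is not None and self.concentration_u_per_ml > limit:
            raise ValueError(
                f"{self.drug}: {self.concentration_u_per_ml} U/ml exceeds the "
                f"neonatal limit of {limit} U/ml -- confirm before dispensing"
            )

order = MedicationOrder("patient-001", "heparin", 10_000, is_neonate=True)
try:
    order.validate()
except ValueError as err:
    print(err)  # the wrong vial gets flagged instead of administered
```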
