Jan. 28, 1986. Do you remember that date?
Here is a hint: Seven people died that day. Well, more than seven people died that day throughout the world, but seven very specific people died that day. It was the day the space shuttle Challenger exploded and its seven-member crew perished. It was 28 years ago, yet we still remember.
The failure centered on a problem with the O-rings. We know that because, after the accident, a presidential commission was formed to investigate what went wrong. It was a national priority to understand the cause of the explosion to prevent it from happening again.
In contrast, health care has a safety problem that claims 200 or more lives every day. Is it a national priority to understand and fix this? Do we understand why so many lives are lost? Granted, health care as a whole is an almost impossibly complex system, and perhaps a comparison with the Challenger disaster is a bit unfair, but the point stands. We need to do more to understand why we have so many preventable deaths.
To that end, ECRI Institute recently published the "Top 10 Patient Safety Concerns for Healthcare" based on our own accident investigations and adverse-event data analysis of more than 300,000 events. We have not found a single root cause of health care patient safety mishaps, but we have identified specific issues to share in the hope that solutions will emerge as we learn together. The top three concerns are: data integrity failures with health information technology systems; poor care coordination with the patient's next level of care; and test results reporting errors.
ECRI Institute has found that the integrity of data in HIT systems can be compromised in various ways, including: data entry errors; missing data; delayed data delivery; inappropriate use of default values; copying and pasting outdated data into a new report; use of both paper and digital systems for patient care; and patient and data association errors, which occur when patient data from a medical device is mistakenly associated with a different patient's record.
In a sense — because we are all patients at some point — we are participating in a grand experiment of rapid digitization of health information. While this digitization is overdue, it has been happening relatively rapidly because of federal incentives.
No technology is perfect and neither are the humans who use it. As a result, we are increasingly learning of patient safety risks from adverse events. For example, computerized provider order entry systems help to eliminate errors from poor handwriting and add the ability to alert clinicians to potentially dangerous situations such as drug allergies. On the other hand, a CPOE system could potentially harm a patient if a clinician selects the wrong patient name or drug name from a dropdown menu. In this case, the design of the user interface or the rushed nature of clinical work could lead to a patient's receiving the wrong drug.
In one report we received, a child's weight on admission was entered as 40 kg instead of 40 lbs. A medication dosage then was calculated based on the wrong weight. Fortunately, the pharmacist caught the error — otherwise, the child would have received a dosage intended for someone more than twice his or her weight.
Given the size of our health care system and the number of times data are entered throughout all of our provider institutions, it is likely that similar mistakes are happening every day.
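The kg-versus-lb mix-up described above is, at bottom, an arithmetic error that software could flag. As a minimal sketch, the Python below shows one hypothetical plausibility check: if a newly entered weight matches the patient's last recorded weight only after being reinterpreted as pounds, the entry is flagged for review. The function name, thresholds, and conversion logic are illustrative assumptions, not any vendor's actual safeguard.

```python
# Hypothetical sketch of a unit-confusion check for a weight field.
# All names and thresholds are assumptions for illustration only.

LB_PER_KG = 2.20462  # pounds per kilogram

def weight_entry_suspicious(entered_kg, prior_kg=None, tolerance=0.15):
    """Return True if a weight entered in a kg field looks like a lb value.

    If a prior recorded weight (kg) exists, flag entries that match it
    only after converting the entry from lb to kg, i.e., the clinician
    likely typed pounds into a kilogram field.
    """
    if prior_kg is None:
        return False  # nothing to compare against
    as_if_lb = entered_kg / LB_PER_KG  # value if the entry was really in lb

    def close(a, b):
        return abs(a - b) <= tolerance * b

    # Suspicious: consistent with the prior weight when reinterpreted as
    # pounds, but inconsistent when taken at face value in kilograms.
    return close(as_if_lb, prior_kg) and not close(entered_kg, prior_kg)

# A 40 lb child (about 18.1 kg) whose weight is entered as "40" in a kg field:
print(weight_entry_suspicious(40.0, prior_kg=18.1))   # True: flag for review
print(weight_entry_suspicious(18.1, prior_kg=18.1))   # False: consistent
```

A real system would need clinically validated thresholds and a workflow for resolving flags, but even a simple cross-check like this could catch the roughly 2.2-fold discrepancy in the report above before a dose is calculated.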
Lapses in Communication
Our second concern centers on risks associated with poor care coordination. We have received many reports of communication gaps between and among providers as patients are moved from one care setting to another.
According to one report: "An infant who died from sudden infant death syndrome previously had been seen in the hospital for a life-threatening event. Because of abnormal findings on the patient's CT scan, the patient's discharge summary indicated the patient should have an MRI exam. The discharge summary was not sent to the patient's primary care physician. The patient did not undergo the MRI study."
Bridging such communication gaps will require technological solutions as well as better processes to ensure that the right information does not just leave with the patient but arrives accurately at the next provider. Otherwise, lost information becomes lost lives.
Like poor care coordination, our third concern, test results reporting errors, ranked near the top of the issues identified in an analysis of laboratory testing processes that ECRI Institute conducted. Among the reports we received was an instance in which an infant did not get antibiotic treatment in a timely manner because test results confirming an infection were not reported promptly to the ordering clinician. We also received a report that a patient's seizure due to a low sodium level could have been avoided if blood chemistry results had been provided in a timely fashion.
Delays in test results occur for many reasons, but they usually seem to arise from one or a combination of issues such as inadequate staff monitoring of expected results, poor electronic interfaces between providers, or communication lapses when one clinician is preoccupied or out of touch and no one is designated as backup.
As clinicians and patients rely increasingly on technology and tests to diagnose our maladies, we need to understand that speed matters in all diagnoses, not just in heart attacks and strokes. A critical priority for patients and providers alike is to ensure that our test results move with us quickly and accurately as we give and receive care. Inadequate access to test results leads to inadequate care, which leads to patient harm.
Do We Have All the Facts?
The Rogers Commission, formed to investigate the Challenger accident, produced a report of its findings in which it noted the immediate cause of the accident: "In view of the findings, the Commission concluded that the cause of the Challenger accident was the failure of the pressure seal in the aft field joint of the right Solid Rocket Motor." The infamous O-rings are part of the design of the seal.
Despite this specific cause, they also noted a critical contributing factor: "The decision to launch the Challenger was flawed. Those who made that decision were unaware of the recent history of problems concerning the O-rings and the joint and were unaware of the initial written recommendation of the contractor advising against the launch at temperatures below 53 degrees Fahrenheit. ... If the decision-makers had known all of the facts, it is highly unlikely that they would have decided to launch [the Challenger] on Jan. 28, 1986."
There is a link between the Challenger disaster and these patient safety issues: lapses in communication and communication systems create hazards. When information does not flow freely and accurately, bad things happen; specifically, flawed communication leads to flawed decisions. Of course, accidents happen. Not everything is foreseeable. Or preventable. But some things are. Do you know which is which?
Anthony J. Montagnolo, M.S. (firstname.lastname@example.org) is executive vice president and chief operating officer of ECRI Institute, Plymouth Meeting, Pa.
Get the Report
To read ECRI Institute's "Top 10 Patient Safety Concerns for Healthcare," go to www.ecri.org/patientsafetytop10.