
Wednesday, August 13, 2014

3rd leading cause of death?

Preventable medical error. From a recent congressional hearing:


Apropos, see my prior post, Medical Error, Interop, and the Patient Safety-Health IT nexus. To wit:
"My recent posts have ruminated on what I see as the underappreciated necessity for focusing on the "psychosocial health" of the healthcare workforce as much as focusing on policy reform (e.g., P4P, ACOs, PCMH), and process QI tactics (e.g., Lean/PDSA, 6 Sigma, Agile), including the clinical QI Health IT-borne "predictive analytics" fruits of "Evidence-Based Medicine" (EBM) and "Comparative Effectiveness Research" (CER). Evidence of psychosocially dysfunctional healthcare organizational cultures is not difficult to find (a bit of a sad irony, actually). From the patient safety-inimical "Bully Culture" down to the "merely" enervating emotionally toxic, I place it squarely within Dr. Toussaint's "8th Waste" (misused talent)."

FROM HEALTHCARE: A BETTER WAY. THE NEW ERA OF OPPORTUNITY
John L. Haughom, MD
 
Avoidable error and harm categories
Safety experts including Lucian Leape, Robert Wachter, Peter Pronovost and others have organized the causes of avoidable errors and harm into the following logical categories:

Medication errors. Adverse drug events (ADEs) are a significant source of patient harm. The medication delivery process is enormously complex. On the inpatient side alone, it generally represents dozens of steps, and it is only marginally less complicated in the ambulatory environment. Taken appropriately, the thousands of medications available in clinical care today offer huge advantages to patients. Still, the thousands of available drug options and their complicated interactions with human physiology and each other lead to a significant incidence of near misses (5 to 10 percent) and actual adverse drug events (5 percent) in hospitalized patients.

The incidence of ADEs is significantly higher for high-risk medications like insulin, warfarin or heparin. In addition to patient harm, the cost of preventable medication errors in hospitalized patients in the U.S. is substantial, estimated at $16.4 billion annually. In the ambulatory environment, the incidence of harm and the costs are even higher.

Multiple solutions are required to address the issue of adverse drug events. These include several well-implemented technological solutions: computerized physician order entry (CPOE), computerized decision support, bar code medication administration, and radio-frequency identification (RFID) systems. It will also require addressing a number of process issues, including standardization, vigilance with respect to the “Five Rights” (right patient, right route, right dose, right time and right drug), double checks, preventing interruptions and distractions, removal of high-risk medications from certain areas, optimizing the role of clinical pharmacists, addressing the issue of look-alike and soundalike medications, and implementing effective medication reconciliation processes, particularly at hand-off points.
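The "Five Rights" vigilance described above is exactly the kind of check a bar code medication administration (BCMA) system automates at the bedside. Here is a toy sketch of that logic; the `Order` structure, function names, and tolerance values are my own illustrative assumptions, not any vendor's actual implementation:

```python
# Hypothetical "Five Rights" check as a BCMA system might perform it at
# the bedside: right patient, right drug, right dose, right route, right
# time. All names and structures here are illustrative only.
from dataclasses import dataclass

@dataclass
class Order:
    patient_id: str
    drug: str
    dose_mg: float
    route: str
    due_hour: int  # scheduled administration hour, 0-23

def five_rights_check(order: Order, scanned_patient: str, scanned_drug: str,
                      drawn_dose_mg: float, route: str, current_hour: int,
                      window_hours: int = 1) -> list:
    """Return a list of violations; an empty list means all five rights pass."""
    violations = []
    if scanned_patient != order.patient_id:
        violations.append("wrong patient")
    if scanned_drug != order.drug:
        violations.append("wrong drug")
    if abs(drawn_dose_mg - order.dose_mg) > 1e-9:
        violations.append("wrong dose")
    if route != order.route:
        violations.append("wrong route")
    if abs(current_hour - order.due_hour) > window_hours:
        violations.append("wrong time")
    return violations

# A high-risk medication like warfarin is where such checks matter most.
order = Order("MRN123", "warfarin", 5.0, "oral", due_hour=9)
print(five_rights_check(order, "MRN123", "warfarin", 5.0, "oral", 9))   # []
print(five_rights_check(order, "MRN123", "heparin", 5.0, "IV", 14))
```

The point of the sketch is that each "right" is an independent, mechanical comparison — which is precisely why automation catches slips that busy, interrupted humans miss.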


Surgical errors. There are over 20 million surgeries annually in the U.S. In recent years, a number of advances have resulted in significant improvements in the safety of surgery and anesthesia and reductions in harm and death. Still, a number of surgical safety challenges persist. These include persistent anesthesia-related complications, wrong-site surgeries, wrong patient surgeries, retained foreign bodies and surgical fires. One study indicated that 3 percent of inpatients who underwent surgery suffered an adverse event, and half of these were preventable. Studies have also shown that there is a strong relationship between volume and safety. That is, surgeons need to perform any given surgery a certain number of times to attain a level of skill required to minimize adverse surgical events. Addressing surgical safety will require a number of measures, including widespread adoption of safety principles already largely implemented by anesthesiologists (e.g., systems thinking, human factors engineering, learning from mistakes, standardization and comprehensively applying the “Universal Protocols” — including site signing and time outs), along with teamwork training, checklists and the use of best practices for minimizing retained foreign bodies and avoiding surgical fires.


Diagnostic errors. While they have received less emphasis, diagnostic errors are relatively common. For example, in the study that served as the basis for the IOM’s estimate of 44,000 to 98,000 annual deaths from preventable errors, 17 percent of the deaths were attributed to diagnostic errors. Furthermore, autopsy studies have demonstrated that 1 in 10 patients suffer a major antemortem error. Addressing this problem will require a number of measures, including avoiding fatigue, avoiding overreliance on past experience, improved training in cognitive reasoning and computerized decision support systems.

Person-machine interface errors (human factors engineering). Human factors engineering is an applied science of systems design that is concerned with the interplay between humans, machines and their work environments. Its goal is to assure that devices, systems and working environments are designed to minimize the likelihood of error and optimize safety. As one of its central tenets, the field recognizes that humans are fallible — they often overestimate their abilities and underestimate their limitations. This is particularly important in the increasingly complex healthcare environment, where fallible care providers are being overwhelmed by increasing complexity.


Many complex care environments have little or no support from modern technology for care providers, and in those that do have such support the devices often have poorly designed user interfaces that are difficult and even dangerous to use. Human factors engineers strive to understand the strengths and weaknesses of human physical and mental abilities. They use that information to design safer devices, systems and environments. Thoughtful application of human factors engineering principles can assist humans dealing with complex care environments and help prevent errors at the person–machine interface.

Errors at transitions of care (handoff errors). Transitions of care between care environments and care providers are common in clinical care. These handoffs are a common source of patient harm. One study demonstrated that 12 percent of patients experienced preventable adverse events after hospital discharge, most commonly medication errors. Because they are so common, healthcare provider organizations increasingly are focusing on this type of harm.
 

Policymakers are also paying more attention to this type of harm. In 2006, the Joint Commission issued a National Patient Safety Goal that requires healthcare organizations to implement a standardized approach to handoff communications including an opportunity to ask and respond to questions. Because of studies showing very high 30-day readmission rates in Medicare patients (20 percent overall, nearly 30 percent in patients with heart failure), Medicare began penalizing hospitals with high readmission rates in 2012. All of this attention has stimulated a growing body of research focused on handoffs and transitions. This research is providing a deeper understanding of best practices, which have both structural and interpersonal components. These practices include standardized communication protocols (including “read backs”) and more interoperable information systems.

Teamwork and communication errors. Medicine is fundamentally a team sport. There is an overwhelming amount of evidence that the quality of teamwork often determines whether patients receive appropriate care promptly and safely. There are many clinical examples of this, including the management of a cardiac arrest (a so-called “code blue”), a serious trauma case, a complicated surgery, the delivery of a compromised infant or the treatment of an immune-compromised patient in isolation.
 

While the importance of teamwork is widely accepted, the evidence that it exists and that team members feel free to speak up if they see unsafe conditions is not strong. Over the last three decades, the aviation industry has learned the importance of teamwork and implemented state-of-the-art teamwork concepts which have had a dramatic impact on safety performance. Healthcare patient safety advocates have appropriately turned to the aviation industry to adapt its teamwork concepts to clinical care.

In addition, the JCAHO has provided evidence that communication problems are the most common root cause of serious medical errors...
 

Well-functioning healthcare teams should employ appropriate authority gradients that allow people to speak up, utilize aviation's Crew Resource Management (CRM) communication model, use effective methods of reviewing and updating information on individual patients, employ accepted strategies to improve communications including SBAR (Situation, Background, Assessment and Recommendation) and so-called "CUS words" (I am Concerned, I am Uncomfortable and I feel it is a Safety issue) to express escalating levels of concern, and constantly maintain situational awareness.
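SBAR is, at bottom, a fixed-structure message format — which is why it translates naturally into software for handoff documentation. A minimal sketch, with field contents and names invented purely for illustration:

```python
# Illustrative sketch of an SBAR handoff captured as a structured record
# (Situation, Background, Assessment, Recommendation). The clinical
# details below are fabricated examples, not real patient data.
from dataclasses import dataclass

@dataclass
class SBAR:
    situation: str
    background: str
    assessment: str
    recommendation: str

    def read_back(self) -> str:
        # A "read back" renders the handoff in a fixed order so the
        # receiver can confirm each element explicitly.
        return "\n".join([
            f"S: {self.situation}",
            f"B: {self.background}",
            f"A: {self.assessment}",
            f"R: {self.recommendation}",
        ])

handoff = SBAR(
    situation="Room 12 patient increasingly short of breath",
    background="Admitted 2 days ago with heart failure; on furosemide",
    assessment="Possible fluid overload; O2 saturation trending down",
    recommendation="Please evaluate now; consider adjusting diuretics",
)
print(handoff.read_back())
```

The value of the fixed S-B-A-R ordering is the same as aviation's checklists: the structure, not the speaker's seniority, guarantees that the assessment and the recommendation actually get said.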

Healthcare-associated infections (HAIs). Healthcare-associated infections (HAIs) are infections that people acquire in a healthcare setting while they are receiving treatment for another condition. HAIs can be acquired anywhere healthcare is delivered, including inpatient acute care hospitals, outpatient settings such as ambulatory surgical centers and end-stage renal disease facilities, and long-term care facilities such as nursing homes and rehabilitation centers. HAIs may be caused by any infectious agent, including bacteria, fungi and viruses, as well as other less common types of pathogens.
 

These infections are associated with a variety of risk factors, including:
  • Use of indwelling medical devices such as bloodstream, endotracheal and urinary catheters
  • Surgical procedures
  • Injections
  • Contamination of the healthcare environment
  • Transmission of communicable diseases between patients and healthcare workers
  • Overuse or improper use of antibiotics
HAIs are a significant cause of morbidity and mortality. The CDC estimates that 1 in 20 hospitalized patients will develop an HAI, that they are responsible for about 100,000 deaths per year in U.S. hospitals alone and that HAIs are responsible for $30 to $40 billion in costs. In addition, HAIs can have devastating emotional, medical and legal consequences.
 

The following list covers the majority of HAIs:
  • Catheter-associated urinary tract infections
  • Surgical site infections
  • Bloodstream infections (including central line-associated infections)
  • Pneumonia (including ventilator-associated pneumonia)
  • Methicillin-resistant Staphylococcus aureus (MRSA) infections
  • C. difficile infection
As with other common sources of harm, federal policymakers are paying attention to HAIs. The U.S. Department of Health and Human Services (HHS) has identified the reduction of HAIs as an agency priority goal for the department. HHS committed to reducing the national rate of HAIs by demonstrating significant, quantitative and measurable reductions in hospital-acquired central line-associated bloodstream infections and catheter-associated urinary tract infections by no later than September 30, 2013. The final results of this program are yet to be published.
By using a variety of well-tested policies and procedures, there is encouraging evidence that healthcare organizations can significantly decrease the frequency of HAIs.

Other sources of errors. There are a variety of other sources of significant patient harm in clinical care. These include patient falls, altered mental status (often due to oversedation), pressure ulcers and venous thromboembolism, harm related to inadequate staffing ratios, harm resulting from nonstandardization, errors due to lack of redundant systems, harm resulting from inadequate provider training, and harm caused by caregiver stress and fatigue, among others.
Following that, I found this next item interesting:
The role of information technology and measurement in safety 
Advanced information technology is playing an increasingly important role in patient safety. Technologies involved include Electronic Health Records (EHRs), CPOE, clinical decision support systems, IT systems designed to improve diagnostic accuracy, analytical systems, bar coding, RFID, smart intravenous pumps and automated drug dispensing systems. It is important to note that skill is required to implement these systems in a manner that promotes safety while not increasing the rate of harm.
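One of the simplest "clinical decision support" mechanisms mentioned above is a drug-drug interaction alert fired at order entry. A toy sketch follows; the interaction table, its entries, and the function name are assumptions made up for illustration, not any real formulary's data:

```python
# Toy sketch of CPOE decision support: check a newly ordered drug against
# the patient's active medication list using a small interaction table.
# Real systems use large curated knowledge bases; this table is invented.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"warfarin", "ibuprofen"}): "increased bleeding risk",
}

def interaction_alerts(new_drug: str, active_meds: list) -> list:
    """Return human-readable alerts for known pairwise interactions."""
    alerts = []
    for med in active_meds:
        # frozenset makes the lookup order-independent: (a, b) == (b, a)
        reason = INTERACTIONS.get(frozenset({new_drug, med}))
        if reason:
            alerts.append(f"{new_drug} + {med}: {reason}")
    return alerts

print(interaction_alerts("aspirin", ["warfarin", "metformin"]))
```

Note the book's caveat applies directly here: poorly tuned alerting (too many low-value alerts) produces "alert fatigue" and can itself become a safety hazard, which is why skill in implementation matters as much as the technology.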
The most aggressive Health IT critics routinely pooh-pooh HIT, calling it "dangerous, unproven technology that kills patients."

Regarding my ongoing "workplace toxicity" rant of late,
There is an overwhelming amount of evidence that the quality of teamwork often determines whether patients receive appropriate care promptly and safely...While the importance of teamwork is widely accepted, the evidence that it exists and that team members feel free to speak up if they see unsafe conditions is not strong.
Indeed. Health care delivery, particularly in the acute care space, will continue to be an irreducibly high cognitive burden, intractably time-stressed environment. Add into that any significant level of undue workforce stress stemming from culture dysfunctionality, well, as I've argued repeatedly, you are not going to Lean/Six Sigma your way around it. And, teaching "critical thinking" skills is, perversely, likely to make matters worse in some instances (where one speaks truth to power at one's peril).

See my earlier post addressing the High Engagement Workforce, Just Culture, and Leadership.

Also from this book:


Only ~TEN percent of our wellness factors are the result of healthcare system interventions. See my prior post about "the Upstream Factors."

UPDATE

Regarding Health IT "Usability" (UX), from EHR Science:
[C]linical work is role-based, collaborative, non-linear and integrative. These attributes of clinical work must be reflected in software designs.

See Building Clinical Care Systems, Part V: Supporting Clinical Work

As more clinical groups make their wishes known, the next step is turning them into real software—no small feat. It is certainly not something I expect the average EHR vendor to tackle single-handedly. The cost and resources required would be too much because there are so many basic research issues here. There are no models for clinical work and no reference user interface designs. Turning desired features into real software will require deep, long-term collaborations between clinicians, informaticists, software engineers, workflow specialists, usability experts, and many others. It is certainly more involved than adding a few features to current systems, or exchanging electronic documents.

So much time and energy have been put into systems conceived as electronic replacements for paper charts that we have lost track of the fact that care delivery, not updating a chart [emphasis mine -BG], is the goal of clinical work. Electronic charts have their place, but support for clinical work requires more...

Yeah, but, how long will the MU, proxy CQMs, and "productivity treadmill" tails continue to wag the clinical dog?

CODA

"Big Data"? LOL...

