A Tragic Air Crash Helps Define HCIT Safety Needs (Part 2)

April 9, 2013

Healthcare Safety Lessons from the Inter-Tropical Convergence Zone

In Part 1 of this series, we reviewed the crash of Air France Flight 447.  I noted that from the final report of the tragedy, I developed eight factors that contributed to the loss of everyone aboard that I believe can be directly related to Clinical Decision Support in healthcare IT.  To refresh your memory, here is the list:

   1.  Sensor failure precipitating a lethal cascade

   2.  Sudden autopilot withdrawal

   3.  Team competence dynamics

   4.  Black box incident reconstruction

   5.  Real time management

   6.  Physics and physiology, the Coffin Corner

   7.  Safety regulation

   8.  Privacy and individual rights

In this installment, we will examine the first four factors in depth and see how they relate to HCIT Clinical Decision Support (CDS).

1.  Sensors: Adequate input data relies on “sensors,” in this case the pitot-static tubes that measure airspeed.  It was known that the pitot probes required improved heating to keep working reliably in freezing conditions.  Unfortunately, that upgrade was scheduled for a date too late to prevent the crash of AF447.

The application of this concept to healthcare IT is broad and complex.  For one, it includes information that is needed but never arrives, perhaps because the integration of information is incomplete.  This is a common situation today.  It also includes information that arrives too late, which happens when the information is entered in another system that transmits in batch mode only after some final event, such as a signature or validation.  And it includes information that arrives on time but is wrong, as was the case with AF447.
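To make the missing-or-late failure modes concrete, here is a minimal sketch of a completeness-and-freshness gate a CDS rule might run before it fires.  The field names, time thresholds, and observation format are my own assumptions for illustration, not taken from any particular product.

```python
from datetime import datetime, timedelta

# Hypothetical illustration: gate a CDS rule on input completeness and freshness.
# Field names, thresholds, and the observation format are assumptions for this sketch.
REQUIRED_FIELDS = {
    "serum_creatinine": timedelta(hours=24),
    "weight_kg": timedelta(days=30),
}

def inputs_usable(observations, now=None):
    """Return (ok, problems), where problems lists missing or stale inputs.

    observations maps field name -> (value, datetime recorded).
    """
    now = now or datetime.utcnow()
    problems = []
    for field, max_age in REQUIRED_FIELDS.items():
        if field not in observations:
            problems.append(f"{field}: missing")      # needed but never arrived
        elif now - observations[field][1] > max_age:
            problems.append(f"{field}: stale")        # arrived, but too late to trust
    return (not problems, problems)

ok, problems = inputs_usable({
    "serum_creatinine": (1.4, datetime.utcnow() - timedelta(hours=30)),
})
if not ok:
    print("Suppress or qualify the recommendation:", problems)
```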

To reason over information when you know it contains errors takes a lot of added sophistication, because you don’t know where those errors are located.  There are ways to do this using a “belief” function, and you can learn more by reading about the Dempster-Shafer Theory used in some of today’s HCIT CDS systems.
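For readers who want to see what a belief-function combination actually looks like, here is a small, self-contained sketch of Dempster's rule of combination.  The frame of discernment, the two evidence sources, and the mass values are invented for illustration and are not drawn from any shipping CDS system.

```python
from itertools import product

# Minimal sketch of Dempster's rule of combination (Dempster-Shafer theory).
def combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) via Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        overlap = a & b
        if overlap:
            combined[overlap] = combined.get(overlap, 0.0) + x * y
        else:
            conflict += x * y                  # mass assigned to contradictory evidence
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

SEPSIS, NOT_SEPSIS = frozenset({"sepsis"}), frozenset({"no_sepsis"})
EITHER = SEPSIS | NOT_SEPSIS                   # ignorance: mass on the whole frame

lab_evidence    = {SEPSIS: 0.6, EITHER: 0.4}   # a possibly erroneous lab signal
vitals_evidence = {SEPSIS: 0.3, NOT_SEPSIS: 0.2, EITHER: 0.5}

print(combine(lab_evidence, vitals_evidence))  # belief mass after reconciling the two sources
```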

The bottom line here is simple.  As we rely more on HCIT, the risk of making mistakes due to missing, late, and wrong information will increase.  Our practices will need to improve to mitigate the risks.

2.  Autopilot Withdrawal: The use of an automatic system (autopilot), which ceased functioning when starved of adequate input data.  

The kinds of “laws” or flight control modes used for autopilots in aviation have no real parallel in healthcare IT today.  This is the case because, with a few notable exceptions, we haven’t automated routine decision making in a comparable way.  However, when I chatted with noted healthcare safety expert David Classen last month, he explained a concept called “Clumsy Automation” that he and others have written about for more than a decade.

One characteristic of this kind of automation is that it can make easy situations easier to manage, and harder situations, when problems arise, even more difficult to manage.  You can find more discussion and relevant recommendations in David and Dr. Peter Kilbridge’s article from the Journal of the American Medical Informatics Association.  For clarification, consider this working definition from the Harvard Journal of Law and Technology:

“The term ‘clumsy automation’ was coined by E.L. Wiener to denote the role awkward systems often play in provoking human errors in such technologically complicated areas as commercial aviation.  Awkward interfaces occasion error by increasing rather than diminishing the cognitive workload of human operators at times when they are preoccupied with other tasks demanding attention.  Operational failures often stem from interfaces that are not compatible with the finite cognitive capacity and competence of a technological system's human overseer.”

The bottom line here is simple, too.  As we rely more on HCIT to synthesize and make recommendations, there will be situations where “doing the right thing” falls through the cracks.  Our practices will need to build in enough time and human expertise to reason over the decisions we make.
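One way to picture the alternative to clumsy automation is a CDS component that never simply “disengages”: when its inputs degrade, it says so, explains why, and hands the clinician the relevant context.  The sketch below is hypothetical; the dosing rule, field names, and modes are assumptions for illustration, not anyone's shipping design.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: when inputs degrade, the system should not just "disengage";
# it should say so, explain why, and hand the clinician the context.
@dataclass
class Recommendation:
    text: Optional[str]   # None means "no automated recommendation"
    mode: str             # "automatic", "advisory", or "manual"
    reason: str

def recommend_dose(weight_kg, creatinine_clearance):
    if weight_kg is None or creatinine_clearance is None:
        # The analogue of abrupt autopilot withdrawal -- but with an explicit handoff.
        return Recommendation(None, "manual",
                              "Missing weight or renal function; clinician must dose manually.")
    if creatinine_clearance < 30:
        return Recommendation("Reduce dose 50%", "advisory",
                              "Impaired renal clearance; confirm with pharmacist.")
    return Recommendation("Standard dose", "automatic", "All inputs present and in range.")

print(recommend_dose(None, 45))
```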

3.  Teams: The critical role of teams in solving problems, and the fact that the most junior pilots were blamed for “not acting swiftly enough.”  In the final crash report, considerable attention was focused on the actions of the “PF,” or pilot flying, and on communication with the broader team.

The issues of teamwork during medication ordering, dispensing and administration, or teamwork in the operating room are well known.  The AF447 crash reminds us that the most junior members of our professional team may be in the driver’s seat when things go wrong.  Therefore, HCIT design, implementation and simulation need to explicitly consider what is “swift enough” for the common scenarios in healthcare, especially when the safety index is narrow.

The bottom line here is that, as we rely more on HCIT, staffing needs to be situationally appropriate wherever possible.  Our practices will need to focus more on matching provider skill levels with patient acuity, and on ensuring timely communication of vital information to the most appropriate members of the care team.
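As a thought experiment on what “swift enough” might mean operationally, here is a hypothetical sketch of an escalation policy that moves an unacknowledged critical alert up the care team on a timetable scaled to patient acuity.  The roles and time windows are invented for illustration, not a clinical recommendation.

```python
# Hypothetical sketch of "swift enough": escalate an unacknowledged critical alert
# up the care team on a timetable scaled to patient acuity.
ESCALATION_MINUTES = {"high_acuity": [5, 10], "low_acuity": [30, 120]}
ESCALATION_PATH = ["resident", "attending", "rapid_response_team"]

def who_should_hold_the_alert(acuity, minutes_unacknowledged):
    """Return the role that should now own an unacknowledged alert."""
    level = sum(1 for t in ESCALATION_MINUTES[acuity] if minutes_unacknowledged >= t)
    return ESCALATION_PATH[min(level, len(ESCALATION_PATH) - 1)]

print(who_should_hold_the_alert("high_acuity", 12))   # -> rapid_response_team
print(who_should_hold_the_alert("low_acuity", 12))    # -> resident
```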

4.  Black Box: The role of the black box and comparable logs in general to sort out what happened.  

Can we, in healthcare, expect to use log files to adequately deconstruct sentinel events after they have occurred?  Should we routinely capture (and create transcripts of) the comments of our care teams? 

Over the next decade, the technology for this kind of capture may well evolve.  Dr. David Blumenthal, in an AHRQ interview, called it a “possibility” when describing the deeper implications of wiring American healthcare.

Blumenthal said, “For all I know, we will have such capable natural language processing that people will never look at a keyboard 10 years from now.  Clinicians will just be talking to their patients and the whole thing will be recorded, synthesized, and translated into a medical record effortlessly.”

Although we’re clearly many years away from deep integration of video recording into the patient interview and exam room, a growing number of care events are already being logged today as a byproduct of automation.  The documentation process increasingly captures enough data to reconstruct the sequence of events and to infer the intent of providers and patients.  Will this kind of data be used to evolve toward safer systems in healthcare?  The bottom line is that such data may be essential to ensure a robust care delivery system.
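For the “black box” analogy, the minimal technical ingredient is an append-only, timestamped, structured event log that can later be replayed to reconstruct a sequence of care events.  The sketch below is illustrative; the event fields and file format are assumptions, not a standard.

```python
import json
import time

# Sketch of "black box" style capture: append-only, timestamped, structured records
# that can be replayed to reconstruct a sequence of care events.
def log_event(path, actor, action, detail):
    record = {"ts": time.time(), "actor": actor, "action": action, "detail": detail}
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")    # one JSON record per line, never rewritten

def replay(path):
    with open(path) as f:
        return [json.loads(line) for line in f]

log_event("cds_audit.jsonl", "cds_engine", "alert_fired", {"rule": "renal_dosing"})
log_event("cds_audit.jsonl", "dr_smith", "alert_overridden", {"reason": "dose already adjusted"})
for event in sorted(replay("cds_audit.jsonl"), key=lambda e: e["ts"]):
    print(event["actor"], event["action"])
```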

The conclusions drawn by French regulators are at the crux of the issues surrounding the fatal crash of AF447.  But, I stress again, as in Part 1, that the broader issues—what took place months before and months after—are perhaps even more pertinent to the grand strategic vision for employing HCIT in pursuit of the Triple Aim: better health, better healthcare, and cost control.

In Part 3, we will review the last four factors in the crash as they relate to HCIT CDS, and draw some final conclusions.  In the meantime, what are your thoughts so far?

Joseph I Bormel, MD, MPH

CMO and Vice President

QuadraMed Corporation

[email protected]
