Dr. Dan Low: The data 'every team should be leveraging' for surgical improvements

Researchers at Seattle Children's and the University of Washington in Seattle published a series of papers examining opioid-free surgery in the context of the opioid crisis. 

Dan Low, MD, chief medical officer of MDmetrix and one of the research authors, told Becker's ASC Review how real-world outcomes data can be harnessed to improve quality.

Note: Responses were lightly edited for length and style.

Question: What role does data play in anesthesia quality improvement projects? What kind of data is essential?

Dr. Dan Low: Having access to real-world data from an ASC's own practice environment is critical for both planning and implementing quality improvement projects. In other industries, access to data and data analytics enable rapid improvements. Could you imagine retooling an auto assembly plant to build cars differently without first having visibility into quality and efficiency? In healthcare, ASC leaders and frontline clinicians alike find it challenging to fix what they can't see.

Anesthesiology, in particular, is a specialty that generates large amounts of structured data captured in the EMR. Examples of such routinely collected data elements include drugs administered, doses, routes, timing, recovery time, pain scores and opioid requirements. All surgical specialties, though, generate a great deal of clinical and workflow data. EMR systems, however, were not designed to let clinicians or their leaders easily use the data needed to drive improvement work. So, most ASCs' real-world data is collected but not used.
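To make this concrete, here is a minimal Python sketch of the kind of structured record these routinely collected elements might form. The dataclass and its field names are illustrative assumptions only, not the schema of any particular EMR or of MDmetrix.

```python
# Hypothetical structured anesthesia record; field names are assumptions
# chosen to mirror the data elements named above.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class AnesthesiaRecord:
    case_id: str
    drug: str                 # e.g., "fentanyl" or "dexmedetomidine"
    dose_mcg: float           # dose administered
    route: str                # e.g., "IV" or "intranasal"
    administered_at: datetime # timing of administration
    recovery_minutes: int     # time in the post-anesthesia care unit
    max_pain_score: int       # 0-10 scale
    postop_opioid_given: bool # postoperative opioid requirement


record = AnesthesiaRecord(
    case_id="case-001",
    drug="fentanyl",
    dose_mcg=50.0,
    route="IV",
    administered_at=datetime(2020, 1, 15, 9, 30),
    recovery_minutes=45,
    max_pain_score=3,
    postop_opioid_given=False,
)
print(record)
```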

Q: What does the Plan-Do-Study-Act cycle look like in practice?

DL: Plan: A group starts with the mindset of wanting to improve outcomes for a particular surgery. They typically start by reviewing the evidence in the literature, looking for any innovations and approaches that have been described and might be applicable. They then need data to establish their baseline performance across a "family of measures" (e.g., pain scores, recovery time and postoperative nausea).

Do: The team changes their protocol.

Study: The team reviews the data to determine whether outcomes have improved across the important metrics; a simple sketch of such a comparison appears after the Act step below.

Act: If outcomes improve, the team has the choice to scale the protocol change. If there is no evidence that improvement has occurred, the cycle repeats.
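To illustrate the Study step, here is a minimal Python sketch comparing a baseline cohort with a post-change cohort across a hypothetical family of measures (pain scores, recovery time, postoperative nausea, intraoperative opioid use). The field names and example data are assumptions for illustration only and do not come from the study.

```python
# Compare two cohorts across a hypothetical family of measures.
from statistics import mean


def summarize(cases):
    """Summarize a cohort's outcomes; field names are illustrative."""
    return {
        "mean_pain_score": mean(c["max_pain_score"] for c in cases),
        "mean_recovery_min": mean(c["recovery_minutes"] for c in cases),
        "ponv_rate": sum(c["had_nausea"] for c in cases) / len(cases),
        "intraop_opioid_rate": sum(c["intraop_opioid"] for c in cases) / len(cases),
    }


baseline = [
    {"max_pain_score": 4, "recovery_minutes": 60, "had_nausea": True, "intraop_opioid": True},
    {"max_pain_score": 3, "recovery_minutes": 55, "had_nausea": False, "intraop_opioid": True},
]
post_change = [
    {"max_pain_score": 3, "recovery_minutes": 50, "had_nausea": False, "intraop_opioid": False},
    {"max_pain_score": 2, "recovery_minutes": 45, "had_nausea": False, "intraop_opioid": False},
]

for name, cohort in [("baseline", baseline), ("post-change", post_change)]:
    print(name, summarize(cohort))
```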

Q: How can PDSA be replicated by other anesthesia teams embarking on quality improvement efforts?

DL: Historically, the greatest barrier to launching improvement cycles has been the difficulty of getting visibility into the applicable data. Mining EMR data by analysts is an extremely manual and time-consuming process, often taking six to 12 months. As a result, the activation energy required to ask clinical questions — let alone launch improvement cycles — has been very high, so improvements tend to be slow and infrequent.

As a key part of our study, we used MDmetrix software to reduce our data extraction and visualization time from months to minutes. We were able to shorten our PDSA improvement cycles, allowing us to modify and evaluate our protocols in weeks rather than years. This study is an example of how rapidly a team can learn and improve when they have ready access to real-world data. In this case, we successfully reduced the intraoperative opioid administration rate from 100 percent to zero within a year. All measurable outcome metrics were stable or improved.

Whatever technology a team chooses to use, every team should be leveraging their clinical and workflow data to accelerate their PDSA improvement cycles.
