It's important to have clinical leadership in the room from the beginning while also educating execs on the strengths and limitations of AI and large language models, noted Kaiser Permanente's Brett MacLaren.

This year’s HIMSS conference was awash in generative AI and machine learning innovations and announcements. But while the tech-anchored keynotes and business strategies generated plenty of buzz, many organizations still struggle to operationalize and scale these technologies.
During panel discussions, healthcare executives shared what they have found to be best practices for operationalizing AI and scaling up machine learning initiatives. Many experts emphasized that hybrid growth models are expected and normal as companies grow their tech. They also implored tech teams to have clinical leadership and operators in the room from the beginning to help with model validation.
“Every decade, there’s some breakthrough technology,” said Shawn Wang, chief AI officer at Elevance Health (formerly Anthem). “A lot of companies rise and fall with that technology. Generative AI is that technology. This is not something that will just disappear or flash, make a lot of promises or hype and not meet expectations. This will go bigger and bigger," he said while moderating a panel at the conference on operationalizing AI. Wang began the panel with an experience from his own work. He was presented with the challenge of facilitating informed, novel connections between patients and providers. “Think about it like Match.com,” he said of the algorithm he designed.
Wang shared four tips as part of his "recipe for success" in AI operationalization and scaling. For ingredient one, he said to keep an eye on metrics: customer metrics and AI metrics. Run AI as a business, he said: know who your customer is, create top-down and bottom-up alignment and put the business in the driver’s seat.
His team fell short on that last point by designing their solution from a tech perspective rather than a business perspective; health system management called the team out and they were forced to backtrack, he noted.
Next, focus on the critical few areas where AI has opportunities; start with the problem as opposed to the solution and address the first and last miles of solving the problem. For example, with the matching project, Wang’s team initially failed to include provider feedback to enable better problem-solving.
“You got to monitor the results continuously after deployment and make sure that it continues to meet expectations,” Wang said. “The alignment of key metrics is so important to measure continued success.”
Then, manage AI solutions as products: build an interdisciplinary AI team, create a supply-and-demand loop from innovation to scale and remember that integrating into workflows is necessary to create value. His own team included AI experts, nurses, clinicians and call center stakeholders.
The fourth ingredient, Wang said, concerns the use of large language models. While LLMs are among the biggest recent breakthrough technologies, they must be met with responsible usage and continuous learning, and with an appreciation of the challenge of scaling.
The matching project Wang shared took two years. The system ultimately connected 10,000 new patients with the right primary care providers.
“Often when we dig into a data science opportunity, one of two things happens,” said Derrick Higgins, head of data science and AI solutions at Blue Cross and Blue Shield of Illinois. “One, we find that it's not a data science opportunity. Or two, there are multiple opportunities for using data science.”
When entering the design space, Higgins said the way to bring value most quickly is to integrate with the workflow as it exists. For example, in making claims processing more efficient and accurate, his team created machine learning models to predict when a claim was going to be complex and prone to errors. Without drastically changing workflow, specific sections of claims were highlighted to invite careful review.
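The pattern Higgins described, scoring a claim's likely complexity and highlighting sections for review rather than rerouting the workflow, might look something like the toy sketch below. His team used trained machine learning models; the hand-tuned features, weights and section names here are purely hypothetical.

```python
# Illustrative sketch only: a toy "complexity score" for a claim with
# hypothetical features. A production system would use a trained model.

def complexity_score(claim):
    """Return a 0-1 score estimating how error-prone a claim is."""
    score = 0.0
    if claim.get("line_items", 0) > 10:       # unusually many line items
        score += 0.4
    if claim.get("has_manual_adjustments"):   # human-edited fields
        score += 0.3
    if claim.get("provider_out_of_network"):  # atypical provider
        score += 0.3
    return min(score, 1.0)

def sections_to_review(claim, threshold=0.5):
    """Highlight claim sections for careful review, leaving workflow intact."""
    if complexity_score(claim) < threshold:
        return []
    flags = []  # hypothetical mapping from triggering feature to section
    if claim.get("line_items", 0) > 10:
        flags.append("itemized charges")
    if claim.get("has_manual_adjustments"):
        flags.append("adjustment codes")
    if claim.get("provider_out_of_network"):
        flags.append("provider details")
    return flags
```

The design choice matches Higgins' point: the model's output is a highlight, not a decision, so reviewers keep the workflow they already know.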
It's important to have clinical leadership in the room from the beginning while also educating execs on the strengths and limitations of AI and LLMs, noted Brett MacLaren, Kaiser Permanente’s senior vice president of data and analytics. “When we're putting something into production workflow, we’re very clear about the AI’s level of accuracy and the amount of risk that the physician or clinician is potentially taking,” MacLaren said. “The clinician is still making the call. They're still saying that based on this information, this tool is making it easier for me to make the right decision but it's not making the decision for me.”
Atif Chaughtai, global healthcare market leader at Red Hat, suggested that one way to introduce AI into clinician workflows without creating undue risk is to make the tech additive. One example was a sepsis detection algorithm that looks at lab results, vitals and medications and alerts nurses and physicians when a patient is at risk of developing sepsis. “This is additive technology,” Chaughtai said. “It’s not replacing but supporting physicians’ decisions. The reason they were able to scale and adapt to that is because it provided additional value and reduced the noise for the nurses and the physician at the center. It has saved thousands of lives.”
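To show the "additive alert" pattern, a drastically simplified screen can be built from the classic SIRS criteria (temperature above 38°C or below 36°C, heart rate above 90, respiratory rate above 20, white cell count above 12,000 or below 4,000 per microliter, with two or more suggesting concern). This is not the algorithm Chaughtai described, which would draw on far richer data; field names here are hypothetical, and the output is an alert for the clinician, never a decision.

```python
# Simplified sketch of an additive sepsis screen using SIRS-style criteria.
# A production algorithm would use far richer data and a trained model.

def sirs_criteria_met(vitals):
    """Count how many SIRS-style criteria the patient currently meets."""
    met = 0
    t = vitals.get("temp_c")
    if t is not None and (t > 38.0 or t < 36.0):
        met += 1
    hr = vitals.get("heart_rate")
    if hr is not None and hr > 90:
        met += 1
    rr = vitals.get("resp_rate")
    if rr is not None and rr > 20:
        met += 1
    wbc = vitals.get("wbc_k")  # white cell count, thousands per microliter
    if wbc is not None and (wbc > 12.0 or wbc < 4.0):
        met += 1
    return met

def sepsis_alert(vitals):
    """Alert when two or more criteria are met; the clinician decides."""
    return sirs_criteria_met(vitals) >= 2
```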
In a more indirect use of AI, Sapphire Health used predictive analytics to suggest locations for mobile vaccination stations during the pandemic by highlighting the communities most at risk. Austin Park, CEO and founder of Sapphire Health, said the company was able to leverage cloud infrastructure and preexisting partnerships, including tech partners.
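Sapphire Health's model and data are not public, but the site-selection idea reduces to ranking communities by a composite risk score and sending mobile stations to the top of the list. The factors and weights below are invented for illustration.

```python
# Toy sketch of risk-ranked site selection for mobile vaccination stations.
# Factor names and weights are hypothetical and pre-normalized to 0-1.

def risk_score(community):
    """Composite risk from hypothetical case-rate, age and access factors."""
    return (0.5 * community["case_rate"]
            + 0.3 * community["pct_over_65"]
            + 0.2 * community["low_access"])

def top_sites(communities, n=2):
    """Return the names of the n highest-risk communities."""
    ranked = sorted(communities, key=risk_score, reverse=True)
    return [c["name"] for c in ranked[:n]]
```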
The panelists agreed that the path to implementation of AI comes in fits and starts as problems arise that can be solved with new technology. Hybrid tech architecture is where most health entities are, Chaughtai said. Strategic implementation is paramount as new tech grows.
“We have a lot of other data, it's just not organized the way we need it to be,” Park said. “And so there's a lot of work that we have to do on that.”