rate was somewhere around 85. And the boundary was if it wasn’t clear, send it to the provider. If you’re not sure, that’s the default.” Hasselberg admits that, “Had you come
to me six months ago to a year ago, I would have told you that I was really pessimistic around the applicability in the near term of AI in healthcare, because we have a data problem in healthcare: our data is very dirty and very noisy.” But the success in the past couple of months of this experiment has changed his view.
Experts express caution: it’s still early days

There remains widespread caution among many leaders in healthcare, given the high expectations around AI, and particularly around ChatGPT, and the hype in some quarters.

Aaron Neinstein, M.D., who holds multiple professional titles, is now chief medical officer at the San Mateo, Calif.-based Notable Health, while he remains an associate professor of medicine in the Division of Endocrinology in the School of Medicine at the University of California, San Francisco, and also holds a position on the Health Information Technology Advisory Committee (HITAC) at the Department of Health and Human Services/the Office of the National Coordinator for Health IT (ONC). “My lens and organizing principle or
philosophy around this is that, yes, I’m excited about where AI could evolve forward in the clinical realm in the future. I do think that that’s years away,” Neinstein says. “Right now, we’re getting into appropriate concerns around trust, bias, responsibility—basically, an FDA-regulated territory, for good reason. And I keep coming back to Eric Topol’s quote in his Deep Medicine book; paraphrasing what he said there, we don’t need AI to cure cancer, just to help doctors and patients restore their relationship. We’ve overloaded doctors, nurses, clinic staff: everyone’s totally overwhelmed and burnt out, with too much to do.” Indeed, he says, “Someone did a study finding that if primary care physicians really did
everything they were supposed to do in a day, it would take 30 hours to do. And there’s some element of that in every job in healthcare. So I really see the primary job of AI to help start lifting those burdens away, in all those different places in different workflows. We need to find all those pieces and start using AI there first.” Regarding ChatGPT specifically, Neinstein
emphasizes that “You definitely cannot use it out of the box, because of privacy and security issues. Health systems are starting to deploy more secure versions of GPT,” he comments. He believes that, over time, ChatGPT will be trained for context, and will be used in a number of different situations. “For example, the context of the patient’s record that you’re looking at. For me, it would be useful in helping me generate summaries of patient instructions or patient educational content or material. To do that well, it would need some training in the content of what I usually talk about. And when you lock it down to preserve privacy and security, it won’t do that. So there will sort of have to be mini spinoffs.” Inevitably, Neinstein says, leaders in
patient care organizations will have to integrate the uses of ChatGPT and other