How Doctors Use AI
In the Elsevier survey, 95% of healthcare providers agreed that AI could benefit their practice, but only 16% were actively using it. Limited availability and lack of training are the biggest barriers. “For healthcare providers, AI can be transformative,” says Hasselberg. “But unlike AI generated for consumers, we must base our AI on guidelines and parameters. We need guardrails to make sure we are doing no harm. For routine use in clinical decision-making, we need more time, but we will get there.”
How Patients Use AI
According to a 2019 Pennsylvania State University study, 22% of health consumers had used an online symptom checker in the previous year. Experts believe that number is higher now. For consumers, AI tools are widely available in the form of symptom checkers and chatbots. These have largely replaced old-fashioned online symptom checkers that only offered vague, generic suggestions. Most AI symptom checkers now use a chatbot format, a conversation between a real person and a computer-generated provider. For example, the popular chatbot symptom checker ChatRx allows you to talk with “Dr. Tod.” ChatRx claims to provide “a doctor in your pocket” and “healthcare for modern times.” Today’s AI symptom checkers collect detailed personal health information, review electronic medical records, and can even provide doctors’ notes, prescriptions, and referrals. Popular apps include Ada,
K Health, Babylon, Buoy, Isabel, Symptomate, and WebMD. “These AI-powered
tools are much better than
traditional symptom checkers, but they are still a black box because we don’t know where they are getting their data,” Hasselberg explains. “Are they using data
from the internet, or are they using evidence-based guidelines? There is a lot of bad information on the internet, which can lead to very confident misinformation, what we call AI confabulation or hallucination.”
Accuracy and Risks
Studies suggest that pre-AI symptom checkers were accurate only 30% to 40% of the time. New AI versions are much better, but far from perfect.
A 2024 review in the Journal of Artificial Intelligence & Cloud Computing found AI tools increase access to care, reduce unnecessary doctor visits, and help patients make informed decisions. The American Medical Informatics Association (AMIA)
reviewed Ada, K Health, and Your MD, noting they build patient histories, evaluate symptoms, suggest likely diagnoses, and recommend follow-up testing or care. A 2024 review published in the Journal of Medical Internet Research analyzed the accuracy of the AI-powered symptom checkers Babylon and Ada. Researchers found that Ada scored best, with correct diagnoses about 70% of the time, while Babylon scored closer to 30%, compared to 80% for human doctors.
“Read the small print and consumer disclosure information. Is your patient data being shared? Has the tool been studied for accuracy? Does it say where it gets its data from?”
— Michael Hasselberg, Ph.D.
THE TAKEAWAY: AI tools are helpful, but they are not a substitute for professional care. “Patients need to know that AI is still a machine,” says
Hasselberg. “Clinical medicine is more than algorithms and data points. Human clinicians add experience and intuition, and only humans can do that. Clinical medicine is as much art as science.” Most AI diagnostic tools carry a written warning that their advice does not constitute a true diagnosis and should not replace a diagnosis from a licensed human healthcare provider. Approximately seven out of 10 providers say patients have failed to follow prescribed treatment due to AI-generated misinformation, and more than half report spending office time correcting false or misleading information.
Should You Use AI?
AI tools can be a great starting point, but experts stress the importance of caution. “If you want to use an online AI symptom checker or download a chatbot diagnosis
tool, you should do your homework and due diligence,” Hasselberg advises. “Read the small print and consumer disclosure information. Is your patient data being
shared? Has the tool been studied for accuracy? Does it say where it gets its data from?” Generative AI can be a good starting point for patients. It can help you learn about your condition and make decisions along with your doctor, but it is not a replacement, Hasselberg explains.
DECEMBER 2025 | NEWSMAX MAXLIFE 93