The AI Therapist Will See You Now… Or Will It?
By now, you’ve probably heard the sensational headlines claiming AI is here to replace doctors, therapists, and basically anyone with a pulse. But before we all jump ship on our stethoscopes and DPT degrees, let’s take a moment to see what’s really going on – especially in telehealth and beyond.
The Good: AI as a Co-Pilot
Let’s set the record straight: AI can do a lot of good in healthcare. Teladoc (side note: I’m using Teladoc for a reason that will become obvious; there are TONS of very good AI scribe and other services in healthcare and #physicaltherapy specifically that I have highlighted in the past and hope will reply to this newsletter), for instance, is rolling out AI-driven documentation tools that help overworked providers automate their note-taking, freeing them up to focus on patient care instead of endless paperwork. Hospitals have also found success with AI-powered “Virtual Sitters,” which monitor patients and alert staff to potential falls – an especially big win for patient safety.
But that’s just scratching the surface. AI is making waves in:
• Medical Imaging: Advanced algorithms help radiologists catch diseases like cancer earlier and with more accuracy.
• Dental Imaging: Machine learning keeps dentists in check by helping them diagnose and support treatment plans, with insurance companies likely using those outputs as a basis for reimbursement.
• Diabetic Retinopathy Detection: Automated screenings reduce the risk of blindness for millions of people worldwide.
• Predictive Analytics: Hospitals use AI to manage patient flow, shorten wait times, and allocate resources more efficiently.
• Remote Accountability in Rehab (RAR): As familiarity and acceptance grow, AI-enhanced physical therapy apps and wearables track patient movement, ensuring exercises are done correctly when patients aren’t in the clinic. I call this RAR because it is an adjunct to a physical therapist, not a replacement. However, many are selling remote rehab as a replacement, including plenty of employers who have bought into it as an “MSK benefit” for their employees. I’ve gone into some detail and predictions in a prior post.
• ECG and Cardiac Monitoring: Machine learning systems can flag irregular heart rhythms in real time, aiding cardiologists in faster, more precise diagnoses.
• Medication Management: AI-assisted platforms help pharmacists identify potential drug interactions, making prescriptions safer.
• Genomic Data Analysis: Clinicians can use AI to sift through genetic information and pinpoint personalized treatment plans for conditions ranging from cancer to rare genetic disorders.
In all these scenarios, AI is a helping hand, not a replacement. It’s about saving time, boosting accuracy, and expanding access to care.
The Bad: AI-Powered Therapy Chatbots (That You Didn’t Know Were Bots)
So where do we run into trouble? Enter the recent controversies. Allegations claim that BetterHelp, Teladoc’s mental health branch, has therapists who rely on AI-generated text – without telling patients. As detailed in this report, some therapists have come clean about using ChatGPT to bulk up their replies because their pay structure rewards them for longer messages.
Let’s unpack that. AI itself isn’t the bad guy. The real issue is the business model. When profit margins hinge on churning out more (and longer) responses, AI becomes a quick way to hit a quota. The result? Patients who think they’re getting a carefully crafted response from a human therapist may actually be conversing with AI in disguise.
The Takeaway: AI Isn’t the Problem – People Are
Here’s the heart of it: It’s a lot like the old saying, “Guns don’t kill people, people do.” A tool by itself isn’t inherently dangerous; it’s what people do with it that counts. AI, like a firearm, is neutral until opportunistic companies or individuals decide to bend it toward profit-driven ends. When financial incentives are misaligned and corners get cut, quality of care suffers – and patient trust takes a hit.
AI can help clinicians become more efficient and extend care to more people, but only if it’s implemented ethically. Cutting corners for the sake of profit is where things go off the rails.
Stay Smart, Stay Skeptical
For those of us in physical therapy, AI isn’t gunning for our jobs just yet. But with AI creeping into more corners of healthcare, it’s on us to keep an eye on how it’s used, speak up when it’s misapplied, and ensure it truly serves patients – not just profit margins.
What do you think? Does AI belong in therapy, or have we already gone too far? Let’s talk it out! (real, not AI!)
larry
@physicaltherapy