Putting the Human Back in Healthcare
We’ve probably all been there. You’re talking with a doctor or nurse about a health concern and the medical professional barely looks up from their computer or tablet screen. After you go through all the details of your health issue, they offer advice that doesn’t fully acknowledge your specific concerns. It feels like they weren’t listening.
Tech companies are eager to say this is a problem that AI can solve. There probably is a role for technology to play, whether through transcribing interviews, assisting in diagnoses, or recommending treatment. But the situation also presents an opportunity for human workers in the healthcare space to refocus on what truly differentiates them: empathy.
When AI first started gaining popularity, a common refrain was that it was good at data entry tasks but would never gain traction in sensitive areas like direct communication with patients. But as large language models have improved in their ability to interact with people in natural, sometimes even seemingly empathetic, ways, that's starting to change.
A recent study published in the journal Communication Psychology found that people rate responses to their mental health questions as more compassionate when they come from AI bots compared to actual mental health professionals. This is the case even when respondents know they are talking with an AI bot. On the one hand, this demonstrates the dramatic progress that LLM-based chat tools have made in recent years. On the other hand, it raises the question of why some human counselors don’t match their virtual peers in this most human of qualities.
The report's authors say it's not that human counselors can't be as compassionate and empathetic as AI. It's likely that they're conserving their emotional resources to avoid burnout amid the numerous demands on their time and attention throughout the day. Approached thoughtfully, AI could potentially alleviate some of the burnout concerns for healthcare staff and empower them to bring their full compassion to their jobs. AI could handle some of the simpler tasks, allowing workers to save their emotional resources and apply their deeper skills to more complex cases. In that case, it would be interesting to see how patients rate the compassion level of human mental health workers who are not burdened by administrative tasks or an overflowing list of scheduled patient visits.
We’re already seeing this model play out successfully in other domains. For example, we detailed in this year’s Tech Trends report how engineering and software development teams are rolling out AI code assist tools broadly across their teams. These tools typically handle low-level tasks like patch fixes and updates, which allows experienced developers to spend more time on innovation and other higher-level tasks.
Whether we're talking about healthcare, information technology, or any other industry, AI is most useful as a force multiplier. It can work around the clock and quickly handle simple, straightforward cases.
The key, though, is making sure AI doesn’t become one more thing the already overburdened workers in the healthcare space have to contend with during their day. If AI becomes nothing more than a source of false alarms and erroneous advice, as some healthcare providers are already warning, it will have the opposite effect, leaving medical professionals more burned out and less able to bring their full empathic selves to the job.
Some providers are already striking the right balance, implementing AI tools that are helpful and unobtrusive. I recently visited a doctor and saw firsthand what I think is the model for how AI should work in healthcare. The doctor set a tablet down on a table and told me it was recording our conversation. He explained that an AI tool would transcribe the visit and listen for keywords related to things like medical history or medications used. It would then automatically update any structured data fields in my record, in addition to providing a full transcript of the visit. With the AI tool quietly taking care of all the necessary administrative record-keeping functions in the background, the doctor and I were free to have a conversation as two humans.
-Ed Burns, Deloitte OCTO Journalist