Impact of artificial intelligence-based clinical documentation tools on clinical workflow

Oct. 10, 2025

Clinicians spend a significant amount of time recording clinical notes during and after patient visits. This documentation is an important part of maintaining accurate and up-to-date electronic health records (EHR). But the time spent looking at a computer screen or other device during patient visits can interrupt clinicians' efforts to connect with and focus on their patients.

Ambient listening technology tools, also known as ambient artificial intelligence (AI) medical scribes, use speech recognition, natural language processing and large language models to record and summarize patient-physician conversations. The goal of this type of AI tool is to reduce the burden of clinical documentation and to allow clinicians to spend more time focusing on their patients throughout each visit. While the individual services and features included in the currently available tools vary, most offer the ability to record and summarize patient-physician dialogue, organize it into discrete sections, and integrate this documentation into the EHR.
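
For readers who want a concrete picture of that data flow, the sketch below shows one way such a pipeline is commonly structured: an audio recording is transcribed, a large language model summarizes the transcript into note sections, and the draft is handed off for clinician review before it enters the EHR. This is an illustrative outline only; the function names (transcribe_audio, summarize_with_llm, build_draft_note) and the DraftNote structure are hypothetical placeholders, not the API of any particular product.

```python
# Illustrative sketch of an ambient-scribe pipeline:
# audio -> transcript -> LLM summary organized into note sections -> draft for review.
# All names here are hypothetical placeholders, not a real product's API.

from dataclasses import dataclass, field


@dataclass
class DraftNote:
    """A structured draft note built from a visit transcript."""
    sections: dict = field(default_factory=dict)  # e.g. {"History": "...", "Plan": "..."}
    transcript: str = ""                          # retained so "linked evidence" can be reviewed


def transcribe_audio(audio_bytes: bytes) -> str:
    """Placeholder for the speech-recognition step (assumed component)."""
    raise NotImplementedError("Swap in a real speech-to-text service here.")


def summarize_with_llm(transcript: str) -> dict:
    """Placeholder for the LLM summarization step (assumed component).

    Expected to return a mapping of section name to summarized text, e.g.
    {"History of Present Illness": "...", "Assessment and Plan": "..."}.
    """
    raise NotImplementedError("Swap in a real large-language-model call here.")


def build_draft_note(audio_bytes: bytes) -> DraftNote:
    """Run the full pipeline and return a draft note for clinician review."""
    transcript = transcribe_audio(audio_bytes)
    sections = summarize_with_llm(transcript)
    return DraftNote(sections=sections, transcript=transcript)
```

In practice, the draft produced by a pipeline like this is reviewed and edited by the clinician before it is inserted into the patient's record, which is consistent with the workflow described in the interview below.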

In this Q&A, Paul M. Scholten, M.D., describes how the use of ambient listening technology has impacted his clinical workflow. Dr. Scholten is a physiatrist with subspecialty training in pain medicine who serves as chair of the Mayo Clinic Platform and Innovation within Physical Medicine and Rehabilitation in Rochester, Minnesota.

What were your initial impressions about how this type of tool might fit into your workflow, and what are your thoughts now that you've been using it for several months?

I have to say, when I first heard about these products being in the pipeline, maybe five years ago, I was extremely skeptical of how they might work. One of my concerns was that I'd have to be very diligent about stating which portion of the encounter I was collecting information about. But the tool is actually very good at summarizing things and organizing them. Oftentimes, I'll get to the end of my encounter with a patient and collect some new historical information that gets categorized and placed back in the history where it belongs. It integrates with our EHR, so suggested content from the conversation can be inserted directly into our notes. I've been really impressed with its ability to do that type of task.

Overall, the tool has significantly decreased my cognitive burden. As I've started to use it more frequently, I find myself just fresher for the rest of the day. I'm not so focused on taking a lot of notes. And I don't have to end an encounter with a patient, go back to my computer or workstation to dictate that same story into the computer and then edit it down. By decreasing that note-taking burden, I also think it allows you to give the patient more undivided attention.

What are some of the more-detailed features that you find helpful?

The tool I use identifies which information is important within your conversation with the patient. It does a really good job of taking what can be a very convoluted and disorganized history as described by the patient and summarizing it at a high level in an organized fashion. Also, the tool allows me to review a transcript of the encounter as well as "linked evidence" that was used to produce a specific portion of the summary. This can be helpful if I find gaps in the summary and need to fill in details from my conversation with the patient that I can't remember.

What additional developments are you hoping that this tool will incorporate in the near future?

The tool that I'm using omits what it determines are likely to be medically irrelevant parts of conversations. I think in PM&R, we're oftentimes asking relevant questions about social history and home environment that sometimes get edited out. And so going back and reviewing the note generated by this tool is really important.

The tool also doesn't currently support a verbatim mode that would allow me to dictate into it. And it doesn't understand verbal commands or reconcile what's discussed with the patient with what's already in the EHR. Because I see Spine Center patients, for example, it's important for me to know what injections they've had in the past. Patients will often say, "Oh, I had an epidural," and I want to know the level at which that epidural was done.

Currently, the tool I am using can only generate content based on what is discussed. So if the patient doesn't know something, or it's not brought up or reviewed verbally during that encounter, it's not going to end up in my note. I think adding those types of features is likely to be the next biggest step. And I've heard that there are things in the pipeline to bring that functionality to this tool.

For more information

Refer a patient to Mayo Clinic.