

Hello, BTK listeners. I'm Nicole Peka, a general surgery resident at Emory University and one of the Behind the Knife surgical education fellows. I'm thrilled about today's episode, where we'll be diving into ambient listening AI: how it's currently being used, where it's headed, and how we can improve it.
Now let's introduce our guest. Our first guest is Dr. Samuel Torres Landa Fernandez. Sam attended medical school at UAQ in Mexico. He then moved to Oregon where he completed his residency training at Oregon Health and Science University. Sam is now in Atlanta, Georgia, where he's a minimally invasive surgery fellow at Emory University.
Up next we have Dr. Nick Panagopoulos. Nick completed his medical school at a UT in Greece. He then moved to Atlanta, where he's currently an internal medicine resident at Emory University. And last but not least, Dr. Joe Sharma. Dr. Sharma completed his medical school at the University of Alabama. He then moved to Atlanta, where he attended Emory University for both residency and fellowship training. Dr. Sharma now serves as the McGarrity Professor of Endocrine Surgery at Emory and our Vice Chair of Quality. Dr. Sharma, Sam, Nick, welcome to Behind the Knife. Thank you for having us. Thanks. Thanks for having us. I think the most natural place to start is the outpatient setting. It's easy to envision how we use this in clinic: we have multiple visits, and there are so many notes to write.
Sam, can you tell us a little bit about how you're using ambient listening AI in the clinic? Absolutely. I've been using it for about six months now. I'll go through how my day goes. Before I see a patient, of course, you know, I have a template with a dot phrase, and I fill it out: I review the patients, I look at the images and so forth, to have an idea of what I'm gonna be dealing with.
So right before you go into the room, I click on it and it starts recording. I get into the patient's room, and the first thing I do is explain it to the patient and ask permission, and then you just start your normal conversation with the patient. After the visit, I typically pause the application, go chat with the attending, and then come back, either with the attending or just by myself, and click play so that it continues the recording. At the end of the visit, I just click pause and end, and then there's a period of a few minutes from the time you click end until the recording gets processed in the system for you to upload into the chart. So I finish the visit, I go sit down at the computer, and I have a dot phrase for the history of present illness and a dot phrase for the plan. You have the ability to use it for the physical exam as well.
I don't do it currently, but there's that option. Once you click the dot phrase for the HPI, it basically fills it out. That sounds easy enough. I think the burning question is: how often do you have to change the notes? Are they accurate just based on the recording, or do you have to do a bunch of editing? I have to say that from the time I started using it until now, it has significantly changed, and it has gotten better. Initially, I would have a long conversation and it would, you know, bring up two paragraphs, which was clearly not the conversation that I had, but I've noticed it has gotten significantly better.
The other thing is, of course, you have to change certain things, right? It adds things that maybe you wouldn't include. But it has also caught certain things where I was like, oh, I didn't actually pay attention to that particular thing. So I think for that reason it's useful, but it requires some changing after the fact.
Yeah, in my experience, I feel that there is at least some degree of editing that needs to be done. At this point, I've realized there are some things you can do to troubleshoot or prevent that. For example, you can improve the accuracy of the final output by reiterating, in a clear way, the main points of the patient's story or the main points of the HPI. This really helps the software get a better idea of where you want to focus your note, and it also helps with the clarity of the final output. The other thing I've found helpful is reporting the main findings of the physical exam to the patient while I do the exam. This also helps the software pick it up and organize it in a good way for the note.
Yeah, those are great tips, Nick. So, moving on to the assessment and plan. I've heard that sometimes this is the hardest part, 'cause any diagnosis that's mentioned during the visit ends up in the plan at the end. And sometimes as surgeons, we don't want all those chronic conditions in our plan; we have a very focused plan on what we're managing.
So Sam, how does ambient listening AI do with the assessment and plan? For the plan, basically any diagnosis that you mention during the visit, it creates the diagnosis and creates a plan. You know, if you ask about hypertension or diabetes, for all the diagnoses that you mention, it would create one and then say, you know, continue with current management or something like that.
Where I spend probably the most time is when we're talking about the nuance of surgical decision making. Of course it brings very basic information, and I think that's where I spend the most time changing it. But if I could give you an estimate of how long it takes me from the time that I see the patient to the time that I finish the note, I would say probably a couple minutes. So it gives me the opportunity to finish the note before seeing the next patient, which, you know, I think is one of the biggest values. Yeah, that's great. Notes can really snowball on you if you aren't able to complete them between visits. So being able to do that with ambient listening AI is a huge benefit.
Now, I do wanna compare our surgery notes to the internal medicine notes. I know in surgery we have a very focused and specialized plan based on what we're managing. However, our internal medicine and primary care colleagues are looking to really capture all of those chronic conditions and get a comprehensive picture of the patient.
So, Nick, could you talk a little bit about
how ambient listening AI works for the assessment and plan in the internal medicine clinics? Yeah, this is not an issue for the software. I do, you know, wrap up the visit by mentioning each problem of the assessment and plan separately, giving a brief assessment statement and a brief idea of what we're gonna do about it to the patient.
And at the same time, the software can pick it up and help organize that in a nice way. Great. It sounds like you can really tailor how much information you want in the plan. So, we talked a little bit about how you have to disclose to your patients that you're using ambient listening AI. Are most patients okay with it?
And what's the reaction when you tell them that we're using it in the clinic? The vast majority of them have no problem with it, based on my experience. The typical phrase that I use to let them know is: I want to let you know that I will be using AI software with voice recognition for this visit to help with the documentation, in order to better focus my full attention on you. And most people are fine with that.
Occasionally you might get a follow-up question about the nature of the software, but I haven't had any particular issues explaining it or getting people on board. Yeah, AI is everywhere, so I think people are becoming a little bit more used to it, and it's not surprising to them to see it in the clinics.
Now, Sam, I know you are fluent in Spanish, and you recently discovered that ambient listening AI can interpret and translate the notes. Can you tell us a little bit more about that experience? I had a conversation in Spanish with one of the patients, a full conversation in Spanish. I didn't know the patient was only Spanish speaking until I got into the room. I had pressed play, and then I finished the visit, came out, pressed end, sat down, and it was completely translated, and it translated in an amazing way, like no other interpreter has done. Wow. That's awesome. So that was amazing. And now every time that I have a Spanish-speaking patient, you know, of course I use it. I think that's another amazing thing this has; that was a mind-blowing moment, particularly in Spanish. I think when it translates, it brings in certain quotes that you will end up needing to change. If I say the patient reported something, it puts it in quotes, and then of course you need to delete that. Well, what a fun discovery. My next question for you is, how has ambient listening AI changed your clinic flow?
Do you do things differently now that you're using it? Yeah, I like to finish all the notes by the last patient and have at least everything that I can done. I was the type that, while seeing the patient, would start, you know, typing and so forth so I could be done. And this has really brought me back to just being focused on the patient, talking to them directly, and having the app record it, which I have absolutely loved.
That's great. I really liked what you said about bringing the attention back to the patient. I always feel guilty when I'm working on a note during the visit because I feel like I'm subtracting from my relationship with the patient. But at the same time, the end of the day can be really hectic and it can be really hard if you get behind on notes.
So I think that's a huge advantage of ambient listening AI: it lets us kind of forget about the paperwork and really bring our attention back to the patient. Now, I've heard both of you mention that the ambient listening AI software has changed over time as you use it. So Nick, can you talk a little bit about how you give feedback to the software?
It is really responsive to feedback. So I do like to include one line of feedback after each encounter, and essentially the AI software builds upon that. Let me share an example with you. I like my review of systems to be summarized right after my HPI portion. So I just typed that as feedback to the AI software, and then it just started doing that for all of my encounters, which is super helpful. Yeah, that's awesome. I think the ability of the software to kind of learn and adapt is one of the huge things with ambient listening AI, because, you know, if you were constantly having to change these things, I think it would be a little bit less useful to us moving forward.
Now, Dr. Sharma, I know you've been in practice for a little while, and you actually have really great templates that you use in clinic. I know 'cause I stole them when I was on service. But how does ambient listening AI help compared to the templates? Which one's easier for you to use? Yeah, I think it's a great question, because I think it depends on how robust the template is.
There are things, thyroid, parathyroid, that are very easily templatable, including what you want from a quality perspective, what you want from a risk perspective, what you want from a counseling perspective. But then there are things that come just a little bit to the left or the right of this, like somebody with secondary hyperparathyroidism, or somebody with Graves' disease who also has a malignancy.
At that time, the narrative and the discussion become a little bit more, I would say, involved. And whenever there's a narrative issue, for AI it is just so much better. It's also better than it was. How do I say this in a better way? What I used to think was not possible with ambient listening in the early phase of using it, like the first 20 or 25 patients, it's picking up on those nuances right now. Some of it was critique, some of it was the corrections that I made. So it's picking up on a physical exam feature that I had never had a template entry for: I was like, oh, there's a bruit in the patient's neck, or there's a palpable thrill in the patient's neck, and that's not part of the template. I do measure that; I do look for that, obviously, in somebody with thyrotoxicosis and other things. And it picked up on that, and I was like, oh, that should be part of our note. So it's kind of training me a little bit, not just me training it. And I think that helps when I have somebody with a more nuanced conversation.
So if you look at our template, you know, the plan and the kind of actions after a visit for a parathyroid patient are: we're gonna get a scan. We discuss that the imaging is meant for localization, not for diagnosis. Much of the imaging is not positive or doesn't localize, but that still means you have the disease, and you still need the surgery. Eighty percent of the time, you're gonna have single-gland disease. And I can go through all of this. But when the patient asks another question, why did I get parathyroid disease? Is there a family history of this? There are some genomic issues here, and that's not in my template. AI just does a beautiful job, and those things are actually very important.
When the patient's expectations aren't met, God forbid I fail on somebody, the things that the AI is documenting are in much more detail. The consent process that I'm doing, the elements of the consent, are captured, beyond just 'we discussed risks and benefits,' which is usually what's in my template. The elements of the consent are picked up much better with ambient listening. So it's a hybrid model for me. My templates have gotten a little bit better based upon what I saw in the AI output too, so it's a little bit less useful now, but the minute I get somebody that's outside of that template, I think it shines.
Yeah, that's great. And you know, that's the promise of ambient listening AI: it's picking up those nuances that we might forget when we get to the end of our day and we're going through all the notes. And, you know, just to expand upon this, our first pilot was with ileostomy patients. That was the first outpatient clinic pilot that we did. There were two surgeons involved. If I put up their notes right now from before ambient listening versus about a month into ambient listening, you would see a dramatic change in the ability to pick up nuances: the number of bowel movements, the character of the bowel movements, how concentrated they are.
How often are you having to rehydrate yourself? What times are you taking those medications? Those things were talked about in the previous notes, but now with details and nuances that are actually probably more actionable than just documenting the fact that somebody's taking their Lomotil. The patient reads it too, right? All the notes are digital; all the notes are available to the patient. There is some reaffirmation for the patient that the surgeon is actually caring a little bit more about the details. Yeah, that's a great point. Well, it sounds like I've heard a lot of good feedback on this, but where does ambient listening AI go from here?
What needs to be improved? Do we think it's here to stay in the clinical setting, or do you think it's kind of gonna fade out? I think at this point it's definitely a time saver. I think of it as a helpful scribe that's readily available for everyone. And I do expect the accuracy of the output to improve with time as, you know, the AI gets more updates and its reasoning improves.
Yeah, I'm gonna echo some of the things that Nick said, but I mean, I think it's here to stay, and I think it's here to get better. From the time we started using it to now, it's just gotten better, and I think it's just gonna continue to get better. And the one thing that, you know, I mentioned was the nuance of decision making. If I'm seeing a patient for an inguinal hernia, I think what we're gonna do is straightforward. But when I'm seeing a patient for, you know, a redo paraesophageal hernia with other things, there's more nuance in the decision making. And I think the challenge is that I'm not saying my thought process out loud. The only thing we need to do for this to get better is just say it once; the AI is gonna listen to that, grab that information, and create that output. One other perspective that's important, and again, I bring that perspective since I do quality, is that the stakeholder for a note isn't just the provider and the patient. The stakeholder for the note is medical-legal.
The stakeholder for the note is quality. The stakeholder for the note is billing purposes, and also, at some level, the next provider. So, and again, this is the wish, not the reality today, although some of this is the reality: is your billing improved? The return on investment isn't just that it's saving my time or your time; is it that we are actually doing a better job coding? Are we doing a better job, perhaps, identifying pitfalls? Is this an ability to augment what I know, or knowledge management? I think more and more, we'll be using it to get other value out of it beside just my day-to-day life. Yeah, definitely. I think the next logical step for ambient listening AI is moving into the inpatient setting.
You know, there are also a lot of different notes to write in the inpatient setting; progress notes are kind of different from clinic notes, compared to, you know, consult notes, discharge summaries, all the variety that we do in the inpatient world. Nick, tell me how you've been seeing ambient listening AI being used in the inpatient setting.
Now, there are, I guess, some different challenges when it comes to the inpatient practice of medicine. You might have more complex presentations, more complex problem lists. It's really helpful to try to summarize problem by problem when you talk to the patient, or when you're doing your dictation to the AI, and that can be useful for the organization and the final output. The other thing is that the inpatient setting can be more crowded. I haven't had any issues navigating that when it comes to, you know, using the voice recognition software. And again, what's helpful is to really reiterate the patient's points; essentially, this shifts the focus of the AI to the pertinent information.
Another approach would be to provide feedback to the software and essentially direct it to the right information. I really liked your point about it being a little bit more crowded and disruptive in the inpatient setting. I know, asking around about this, I've heard that some of the ED residents are using it for their notes in the emergency department, which can also be very loud and crazy.
So I'm sure it adapts to that over time, but I definitely think that could be a challenge. Just to clarify, 'cause I've never actually used ambient listening AI: if your patient comes in with a family member and both are talking, will it still identify that as history from the patient, or does it ever get confused about who's the provider in the room?
Yeah, in my experience, it can get confused, yes. What I do, and it seems to be effective, is get all the history first and then summarize, talking closer to the microphone. That seems to help the AI filter in the appropriate information. Yeah, I think that's a great tip for new users who may be struggling with those visits that have multiple people.
So, Nick, I know we haven't used it much in the inpatient setting, but to summarize, what do we think: here to stay, or, you know, needs lots of improvement? Where do we think ambient listening AI is gonna go here? I think it's definitely here to stay. You know, there are unique challenges in the inpatient setting in terms of the complexity of the clinical presentations. Maybe some additional work is needed in terms of the clinical reasoning of the AI. Yeah, and I think back to Dr. Sharma's point as well, you know, sometimes,
especially for the surgeons, we're rounding so early, the lights are off in every room and all of a sudden everything starts to blur together. And you're like, I can't remember if they told me if they felt nauseated or someone else told me that.
And then our notes get really skimpy because we can't exactly remember what people said. So I think this is really gonna help with extracting some of that data and, you know, clarifying things to get a more accurate representation of what the patients are telling us on rounds. Moving on to our next setting.
It should be no surprise that we end up in the operating room, since this is a surgery podcast, but I think this is probably the most exciting application of ambient listening AI that I've heard of. You know, it's a little bit easier to imagine how we can use it in clinic and even in the inpatient setting, but dictating operative notes can be a huge burden on the surgeon and a huge time suck after a long case. So, Dr. Sharma, how have you been using ambient listening AI in the operating room? Let me just tell you about the first couple of ones we did, and I think that will explain it. We didn't know what we were gonna get. We had no idea. We had some experience in the outpatient setting, no experience in the inpatient setting. So for the
first eight to 10 cases, I would say, I just turned it on and let it go. And when I let it go, it generated a document that was very, very, very long. It was amazing the amount of information that came out of there, things that never go in the operative note: anesthesia issues, pressure issues, medication requests, the instruments that I was asking for and when I was asking for them, the debrief over the specimen, over the procedure type, and the potential change in procedures, like where I was planning a right lobectomy and did a total, or I was planning a thyroidectomy and did a thyroidectomy and a parathyroidectomy; those were the types of procedures I used it on. And so when I sat down with the analysts and we went through it, they said, yeah, give us about eight or 10 notes.
Let it record everything; let's go through it and see what you tell us is important and what is not. And I was like, all of this is important, just not for the accuracy of the op note. It's actually important for a bunch of other stakeholders, including billing, where cases were changed. It was important for serious events or missing equipment.
We got so much information that I could not process where we would use it all, but we said, let's just look at it from one perspective, which is the provider perspective of documenting an accurate op note. On average, about 20 to 25 more items were included in the op note done by ambient listening than in the one done by me.
And I had been doing the same operation for over 20 years. At that time, I didn't know if it was me, or ambient listening, or the combination. So then we tried it with a couple of other folks, and it was about 40 to 45 different events in the operating room that are important from an outcome standpoint and from a procedure standpoint.
It's not something that you just use and
it gets better. It actually gives you feedback. Of Samsung things that you need to document better. So I'll give you an example. The superior pole mobilization of the thyroid. It documented things like how I grasped it, what the texture of it was, whether I moved it medially, laterally inferiorly to figure out where the external branch of superior laryngeal nerve was.
Now, the reason why I was speaking these things is because I usually have a trainee or two and we're talking about, Hey, you should do it this way. No, bring that LigaSure this way. You need to move the superior pole down inferiorly so you can actually see the pedicle better. It picked up on all of those things, and in the beginning it picked up on an, it picked up on too many things.
I spent more time correcting it, but now it's not doing that. And so, you know, obviously this is anecdotal, and we can't just say it's better for me versus better for others. But I don't know if you guys realized, but just today the Mayo Clinic published their experience comparing video of an operation to the dictated operative note versus the ambient listening note. And it was 20% more accurate. That means it picked up an extra error that was present, that the video had seen, that the surgeon had not documented. And so I think we're very early on that curve of what it really means. It was fun to read those first eight or 10, or however many we did. I mean, I've never seen a 28-page document come out of an OR, but we had a 28-page single-spaced document. I was like, yeah, there's no way this will work. But it is actually working. Yeah, the surgeon definitely didn't write that 28-page op note, but maybe there are some situations where being a little bit more verbose can be helpful.
My next question for you is a logistics one. I know you said you were talking with the trainees during the case, and that's how the ambient listening AI picked up on the steps. What about the steps that we don't normally talk about? Did you have to dictate where you want your skin incision and how you want the skin closed, some of those things that we do routinely that aren't necessarily talked about in the operating room?
It was kind of interesting. When I do a laparoscopic adrenalectomy, the amount I verbalize is different, and less, than when I'm doing an open thyroidectomy or parathyroidectomy. I don't know why, but it is. So the amount of information it picked up in an adrenal case was a lot less. Trocar placement and things like that are things I don't verbalize; I just ask them to put the trocar here, and that's all it picks up. But somehow for the neck incisions, we talk about each plane, we talk about the platysma layer, let's mobilize the right side, the left side. Maybe that's just my nature. But it definitely had different value for different operations.
I did it for an appendectomy, and it came up with an eight-line note. That was it. Oh wow. We put the trocar in, we found the appendix, we looked, put the other trocars in, we used the LigaSure, we used the stapler, we were done. I promise you, it had no more value than that for that case. It was probably also the middle of the night doing an appendectomy, so I wasn't speaking as much. But it is only looking at verbal information.
There's no other connection; there's no video connection yet, but that's a different conversation for another day, I'm sure. Yeah, and that video might be, you know, a key piece as we move into the future with this. So I guess for now, with your OR notes, templated versus ambient listening AI versus just dictation after the case, which one do you prefer?
I think dictation still requires less correction for me. Sure, the templated one picks up on elements that I often forget to dictate. So ambient is kind of the best of both worlds. I just need to figure out how to minimize the corrections, the reduction that I need to do with an ambient note. I think that will get better, but it still requires a little bit, and again, as I think Sam said, it's not a lot of minutes, but it's about two to three minutes for me deleting a bunch of things that I don't want in there. But it has the right order. It has completely eliminated other conversations and removed colloquial words. I don't wanna say which ones those are, but there are words that sometimes happen in the operating room that were just never there, not even in the first note. So I guess the ambient listening platform had already figured out how to filter those out. The anesthesia conversation is not coming into the note anymore.
So I think it's evolving, but I think the first goal should be accuracy. Yeah. So our last question here: where does it go from here? Is it here to stay in the OR, or do you think it needs some more improvements before we all start using it? So right now I'm using it on my phone.
A phone is not the best microphone in the operating room, so I think the hardware technology has to catch up. ORs are catching up already, by the way; there are ORs already that have video and audio and other things built in. But it is a bigger room, a larger environment from a volume standpoint, than the outpatient setting or the inpatient setting would be.
So I think that technology has to marry up with it, and there is an expense to that. But the phone works; the microphone works well. I also think it will require a validation phase: how much time are we spending correcting things? And again, the paper that I just mentioned specifically talks about that.
And then, last but not least, are we able to leverage automation out of that ambient listening document? If it's still a manual thing for me, then I've not really finished my note yet; I've just gotten the raw information. If we can figure out, and I think we can, how to automate the information coming out of there, from a blood standpoint, from a sepsis standpoint, from a pressor standpoint, from a Foley removal standpoint (a Foley was placed, it needs to be removed within 24 hours; that's our quality goal), I think those are all the next phases: taking that piece of information, and it's a lot of information, and making sure that we can make the next set of connections with it. That's when you know something is here to stay: when it has a multi-stakeholder advantage.
Yeah, the opportunities with this are really endless. And now, Sam, I know you had a really interesting idea about how we can take ambient listening AI operative reports and use them to improve surgical education. Can you tell us a little bit about that? I don't know if this has been done or not, but I think it's worth looking into. We've all had those attendings that have the steps down on a particular surgery, who know how to tell you: grab this instrument in this particular way, move it to the right. And I think having that information, over and over, from those attendings that just have an amazing way of explaining things, and having it in writing in an AI model, would be extremely helpful from a surgical education standpoint. That would be a very useful tool for trainees. That's a great point. Well, this was an exciting conversation. It's really sparked my excitement about the growing role of ambient listening AI and its vast potential for the future. However, it would be remiss of us not to consider the sustainability and ethical implications of AI. It's essential that we develop technology that enables us to use AI safely in the clinical environment while also ensuring its ethical application. As with any emerging technology, we must remain mindful, adaptable, and committed to ongoing learning.
That said, one thing is clear: the future of AI is incredibly promising. So thank you to our guests for sharing their experience and their thoughts today, and thank you to the listeners for tuning in. Until next time, dominate the day.