

Hello, and welcome to this Behind the Knife episode on surgical education. We're the general surgery education team from Cleveland Clinic. I'm Nicole Brooks, a general surgery resident and current surgical education research fellow. And I'm Judith French, the PhD education scientist for the Department of General Surgery.
And I'm Jeremy Lipman, the DIO and Director of Graduate Medical Education here. On today's episode, we'll discuss how precision education can be used to advance our approach to medical education. Precision education aims to provide the best education to the right trainee at the right time. It uses a systematic approach that integrates longitudinal data and analytics to drive interventions addressing the needs of an individual learner in a continuous manner.
With the recent expansion of competency-based surgical education through EPAs, precision education offers a system that is further individualized. This new concept can be overwhelming to grasp, and it can be hard to imagine how to implement it in our own practice. Today, we're joined by an expert in the
field of precision medical education to introduce us to this exciting concept.
Dr. Jesse Burk-Rafel is an assistant professor and hospitalist in the Department of Medicine at NYU Grossman School of Medicine. He also serves as the Director of Research at the NYU Institute for Innovations in Medical Education. He completed medical school at the University of Michigan prior to internal medicine residency at NYU.
Dr. Burk-Rafel is a leading expert in the field of precision medical education. He leads the Precision and Translational Medical Education Laboratory, where research at the intersection of informatics, artificial intelligence, health services, and medical education explores how the educational performance of trainees maps from undergraduate and graduate medical education to clinical outcomes.
His work bridges the classroom-to-bedside divide, aiming to understand the relationship between training and clinical care to develop evidence-based precision training programs with educational interventions that improve
patient care and provide continuous improvement in training. We're excited to welcome you to the show.
Wonderful to be here. Very nice introduction. And this is going to be super fun; I'm excited for this conversation. Well, thanks again for being here, Jesse. We really appreciate it. And like I said, we've all been admirers of your work for some time, and we've all been reading the many articles you've put out in Academic Medicine on this topic.
And we'll put those links on the website for folks to look at for some more background. But, you know, I think people have heard of precision medicine. But what is this idea of precision education? And maybe for the purposes of this podcast, we'll call it precision surgical education. Can you talk about what this is
and give us some examples of how it really comes to be? For sure. Yeah, I mean, you hit the nail on the head, in that this idea is built on the shoulders of giants, on the idea of precision medicine. So that very idea of how we can tailor our medical
interventions and care to the patient in front of us, what would that look like in education?
And you know, to be honest, again, we're building on the shoulders of other higher ed and other spaces that have thought a lot about individualizing to the single learner for quite some time. And I would say that because of the training structures we have in medicine, across medical school, residency, fellowship, and beyond,
there has been a lag in moving from the cohort level to the individual level. So I think, you know, Nicole really said it well: it's that right education, right learner, right time idea. And I think listeners may be like, well, individualization isn't anything fresh and new and different.
And I totally get that. I think we have to recognize we've been personalizing learning programs for a long time. But this idea of creating an actual integrated system, you know, where we use data and
analytics to drive interventions at the level of individuals and then measure the outcomes, whether they're educational or professional outcomes, but also patient outcomes, and then close the loop on the whole thing to actually figure out what works
and what doesn't work. That has not really been realized, as I'm sure we'll talk about. So that's our vision: to try to build that full-thickness loop, ideally ignoring or transcending these arbitrary transition points from medical school into your surgical residency, for example, or from surgery out into practice, and to really make education more efficient.
We see that there's a need for more efficiency; for example, accelerated pathways in medical school demand more efficiency. When we think about surgeons, you know, there's wonderful work coming out of Michigan from Brian George's group, Andrew Krumm, and others, and Justin Dimick, looking at some of the SIMPL data and, you know, how folks get
to competence. That word was thrown out, and we'll talk about it more, but I think there's really an important idea there: there's a limited number of surgery cases you can do in training, and similarly a limited number of diagnoses I can see. So how do we optimize that? How do we optimize not just at the cohort level, but at the individual level, thinking about their performance in cases, across cases, for different patient populations, et cetera?
So that's the impetus. The how is a whole other question. Maybe we could back up a little bit, because like I said, we've all geeked out on all of your research and studies and, you know, we love this stuff. But maybe you could give us an example of a core clerkship student: how would they experience this, and then how does it help them as they transition into residency and practice,
as you noted? Yeah, sure. So we have a number of pilot innovations here at NYU Grossman School of Medicine. You know, one example would be
this Nudge project we're working on with Marc Triola, who runs the Institute for Innovations in Medical Education. This is a project that basically asks the question: could we get educational learning resources in front of trainees, whether students or residents (and we've done innovation on both sides), at the right place and the right time?
And so it surveils for notes that our students are writing or drafting, or for engagement in the chart, to attribute them to an individual patient. And you can imagine other ways to do attribution on the surgical rotation; how would you find which patient they're caring for? Is it case log data? Whatever it is, figure out the patients that someone is caring for, or is on the team with, and then understand: what is that
patient in the hospital for? In our case, I'm a hospitalist, so it's different diagnoses; perhaps in surgery it would be the actual operation. And you can imagine then triggering different resources. In our case, we trigger things like podcasts, actually, in the IM space and the IM clerkships. Our residents get podcasts related to that; in the student space, they get podcasts across all the clerkships just based on the diagnosis codes of the patients. They also get question-bank questions that are tailored to that, and they get other resources. The real question, though, is: is it doing anything? And how do we build this loop? Should we be giving resources on something they just saw,
or should it actually be the thing they've never seen, or that it's been so long since they saw it but we think is so foundational? Or, even a step further, what if we had an actual data model to say, for this particular trainee, based on their prior exam performance by domain and based on their prior exposure to patients,
I think this is the right resource right now? So, just to say, we have some sophistication. I think it's really cool that we're using the EHR to drive low-stakes prompts (we call them nudges) to do fun activities or to do reading, but I think there's even more sophistication to be had. So that's just one example, in the most tangible space, which is resources.
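For listeners who want to picture how something like this could be wired up, here is a minimal sketch of the nudge idea, assuming trainee-to-patient attribution and diagnosis codes are already available. The trainee profile, the diagnosis-to-resource map, and the function names are all hypothetical illustrations, not the actual NYU system.

```python
# Illustrative sketch only; not the actual NYU Nudge implementation.
# Assumes we can already attribute trainees to patients (for example via note
# authorship or chart-access logs) and read each patient's diagnosis codes.
from dataclasses import dataclass, field

@dataclass
class TraineeProfile:
    name: str
    weak_domains: set = field(default_factory=set)    # e.g., flagged by prior exam performance
    seen_diagnoses: set = field(default_factory=set)  # prior clinical exposure, by diagnosis prefix

# Hypothetical mapping from ICD-10 code prefixes to learning resources.
RESOURCE_MAP = {
    "K35": ["Podcast: acute appendicitis", "Question set: RLQ pain"],
    "K80": ["Podcast: biliary disease", "Question set: cholecystitis"],
}

def nudge(trainee, patient_dx_codes):
    """Suggest resources for the patients a trainee is caring for, prioritizing
    diagnoses they have never seen or that map to weaker exam domains."""
    suggestions = []
    for code in patient_dx_codes:
        prefix = code[:3]
        never_seen = prefix not in trainee.seen_diagnoses
        weak_area = prefix in trainee.weak_domains
        if never_seen or weak_area:
            suggestions.extend(RESOURCE_MAP.get(prefix, []))
        trainee.seen_diagnoses.add(prefix)
    return suggestions

# Example: a resident with a weaker biliary-disease domain admits an appendicitis patient.
resident = TraineeProfile(name="PGY-2", weak_domains={"K80"})
print(nudge(resident, ["K35.80"]))  # appendicitis resources, since it is a first exposure
```

The point of the sketch is the decision logic described above: a resource fires when a diagnosis is new to the trainee or maps to a weaker domain, rather than being pushed indiscriminately.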
But you can imagine it could be something very different. So, with the EPA rollout, for example, what if every time a patient left for the OR, that was the trigger for the work-based assessment (the OPA, the observable practice activity assessment, or the SIMPL assessment), linking those together, linking our EHR to our assessments, to our simulation lab, to our educational resources, and reducing the human burden to trigger those things? That is some of the promise there.
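Purely as an illustration of that trigger idea, here is a sketch of an event-driven version. The event shape and the notification helper are invented for this example; they are not a real EHR, EPA, or SIMPL API.

```python
# Illustrative sketch of an event-driven assessment trigger; not a real integration.

def send_assessment_request(assessor, trainee, context):
    # Stand-in for a push notification or email to the assessor.
    print(f"To {assessor}: please complete an assessment of {trainee} ({context})")

def on_ehr_event(event):
    """Called whenever the interface engine emits an event we subscribe to."""
    if event.get("type") == "patient_to_or":
        send_assessment_request(
            assessor=event["attending"],
            trainee=event["resident"],
            context=f"operative assessment for {event['procedure']}",
        )

# Example event, roughly as it might arrive from an interface engine.
on_ehr_event({
    "type": "patient_to_or",
    "attending": "Dr. Attending",
    "resident": "PGY-3 Resident",
    "procedure": "laparoscopic appendectomy",
})
```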
So, conceptual frameworks and learning theories guide precision medical education to ensure the data that is gathered is linked to outcomes. Can you expand on the importance of considering these frameworks when developing precision medical education approaches?
Yeah, I mean, again, I like to say we're building on the shoulders of other things because
you really should not reinvent the wheel, or try to; that would be tough. And so it's so critical, particularly when thinking about implementation, to think about existing frameworks. And so if someone just heard my little spiel on PME, or precision ed, and said, wait a second,
that sounds like a PDSA cycle: well, yeah, absolutely. You know, we thought about PDSA and QI work and how that could inform this. We thought about the master adaptive learner model, which is a model for how adult learners identify gaps and learn. We thought about CBME, competency-based medical education, and competency-based surgical education, certainly. So we're building on those theories;
some of them are theories, some of them are frameworks. In the analytics space, there are existing maturity models in higher ed about how institutions can gain maturity, moving from your traditional backward-looking analytics to more future-looking, predictive analytics.
And then, like I mentioned, in implementation there's so much out there, whether it's implementation science and the Consolidated Framework for Implementation Research, or participatory research ideas and co-production, or nudge strategies like the ones we deploy. The idea with frameworks and theories is always to choose the right one, one that's fit for your purpose.
Sometimes we may need to develop new theory, and I anticipate precision ed will stimulate thought around, well, what is the theory of learning around teams, for example, and context (some of the work that Stefanie Sebok-Syer is doing around interdependence), and thinking about those questions.
So there are spaces where theory, I think, needs to be developed and is being developed, but there are other places where we're certainly building on existing theory and concepts. So precision education relies a lot on what data we're gathering, or the inputs that are driving the
cycle forward.
How do we decide what those inputs should be, and who should be involved in those decisions? Yeah, I mean, such an important question, because garbage in, garbage out, to put it a little less formally, right? And I think this is where trainees, when I introduce this idea, are pretty quick to ask, well, wait a second, what are you using to make models?
What are you using to do analytics? And so I think it comes back to this idea that I sort of mentioned, but I'll bring to the fore here, of co-production and co-creation: you need broad stakeholders. Often that will be learners or trainees in the room, but also educators.
And then sometimes patients: if you're measuring something that affects patients, or the intervention might affect patients, and that's the end goal, why aren't patients in the room when we think about this and what matters? So I think we need to think more broadly in medical education about who belongs in the room.
And it's probably a longer list than our usual sort of community of practice. So broadening those communities of
practice. I think of (and I won't do it justice) the story of the drunken person who loses his keys; the police officer comes over while he's looking under the streetlight at the side of the road and asks, what are you looking for? My keys. Why are you looking there? Well, that's where the light is. And so the idea is, we should not only look where the light is, at what we already have; we need to think about what we're designing for. And that's the core innovation, I think, of CBME and competency-based surgical education.
It starts with where you want to get to in the end: you want every graduate to be ready to provide a certain standard of care, and for that excellence to be provided to every patient. So it starts with the patient care, and the sad reality is that right now we are not meeting that standard for everyone; it's multifactorial, lots of issues.
But when we think about what it is on the input side, then we
need to think about both. Yep: what do we have, what's out there, how do we link it, how do we deal with these continuity issues across the transitions, but also, how do we design other fit-for-purpose stuff? What is traditionally under-assessed? I think we traditionally have really strong assessment of medical knowledge, but maybe less so of technical skills and communication skills, for example. And how could we use technology and other things to increase the density of the data?
Really, the fidelity, density, and volume of data is what's necessary to have precision, right? It's a power issue: one data point doesn't really help you too much. So how do we increase the density of all that data without overburdening assessors and people? And that's what Canada ran into when they did Competence by Design, which is the Canadian effort to do competency-based education with EPAs.
I heard that mentioned, and it sounds like surgery is moving in that direction, and that's wonderful, but I think we have to be very attuned to not just saying, okay, let's do
10 times more assessments that require human observation. That is not going to get us to precision education. So those are some of my thoughts about how we should think about data and data gathering and who should be in the room.
All right. So one of the things that is near and dear to my PhD heart is validity evidence. And you're talking about a lot of different systems coming together. So how do we go about gathering that validity evidence for all of these systems? Well, that's awesome. I mean, I think validity evidence is key, and a lot of the stuff I've described is more educational QI, where we're doing rapid iteration cycles and prototyping. I think as we integrate these tools and these cycles, the outcomes and the actual process itself need to be assessed for validity. So, collecting validity evidence of different types, whether you want to use, like, a Messick framework or a Kane framework, but, you know, thinking about it within a program of assessment.
How does it help that program of assessment? Thinking about more sophisticated assessment designs would be really fun. So we've been thinking about crossover designs, quasi-experimental designs, and rapid RCTs. When you move out of the human world, some of these interventions are fairly electronic and automated;
well, you can test those in parallel and provide different interventions to different trainees, as long as it's appropriately covered. And so it offers some cool opportunities for gathering validity evidence in new and different ways.
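Just to make the study-design idea concrete, here is a hypothetical sketch of trainee-level randomization for an electronic intervention; the arm names and trainee identifiers are invented, and this is not a real trial protocol.

```python
# Hypothetical sketch of randomizing trainees to electronic nudge variants,
# the kind of rapid, embedded comparison described above. Not a real trial protocol.
import random

def assign_arms(trainees, arms, seed=2024):
    """Simple randomization of each trainee to one intervention arm."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {trainee: rng.choice(arms) for trainee in trainees}

assignments = assign_arms(
    trainees=["resident_01", "resident_02", "resident_03", "resident_04"],
    arms=["podcast_nudge", "question_nudge", "usual_curriculum"],
)
for trainee, arm in assignments.items():
    print(trainee, "->", arm)

# Downstream, outcomes (usage, later exam performance by domain, and so on) would be
# compared across arms, or the arms could be crossed over after a washout period.
```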
But it is hard. The evidence we get currently is very low-level, things like usage and satisfaction, and making the linkage to the outcomes we really care about is hard in a complex milieu. But I will say that as we start to think about measures that are measuring
data and behaviors that are, say, more attributable to an individual trainee (so in the resident space, their surgical case performance, surgical videos, you can think about the haptics work that Carla Pugh does, or in the IM space, thinking about ordering behaviors and resident-sensitive quality measures, to use a term from Dan Schumacher and colleagues), as we think about measures that get down to the individual trainee,
then I think we are going to have some measures where we can say, okay, we think we're able to actually measure the outcomes we care about, patient care-related outcomes, at the level of the individual trainee. So then, when we deploy said intervention to address X thing, let's look at how that needle moves.
So it's a few steps: figuring out what needle we want to move, moving to higher-level, sort of higher Kirkpatrick-level, outcomes, but also tying it all together and bringing it back to say, okay, let's iterate. So we've mentioned competency-based
medical education and the recent movement toward it in surgery, with the emphasis on EPAs.
How does precision medical education complement a competency-based framework? Yeah, I mean, you know, I think precision education is very complementary. It is building on that idea, again, of patient care being the key outcome, that we have a criterion standard of care that we care about. I think folks sometimes conflate CBME, or CBSE, with time-variable progression, which is not the case.
And I think time-variable progression has struggled to take off in the United States. You have seen promotion-in-place efforts and pre-attendingship things, but I would encourage programs not to feel constrained by external forces, and to actually think about where their degrees of freedom are for more precision; that's what precision education helps folks do.
So when we think about precision education and get away somewhat from, okay, well, that's all well and good, but I can't promote this resident six months early because I have a roster of patients who need to be cared for, we've been thinking, well, okay, but what's the degree of freedom there? Where is the precision, as far as flexibility in the clinical spaces, in the clinical learning environment?
And how could we get more density around that training to really know whether they're even competent, or have reached a criterion threshold on the competency curves? So that's where it's gotten really fun. So I think they complement each other, in that we're sort of saying, okay, well, on the one hand, CBME hinges on having this
precision of dense assessment data at the level of individual learners, but that has been very difficult to actually achieve with human-based assessments and direct observation. So I have a lot of admiration for CBME, but I have also felt, as someone who dutifully fills out their human evaluations, that I'm filling them out and at the same time thinking, gosh, I don't know,
this is hard for me to do well.
You know, I think there's just so much work to do in the assessment space. And so, all to say, I think that PME as a tool or as an approach could fuel CBME further. And again, I think it will hopefully encourage folks to think about degrees of freedom. So in surgery, what would be the next case, for example, based on the prior cases you've seen and your prior performance?
How do we bring analytics in to help with that? So all of this is heavily reliant on algorithms and machine learning and AI, and, forgive me if I'm wrong, this is beyond just ChatGPT, right? Yeah, yeah, I would not consider precision education to be synonymous with AI. I mean, analytics is definitely a piece of this. I've focused on the sort of human-less-in-the-loop
pieces of this just because that's probably the most unfamiliar to folks. It does not obviate the need for doing EPA or competency-based, you know, milestone assessments, et cetera. That direct observation is still part of programs of assessment and part of the analytic plan, the data coming in and the analytics.
But yes, I think it will require AI, including generative AI, though not exclusively. And we've all seen news stories about, and perhaps even experienced, the problems with equity and bias in some of these systems, with some really undershooting and some overshooting and creating problems. So, you know, we're at a great position right now, where this is all developing. So how do we make sure that there's transparency and that equity is maximized as these systems are developed? Absolutely. So I think there are risks, for sure, anytime we think about moving to sort of more augmented intelligence: AI and other tools, analytics, augmenting our approaches.
It's particularly concerning when you think about training algorithms on training data: if there are historically underrepresented groups in medicine who do not have a lot of data in the data set, and you're not attuned to that, you could very well train models that perpetuate or exacerbate biases.
So I think there is going to need to be real attention here to whether this is turning the needle in a positive direction or a negative direction. I would say there are a few pieces to help folks be attuned to this. One is just the learner co-production, having learners in the room who will call you out so quickly:
wait a second, did you think about this? Wait a second, we're not capturing that. So that co-production piece is key. I think having more diverse educational teams and design teams is also key. So let's not ask for feedback after we've designed the algorithm; let's bring in members of the team who represent the diversity of stakeholders we want opinions from.
We have an Institute for Excellence in Health Equity here that isn't focused on educational equity (it's focused on health equity and health disparities), but they have been an incredible asset for thinking about algorithms and other tools here at NYU Grossman School of Medicine. That attention to the topic, and surveillance, is another key piece.
Algorithms can be designed in a way that they aren't biased initially, but then over time they can drift or change, or the data can change and the context can change. And so, again, that idea of just being attuned to this is going to be key. In our experience, there have been instances where AI has actually been quite helpful for understanding existing inequities in a way that we just didn't understand them before.
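As a toy illustration of what that kind of ongoing surveillance could look like, the sketch below compares a model's mean scores across subgroups and flags gaps above a threshold. The groups, scores, and threshold are made up, and real bias auditing is considerably more involved.

```python
# Toy illustration of algorithm surveillance: compare mean model scores across
# subgroups and flag gaps above a chosen threshold. All values are invented.
from statistics import mean

def flag_score_gaps(scores_by_group, max_gap=0.10):
    """Warn about any pair of groups whose mean scores differ by more than max_gap."""
    means = {group: mean(scores) for group, scores in scores_by_group.items()}
    warnings = []
    groups = list(means)
    for i, first in enumerate(groups):
        for second in groups[i + 1:]:
            gap = abs(means[first] - means[second])
            if gap > max_gap:
                warnings.append(
                    f"Mean score gap of {gap:.2f} between {first} and {second}: review for bias or drift"
                )
    return warnings

print(flag_score_gaps({
    "group_a": [0.82, 0.78, 0.85],
    "group_b": [0.64, 0.70, 0.66],
}))
```

Run on a schedule against fresh data, a check like this is one way to notice the drift described above before it quietly shapes decisions.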
So I think it's important to go into this with equipoise: there is certainly risk of harm, but there's also a real potential for
benefit. And I guess I'm a bit of a tech optimist, but, you know, take our experience in the admissions space, where we designed AI algorithms for screening residency applicants using both structured data and natural language processing of unstructured data. What that project did was, yes, build a model to help us think about how we do screening, but it actually helped us understand our existing process and what was baked into it. Sure, we had some broad-stroke heuristics of what we value and what alignment means for us in the selection process, but it put numbers to what we were actually doing, where there were areas of opportunity, and where there wasn't bias, where there was actually really fantastic
behavior from our program directors, et cetera. So you don't know until you measure, and I think there is opportunity on the flip side to actually counter and address
existing inequities. And the status quo is currently an inequitable system, so I think we need to acknowledge that too.
So in our current state of medical education, the data that comes in is often very siloed; it's within one institution, or within a program, or a national organization, and there's not a lot of sharing of that data. So how do you envision that we can create a multi-directional framework
for this PME system to work? Yeah, it's going to be tough. I am optimistic that we can make progress; I don't think it's like we can flip a switch. I think it happens at different levels. So, collaboratories at the level of institutions: we've seen this with some of the AMA Accelerating Change initiatives, where multiple institutions came together to do an innovation project in the GME space, for example, and share data.
And I think building on that idea: let's not boil the ocean for all surgical education programs; maybe a handful in the Midwest want to come together and work on something, and think about how they share models or share information and de-identify data, et cetera. So, not trying to boil the ocean, but moving beyond single institutions will be really important work in the next five years.
I think other organizational stakeholders (I mentioned the AMA; the AAMC and others) are going to be key. So sometimes it might make more sense for SIMPL, for example, out of Michigan, or the ACGME, to be holding the data and doing the analytics, and thinking about how we provide that back to programs and do feedback loops there.
So I am bullish that there will be areas of innovation; I think we've seen some momentum in that space. And when we think across the transition points, which are really painful because of a sort of deficit of trust during the process, that's a place where specialties like OB-GYN have made real strides and have decided to do their own thing, in a way, to be able to have a
better continuum of data, to be able to share data across the continuum.
Think about coaching and feedback loops at those handovers. So I think that's all exciting. And then the EHR vendors, Epic and Cerner, are sort of the elephant in the room; historically they have been mostly focused on billing and operational needs, but I'm hopeful that in the next 10 years there will be more interest in educational needs, especially as we link education to clinical care in more substantial ways.
So, speaking of the future, where do you envision precision education in 10 years? Hard to say. I hope there's broad buy-in; that's certainly what we're aiming for. I think we really want real impact, and, you know, different people will define impact in this space differently. We would love to move from pilots, for example, to real systems.
So I've described a few small innovations here. But what would a system of PME at a medical school or across a medical school
and their residencies look like, and how do we then scale that more nationally? So I hope to see systems of data-sharing standards and things like federated analytics, which is where the analytics happen at the institutional level, but the analyzed data is then shared up to some centralized level.
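A minimal sketch of that federated idea, assuming each site only ever shares aggregate, de-identified summaries; the sites, metric, and function names are invented for illustration.

```python
# Minimal sketch of federated analytics: each institution computes a local summary,
# and only de-identified aggregates travel to a central coordinator.

def local_summary(case_counts):
    """Runs inside each institution; raw trainee-level data never leaves."""
    return {"n": len(case_counts), "mean_cases": sum(case_counts) / len(case_counts)}

def central_aggregate(summaries):
    """Central coordinator pools the summaries into a weighted overall estimate."""
    total_n = sum(s["n"] for s in summaries)
    return sum(s["mean_cases"] * s["n"] for s in summaries) / total_n

site_a = local_summary([210, 180, 240])  # e.g., graduating-resident case totals at site A
site_b = local_summary([195, 225])       # and at site B
print("Pooled mean case volume:", central_aggregate([site_a, site_b]))
```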
I also think generative AI is going to be everywhere. We're seeing that in the clinical space; it has really changed the game, and it will be everywhere clinically, so it stands to reason that it will be everywhere educationally as well. And it aligns very well with the precision education idea. So, hopefully impact, hopefully changing how we do education, making it more efficient, making it more tailored to each individual learner. But also, I really hope it doesn't just lead to more stress on assessors and more burden on, you know, program administrators; in fact, less.
I think it will give us more
density of data, more density of insight, really, is what we're aiming for about each individual learner, but not at the cost of increasing the burden on assessment and on the people who are doing the measurement. This has been a phenomenal conversation, Jesse. We're really, really grateful that you took the time to share this with us.
We like to leave our listeners with a few key takeaways. We call it our educational timeout: in the operating room, at the beginning of the case, you have to do a timeout, and we always take a break for education to talk about one of the main things we're going to work on in the case, and what our goals are.
So what are three things that our listeners should take away about precision medical education as they move on? Yeah. I mean, I think this paradigm is here, and I think it will grow, because data, AI-based analytics, outcome assessment, you know, linking education to patient care, is here. So I think it's worth engaging with this paradigm.
So that's the first point.
The second point is, you know, we sort of discussed, well, is this something new? Does this add anything? I really do think it will help bring to fruition, in a more meaningful way, ideas like CBME and competency-based surgical education, master adaptive learner ideas, coaching models, et cetera, and bring those to a tangible fruition more widely.
And then the third thing, I guess, is that this is really fun. So, you know, for folks who are thinking, that sounds kind of interesting, where do I get started, though? You just get started, and start small: think about what data you have and what technology you can leverage, building on sort of your existing IT,
what analytics sophistication you have, and build from there, developing literacy and thinking about governance and culture around this kind of work, where it's very much co-production and co-creation with learners, patients, and educators. So just get started. And this can be a great area for trainees, for example, during their surgical research years, to get involved in educational innovation projects and precision education projects.
So I can't wait to collaborate with folks from the surgical community and see where this all goes. Well, thank you. Thank you so very much. We really appreciate you taking the time to be here with us today. Of course.