Surgeons put a tremendous amount of focus on mastering the technical skills and maximizing their knowledge base. Data from the CMPA's medico-legal files suggest that a number of adverse events involving surgeons are related to communication issues, rather than to poor surgical technique or insufficient knowledge. This episode identifies techniques to help physicians working in surgery settings reduce their risk of adverse events in the OR.
Related CMPA Perspective article:
Can intraoperative decisions be diagnostic errors?
https://www.cmpa-acpm.ca/en/advice-publications/browse-articles/2020/can-intraoperative-decisions-be-diagnostic-errors
Announcer: You’re listening to CMPA: Practically Speaking.
Dr. Steven Bellemare: Hello everybody, Steven Bellemare here.
Dr. Yolanda Madarnas: Hi, it’s Yolanda Madarnas.
Steven: Yolanda, what are we going to talk about today?
Yolanda: So we know that surgeons put a tremendous amount of focus on mastering technical skills and maximizing their knowledge base. But it’s also important to note that data from the literature and from our medico-legal files suggest that a number of problems are related not so much to poor surgical technique or insufficient knowledge, but actually to communication issues.
Steven: I’m really glad that we picked that topic because, you know, the issue of diagnostic error isn’t at first glance, a big concern for surgeons, right? After all, the diagnosis is usually made before surgery.
Yolanda: But in fact, it is relevant to surgical practice.
Steven: As a listener, you might be inclined to tell yourself, “Well, I’m a nice person. I’m a good surgeon, so this podcast is not relevant to me.” Truth is, most surgeons in our files are not individuals with disruptive behaviour or poor interpersonal skills, right? The relationship issues can be the result of subtle, unappreciated behaviours. So, I’d encourage you not to go away, but to stick around and listen.
Yolanda: So on that note, let’s talk about what our take-home messages are going to be, Steven.
Steven: Well, the first one I would say is that it’s important to recognize that individual cognitive capacity is limited, and that drawing on the cognitive reserves of the team can be a useful way to counteract that.
Yolanda: Second point, I would say, is the idea of contributing to a culture that allows the free-flow of important information within teams.
Steven: It’s absolutely crucial, I agree. And the third take-home point, of course, is about documentation. What we’d like to say here is that documentation is a form of communication. It’s important to record important information, but also to actually review that documentation, either before or after surgery, or both, depending on the case.
Yolanda: Absolutely. So let’s talk about definitions. I would anticipate there are different definitions of diagnostic error.
Steven: Oh, absolutely. And some are based on diagnosis as the noun, as in the label we give to an illness.
Yolanda: And other definitions are based on diagnosis as the verb.
Steven: Oh, that’s right.
Yolanda: So the reasoning or the process of arriving at a given label. And in fact, the definition [of diagnostic error] from the Institute of Medicine involves both of these. So, to quote, “It’s the failure to establish an accurate and timely explanation of the patient’s health problem, or to communicate that explanation to the patient.”
Steven: Right. There’s no specific definition of diagnostic error in surgery per se, but for our purposes, the concept of diagnosis extends to all of the decisions and the choices made before, during, and after the surgery. You make a choice based on a reasoning process and you could argue that that choice is a diagnosis, right? You’re diagnosing an issue; therefore you’re going to act on it.
Yolanda: Yeah. So an example might be if, in the course of an operation, we mistake one structure for another and convince ourselves that there isn’t a problem. This could, in fact, constitute a diagnostic error.
Steven: Right. So, you know, at CMPA, we’ve looked at our files in surgery and we’ve identified that most problems arise in the OR. And many, when they’re reviewed by the experts after the fact are actually felt to be preventable. Sometimes bad things happen, you know, and we realize that, and there’s nothing that you can do about those kinds of things, right?
Yolanda: Yeah. We’re human.
Steven: We’re human and not absolutely everything can be anticipated, but there are those times, though, when things are preventable and that’s really the focus of this podcast.
Yolanda: So let’s take an example, Steven, to illustrate this.
Steven: Sure.
Yolanda: A surgeon performing a cholecystectomy assisted by a third-year resident.
Steven: Okay.
Yolanda: It’s a difficult cholecystectomy with lots of adhesions. And the surgeon sees bilious fluid in the operative field and says, “We’ve probably perforated the gallbladder.” The R3 says, “I don’t see a hole in the gallbladder”, but nothing more.
Steven: Okay.
Yolanda: The surgeon remains focused on the operative field, continues with the procedure and at the end—after aspirating and suctioning the area—doesn’t see any evidence of persistent greenish fluid and is satisfied with his explanation. The R3 leaves it at that; doesn’t revisit the issue of the gallbladder and doesn’t speak up. Postoperatively, the patient does poorly, develops abdominal pain, becomes septic and is taken back to the operative room. And lo and behold, a perforation of the small bowel was discovered.
Steven: So that fits our diagnostic error definition, actually very well.
Yolanda: [signals agreement]
Steven: And our message here is not “try harder” or “do better,” right? It’s that these issues creep up on us as physicians when we get tunnelled in and lose situational awareness, which is the ability to pick up on information, process it, identify its meaning, and project into the future what it might mean.
Yolanda: We know that hindsight is 20/20. But I’ve certainly spoken with surgeons who’ve told me that in retrospect they wish someone had spoken up with a concern, or they themselves had listened to their inner voice niggling away that something wasn’t quite right.
Steven: Exactly. Think about how often it’s happened in your career, where something happens and in retrospect it makes sense and someone says, “Oh, I had a feeling at that time” and you think to yourself, “Well then why didn’t you just say something?” Right?
Yolanda: [signals agreement]
Steven: That’s exactly what we’re trying to get to here.
Yolanda: So, this takes us to our first take-home point, Steven.
Steven: Right, and that’s the fact that individuals have limited cognitive capacity and teams have more than the individual. Now, cognitive capacity, of course, is that space in your brain, right? We can only handle a certain amount of information at any one time. Once that threshold is surpassed, something’s got to give, right?
Yolanda: [signals agreement]
Steven: We can’t handle it all.
Yolanda: And we recognize that there are a number of factors that could impact our individual or collective cognition. Just to name a few: think of the number of people in the room, the noise levels, our individual stress and the situational stress.
Steven: That’s right, or the fact that we’re hungry, that we’re tired or that we’re running overtime on our OR list, right?
Yolanda: [signals agreement]
Steven: To only name a few. So when cognitive capacity goes down, so does situational awareness.
Yolanda: Yeah. So, this represents a really good opportunity to leverage the team to mitigate any one individual’s decline in cognitive capacity, and to recruit complementary cognition and capacity from the other members of the team.
Steven: Right and that’s when we talk about team situational awareness.
Yolanda: Yeah.
Steven: So practically, though, how can we do that?
Yolanda: So we’ve talked about this in another podcast. You can use huddles and debriefs to demonstrate an openness to collaboration: to acknowledge that we can be vulnerable, that we’re not perfect, and that we appreciate someone stepping in to prevent a mistake from happening.
Steven: Right, and that’s what the surgical safety checklist is all about, in effect. Really, it’s a team debrief that helps safeguard against diagnostic error. In addition, I don’t think we can overemphasize the importance of empowering speaking up. Speaking up is the ability to say what you think, politely and professionally of course, but at the time it happens, and without fear.
Yolanda: Like our gallbladder case.
Steven: Exactly, right? So in our gallbladder example, the resident had some concerns about the gallbladder not being the source of the bile, and he did mention it. He said, “Oh, there’s no hole. I don’t see a hole in the gallbladder,” but only once, and perhaps not in an effective way. So why didn’t he bring up the issue again, or perhaps ask to look at the specimen to see if there was a hole? Maybe he was uncomfortable doing that. Perhaps the surgeon had demonstrated in the past that he wasn’t particularly interested in receiving feedback from his assistants. So as a surgeon, what can you do to encourage people to share their perceptions with you, to actually speak up to you, so that you’re not in that situation where a complication happens down the road and you’re thinking to yourself, “Well then, why didn’t you just speak up to me?”
Yolanda: What if? Yeah. So, I remember a call with a member quite some time ago, actually, who recounted a story from before the time operative sites were routinely marked. He was scrubbing into surgery with a medical student, and the student raised the fact that he had noticed a discrepancy between the history and the OR booking regarding the side of the lesion: one said right and the other said left. And the surgeon actually listened, un-scrubbed, looked it up, and when it still wasn’t clear, actually woke up the patient to confirm which side they were operating on.
Steven: Imagine that, right?
Yolanda: Oh my goodness.
Steven: And some people would say, “Well gosh…” To some people, that’s almost unfathomable. Are we actually going to wake up the patient? But he did the right thing, right?
Yolanda: Absolutely.
Steven: And that’s a great example of a diagnostic error averted.
Yolanda: And a great example of speaking up and listening up.
Steven: And listening up.
Yolanda: In a culture where that’s encouraged and accepted.
Steven: Absolutely, and that is in fact a great link to our take-home point number three.
Yolanda: But what happened to take-home point number two? We’d be skipping ahead, but then again, it’s our podcast. So, we can take poetic license.
Steven: Right. Sure, why not? Let’s do away with the script. What I was intending to say is that take-home point number three is that documentation is a form of communication, and that it’s important to record information, but also to actually review it, because that’s the purpose of recording it, whether preoperatively, postoperatively, or both.
Yolanda: [signals agreement] And it’s important to remember that individuals may not have all of the information required to act, and that different members of the team may actually harbour some of that information.
Steven: Or we may actually fail to appreciate the significance of information, but others might, and then may be in a position to provide a different viewpoint.
Yolanda: Like the R3 in our gallbladder case.
Steven: Right.
Yolanda: So I think what we’re trying to illustrate here is that diagnostic error intraoperatively can be compounded by diagnostic error postoperatively, and that documentation is a key factor in risk reduction.
Steven: Right.
Yolanda: Good versus insufficient documentation. Here we see that diagnostic error can be compounded by inadequate documentation, or mitigated by good documentation.
Steven: How the note is written may actually affect how a clinical presentation is interpreted downstream, right? Documenting uncertainty can help us, or others, manage the postoperative patient by giving us all a second chance to get it right if we missed something initially. For instance, if we’re not 100% sure of an explanation, say, not 100% sure that the bilious fluid is coming from the gallbladder, writing that down can help us revisit it postoperatively, because it might otherwise slip our minds by then.
Yolanda: Yeah. Problem is, though, is that we don’t always listen to that little voice and we’re able to convince ourselves that our explanation was actually good enough.
Steven: Well and that’s not just in surgery, right?
Yolanda: Absolutely.
Steven: Yeah. We see that all the time. For instance, how many times have we been in M&M rounds and heard cases of persistent this or persistent that, which eluded the diagnostician and in the end turned out to be due to something completely different and completely unexpected?
Yolanda: Yep. Right? Like the pneumonia that isn’t resolving that’s in fact a subdiaphragmatic abscess.
Steven: Yeah. So recognizing that we’re all prone to magical thinking, because none of us wants to believe we did something that could have caused harm, can be our saving grace.
Yolanda: So that flows nicely into take-home point number two, Steven. How can we contribute to a culture that allows for free-flow of important information?
Steven: Yes. Creating psychological safety is so important to building a high-performing team.
Yolanda: So you might see something as X, but the rest of the team sees that same thing as Y and together we all get a clearer picture.
Steven: But that’s only provided someone actually speaks up about it.
Yolanda: And someone actually listens.
Steven: So in the end, you know Yolanda, it’s about learning. It’s not about who’s right and who’s wrong. It’s not about who’s the boss and who’s not. It’s really about learning. A colleague of mine was telling me recently how, when he was doing a liver resection, he came across a situation like that. He had an excellent final-year resident whom he knew very well. They were having some trouble getting a margin, and when they got to the final attachment, the surgeon used a linear cutting stapler to divide it. Just as he finished firing the stapler, the resident said, “So why are we dividing the cava?” And then he froze and became presyncopal.
Yolanda: Oh, I can imagine.
Steven: And so he had divided the cava unknowingly, and the resident had watched him do it, not saying anything until after it was done, even though it was clear in the resident’s mind that they were about to divide the cava. So the surgeon tells me he had two options. The usual: blame himself for making a mistake and make a solid promise to himself to be more careful next time and do better.
Yolanda: Which is not an uncommon reflex, right?
Steven: Absolutely not. But he also identified another option, something better that he could do: figure out why the resident was so hesitant to say anything, and what elements might have played a role in his own loss of situational awareness at that time.
Yolanda: So this brings to light the concept of fixed versus growth mindset.
Steven: That’s right and never missing out on an opportunity to learn.
Yolanda: So someone who has a fixed mindset may tend to focus on their mistake and be discouraged, blame themselves or others and that fundamentally hinders the development of better processes to decrease diagnostic error.
Steven: Right. And the person with the growth mindset will instead take that as an opportunity to learn and improve. They accept the error. That’s not to say they throw their hands up and say, “Oh well, that’s life.” They accept the error as a feature of being human, and then they move on, ask “What can I learn from this?” and take it to the next level.
Yolanda: Yeah, exactly. So for instance, a debrief with the team, to lead by example and foster learning, so that we identify ways to prevent a recurrence of events like this.
Steven: Right, and in my friend’s case, that’s exactly what he did. He sat his team down afterward and said, “We need to look at this. How did I end up cutting the vena cava?” He made it very open, people interjected with ideas, and they identified that there had been a lot of unusual chatter and noise in the background, that it had distracted him from his task, and that it likely played a big role. So the next time, they made sure to keep the noise level down, especially at those crucial moments.
Yolanda: So I think it’s important to remind ourselves to leverage our documentation to help prevent an intraoperative mishap from becoming a postoperative patient safety incident.
Steven: Yes, but you know what? We can document ‘til the cows come home. Someone actually has to read it.
Yolanda: Absolutely. So Steven, how about a communication tip?
Steven: Well I’d say that we should leverage daily events, like the surgical safety checklists, our huddles or debriefs, to signal to our team that we’re open to collaboration. That we know that as individuals, we’re vulnerable and that we appreciate someone preventing us from making a mistake that they can see coming.
Yolanda: And that brings us to the end of today’s podcast, Steven.
Steven: We hope this was helpful to you.
Yolanda: Thanks for joining us, today. I’m Yolanda Madarnas.
Steven: And I’m Steven Bellemare, reminding you that if you have any questions, you can certainly call us at the Association and if you have comments or suggestions for future topics, send them along.
Yolanda: You can email us.
Steven: Our address is podcasts@cmpa.org.
Yolanda: And remember, when you change the way you look at things…
Steven: …the things you look at change.
Announcer: These learning materials are for general educational purposes only, and are not intended to provide professional medical or legal advice, nor to constitute a “standard of care” for Canadian health care providers.