Authenticity and the EPA: Resident Perspectives

By: Eusang Ahn (@kookysang), Kori A. LaDonna (@Kori_LaDonna), Jeffrey M. Landreville, Rawad Mcheimech, Warren J. Cheung (@wjcheungem).

Building a Mosaic

EPAs, or Entrustable Professional Activities, are the basic building blocks of assessment within Competence by Design, the Canadian specialist model of Competency-Based Medical Education (CBME). When it comes to resident perspectives on EPAs, authenticity is everything. We discovered this when we conducted a qualitative study that asked our resident participants:

  • Is the sum total of your EPA data reflective of your lived experiences and clinical skillset?
  • What impact on learning do EPAs have?
  • What’s your day-to-day experience with EPAs?
  • What’s your program culture like, when it comes to EPAs?

Photo by Jean-Philippe Delberghe on Unsplash

In theory, EPAs are supposed to serve as a collection of observations that together build a cohesive, high-resolution picture from many small data points, or pixels, much like a mosaic. In practice, however, the EPA experience can generate a scattered, incomplete mosaic of learner performance that is neither as accurate nor as authentic as the actual lived experience.

Photo by Kelly McCrimmon on Unsplash

The residents in our study were split roughly down the middle into two polarized categories: those who believed the EPA system was working, and those who didn’t.

When The System Doesn’t Seem To Be Working

In the group who didn’t feel the system was working, many felt their programs communicated EPAs as just a numbers game, or a checklist whose boxes all needed ticking to pass to the next stage of training. This perception translated into a highly selective pattern of EPA collection, the ‘bad kind of cherry-picking,’ in which residents would only seek EPAs retroactively, once they felt they had already done well on a task. This was often a response to perceiving the assessment mechanism as pass-or-fail, regardless of whether that was true.

Furthermore, residents were more prone to play this numbers game if they felt their faculty hadn’t fully bought in to the system. The EPA collection process itself was described as tedious, where asking for an EPA felt like asking for a favor. All of this was worsened by a pervasive culture of “service over learning,” in which teaching, and by extension assessment and feedback, took a back seat to the daily duties of the profession.

For many residents in this group, the EPA ended up becoming little more than a self-assessment tool: requested, triggered, and filled out by the learner, with faculty at most signing off or making cursory edits.

When It Does Seem To Be Working

Residents in the other group, who felt the EPA system was working, believed that using EPAs as low-stakes observations produced an accurate, high-resolution picture of their lived experiences. They truly felt the main goal of the EPA was formative rather than summative assessment; the EPAs seemed to function as intended, eliciting actionable feedback for the learner. In these programs, residents often alluded to a theme of strong faculty training, which tended to lead to greater perceived faculty buy-in.

And with more training and more buy-in, staff would often pre-emptively ask about and offer EPA assessments before the resident even raised the issue. They also provided these assessments in a more timely manner, which was universally felt to be more valuable. In turn, the workplace culture seemed to pivot away from “service over learning,” as faculty were felt to consider daily EPA assessments an expectation and a duty.

Residents from this group also shared a belief in the need for reciprocity of responsibility: for the EPA system to work, it requires a fully engaged and goal-oriented learner. These residents would often participate in the ‘good kind of cherry-picking,’ actively seeking out EPAs for tasks where they felt they needed improvement, rather than putting on a show to pass. This only happened when they felt supported and safe, which is why even the most mature and self-directed learner was still at the mercy of having an equally engaged faculty supervisor and program administration.

Reclaiming Authenticity

Before implementation, much of the literature’s concern had been about a loss of authenticity through reductionism: the oversimplification of a complex profession into unrealistically small, bite-sized pieces that would not accurately reflect the whole. However, the authenticity problem we uncovered was, and still is, much more complex than mere degree of granularity. It is in fact driven by the dynamic interrelationships among the three key stakeholders (the learner, the faculty supervisor, and the program), and their relative function (or dysfunction!).

The problem is not with the EPA itself, but rather with the difficulty faced by the three key stakeholders tasked with collecting the pieces and putting them together to form a cohesive, authentic mosaic.

Finally, an unexpected finding for us was that many residents expressed a sort of catharsis, or relief, towards the end of the interviews at the opportunity to express their thoughts, air grievances, and offer suggestions. Most of these residents told us they had no satisfactory (or safe!) way of providing upstream feedback to their respective programs, which is obviously troubling.

Photo by Matthew Lancaster on Unsplash

And so with that, I am calling upon all of you as leaders in medical education to reclaim authenticity in your own CBME programs by recognizing, valuing, and re-assessing the interdependent relationships between the key stakeholders on an ongoing basis. The EPA system is only as strong as its weakest link, and it must be firing on all three cylinders not just to work, but to provide true added value to the teaching and learning experience.

About the authors:

Eusang Ahn, MD, MS(MedEd), Dipl. KSEM, FRCPC is currently an Attending Physician with the Department of Emergency Medicine at the University of Ottawa, and a medical educator. As a former advertising account executive, language instructor and current father of twins, he has a strong interest in communication and its role in education. Prior to coming to Canada, he completed a separate residency in Emergency Medicine and was an independently practicing staff physician in Seoul, Korea. Eusang aims to draw on his previous experiences to specialize in cross-cultural dissemination of best practices in health professions education and clinical practice.

Kori A. LaDonna, PhD is a PhD-trained education specialist and Associate Professor in the Department of Innovation in Medical Education (DIME) and in the Department of Medicine at the University of Ottawa, Ottawa, Canada. She is also a Lead – Qualitative Education Research in the Faculty of Medicine, University of Ottawa, and has a strong interest in trainee wellness and health advocacy.

Jeffrey M. Landreville, MD, MMed, FRCPC is an Attending Physician, an Assistant Professor and is the Program Director in the Department of Emergency Medicine at the University of Ottawa. Dr. Landreville completed a fellowship in Medical Education through the Department of Innovation in Medical Education at the University of Ottawa and a Master’s in Medical Education through the University of Dundee, Scotland. He is currently a Clinician Investigator with the Ottawa Hospital Research Institute and has research interests in observation, entrustment, coaching and competency-based medical education.

Rawad Mcheimech, BA is a Research Coordinator in the Department of Innovation in Medical Education (DIME) at the University of Ottawa.

Warren J. Cheung, MD, MMEd, FRCPC, DRCPSC is an Associate Director of Education Innovation, an Associate Professor, and is the Director of Assessment in the Department of Emergency Medicine at the University of Ottawa.

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The University of Ottawa. For more details on our site disclaimers, please see our ‘About’ page.