Ask any program director: Choosing an orthopaedic resident is becoming increasingly complicated. The number of applicants is growing, as is the number of quality candidates. Although one could argue that this is a good problem to have, programs commonly receive—and so must sift through—more than 700 applications [10] just to select candidates to interview. Because it is difficult to sift through that many applications fairly, the result is a large pool of candidates to interview, often 60 to 80 [10]. Programs do their best to select and rank medical students using letters of recommendation, medical school grades, Alpha Omega Alpha (AOA) membership, class standing, research, the quality of the school, and United States Medical Licensing Examination (USMLE) scores [1, 10, 14]. Letters are especially useless; good evidence suggests that readers do not interpret them in the ways the letter-writers intended [6], and there is ferocious “grade inflation” in orthopaedic letters of recommendation: A recent study found that more than one-third of orthopaedic applicants are rated by their letters as being in the top 10% [12]. For that reason, test scores are a big part of the process; about 90% of orthopaedic program directors use the USMLE Step 1 score as one criterion for selecting medical students [1]. This is no accident: The three-digit USMLE score correlates with successful passage of board certification examinations in orthopaedic surgery, neurosurgery, internal medicine, general surgery, emergency medicine, and otolaryngology [4, 5, 7, 9, 13]. While USMLE scores are among the few objective measures available, they were originally intended to provide an objective national assessment of medical students and residents for state licensure; they were not intended to be used for resident selection. But with few objective measures available, medical students gradually came to perceive USMLE scores as more important than any other available measure.
Today, medical students are devoting more time and effort to increasing their numeric scores as a way to distinguish themselves from their fellow residency candidates [11, 16]. Their efforts are paying off, at least in terms of getting better scores: The average USMLE Step 1 scores attained by successful orthopaedic surgery applicants have increased over the last several years [8, 14, 15]. But educators are concerned. In their view, USMLE scores do not assess other important traits, such as motor skills or professionalism [1, 10]. And recently, after a brief public review, the National Board of Medical Examiners (NBME), which administers the USMLE, and the Federation of State Medical Boards (FSMB) agreed to change the reporting of USMLE scores from a three-digit numerical score to pass/fail [11, 16]. The decision was made to “reduce some of the current overemphasis on USMLE performance” [16]. The American Medical Association agreed with the decision, noting that the current residency selection process is “causing significant distress for our students” and that going to pass/fail will help ameliorate that stress [11].

Why the NBME and FSMB Made a Bad Decision

I believe the NBME and FSMB made a mistake, and the hasty rollout of this change will only exaggerate the harm it causes our students. By putting time pressure on programs—the plan now is to move to a pass/fail system this year—the NBME hopes to force programs into rapid revisions of their selection processes. But hope is not a practical strategy for residency programs. In my view, and in my conversations with program directors in other specialties (I’ve heard broad agreement on this), we are still miles away from developing an objective measure of medical student performance, one that correlates with resident and eventual practitioner performance, to replace USMLE Step 1 scores. Anything that helps make the selection process more objective would be of value to residency programs.
But before removing one of the few relatively objective tools for resident selection, the Accreditation Council for Graduate Medical Education (ACGME), American Board of Medical Specialties (ABMS), and Liaison Committee on Medical Education (LCME) should sponsor the study of useful objective metrics in medical student assessment. By removing the numeric Step 1 score now, without additional measures, the FSMB and NBME are only making resident selection more onerous and subjective. As discussed previously in this space [2, 3], residency programs are burdened with requests from the ACGME for additional documentation and structure, often unaccompanied by additional resources. The ACGME’s “six competencies” framework is a good example of this type of unfunded mandate: Programs were forced to develop their own assessments to evaluate performance under the six competencies, with little guidance from the ACGME. While documentation has increased, there is little evidence that these processes have improved either resident education or graduate performance.

Little Support for Pass/Fail

Moving to a pass/fail USMLE means residency programs need to develop additional criteria for residency selection, and they need to do so this year. Given the resource limitations residency programs currently face, developing objective measures to better assess and evaluate medical students for residency is a daunting task. The FSMB and NBME, however, see this as “an important first step toward facilitating broader, system-wide changes to improve the transition from undergraduate to graduate medical education” [11]. But the data supporting the change to a pass/fail Step 1 are limited. And, in fact, medical students, faculty, and program directors do not favor the change to pass/fail.
Specifically, a summary report on USMLE scoring [16] found that only 26% of program directors, 39% of medical school faculty, and 44% of medical students surveyed favored this proposal. There is nothing wrong with using USMLE scores as a “floor” for screening applicants, regardless of specialty. The FSMB and NBME should have worked with the ACGME, ABMS, and LCME to develop a replacement for the USMLE score that is also predictive of passing the American Board of Orthopaedic Surgery (ABOS) Part 1 examination. Instead, the FSMB and NBME hastily removed what is admittedly an imperfect assessment.

Looking Ahead: How Will Programs Adapt?

The decision to make the USMLE pass/fail will force programs to use other measures to assess candidates. Residency programs will still need to be reasonably assured that their graduates will be able to take and pass the ABOS Part 1 certifying examination. For example, clinical surgical rotations accompanied by an examination will become more valuable, and grades for these clinical rotations will take on greater importance. Additionally, the lack of objective scores makes it harder for residency programs to compare candidates from different schools; students from less-preeminent medical schools may therefore be offered fewer interviews or have a lower likelihood of being selected for orthopaedic residencies. Likewise, there may be unintended consequences for residents already in a program. The Orthopaedic In-Training Examination, given annually to residents, may be scrutinized more carefully, particularly in the PGY-1 through PGY-3 years, to ensure that residents can pass the ABOS Part 1 examination. Thus, a tool meant to provide feedback and drive improvement may become a “high-stakes” exam during residency, because programs need to be assured their residents will pass certifying examinations. When accrediting bodies make changes, it should be a priority to show that those changes improve the process or its outcomes.
That did not take place when the FSMB and NBME made the decision to move to pass/fail.
Published in: Clinical Orthopaedics and Related Research
Volume 479, Issue 6, pp. 1194-1196