Prestigious medical schools hide vital information from radiology residency programs

When selecting trainee applicants, should radiology residency programs prefer graduates of big-name medical schools regardless of individual academic performances?

Or would they do better to favor high-achieving students from schools of more modest status?

The question may be moot. Many medical schools that annually rank high in US News & World Report are withholding granular performance data from residency directors, banking on their brand alone to secure desirable residency placements for their students.

Meanwhile, lower-ranked schools are pressured by this dynamic to disclose their students’ academic records fully and without hesitation.

The phenomenon is fleshed out in a study published Jan. 20 in JACR [1].

Conducted by radiology researchers at Duke University School of Medicine (a perennial top-10 finisher in US News rankings), the study builds on prior research showing widespread inconsistency in the depth of data schools supply via Medical Student Performance Evaluations (MSPEs).

The present study follows close on the heels of a recent survey showing that radiology residency directors count on MSPE comparative summaries as the “most impactful” insight for helping them identify strong matches for residency openings [2].

The authors suggest high-ranking schools that keep such information from residency program directors are exercising an “elite privilege” and, in the process, doing a disservice to all stakeholders.  

 

Representative sample, disappointing findings

In the present study, Charles Maxfield, MD, and colleagues reviewed all applications received by a single radiology residency program during the 2021-22 application cycle. The sample comprised 1,046 applications representing 95% of the allopathic and osteopathic medical schools in the U.S.

Classifying the associated MSPEs as either pass/fail (no individualized data or rankings) or multi-tiered (including individualized data and rankings), the team found:

> For preclinical classes, no schools at all in US News’s top 10 shared multi-tiered information. This compared with 17% of schools ranked No. 11 to No. 50; 52% of schools ranked No. 51 to No. 100; and 59% of schools not named in the US News rankings.

> None of the top 10 schools shared comparative assessments. This compared with 56% of schools ranked No. 11 to No. 50; 80% of those ranked No. 51 to No. 100; and 81% of unranked schools.

> For core clinical clerkships, 70% of the top 10 schools shared multi-tiered information. This too compared unfavorably with less-acclaimed institutions: 90% of schools ranked No. 11 to No. 50; 94% of those ranked No. 51 to No. 100; and 94% of unranked schools.  

 

Residency program directors forced to seek ‘alternative measures’ of performance and achievement

In their discussion, Maxfield and colleagues cite a published letter calling for schools to include deep comparative data with all MSPEs. The letter was sent more than 30 years ago to the Association of American Medical Colleges by a high-level AAMC advisory committee.

Maxfield and co-authors underscore the newly heightened importance of voluntary compliance with this recommendation given the recent omission of certain numerical scores from medical licensing exams.

Residency program directors now have to “seek alternative measures of performance and achievement to identify residents who can be successful in their programs,” the authors write. “Radiology residency program directors expect the MSPE to partially fill that void.”

Their review of radiology residency applications in the 2021-22 application cycle, they emphasize, “concurs with prior studies that show variable compliance with the AAMC guidelines.”

The present study further shows that the degree to which medical schools offer comparative student data correlates closely with their rankings in, and the prestige conferred by, US News & World Report:

“Higher ranked schools are less likely than lower ranked and unranked schools to provide comparative data on their students.”

 

Weaker students at top schools might appear to benefit, but initial appearances can be deceiving

Maxfield and co-authors suggest top medical schools’ broad embrace of non-transparency should trouble stakeholders because it has “wide implications” for U.S. healthcare:

“Obfuscation of grades and class ranking hurts high achieving students, including high achieving underrepresented in medicine (URiM) students, and particularly those at lower ranked schools, and it hurts residency program directors, who have less information with which to select residents who are likely to succeed in their program.”

Meanwhile weaker students at top schools “might appear to benefit from this dynamic, but perhaps not if they are an academic mismatch at their chosen residency program,” the authors add.

Maxfield et al. assert that a medical school’s goal for its students “should not simply be to get the best residency program but the best match.”

Given the self-evident soundness of this principle, “it should be in the best interest of all stakeholders”—residency program directors, students, student affairs officers—“for medical schools to provide comprehensive, transparent evaluations to optimize the chances of good matches.”

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
