The Fallacy of Medical School Rankings

Nearly everyone in the academic medical community knows that the U.S. News & World Report (USN & WR) annual rankings of American medical schools (and other higher and professional education programs) are defective.  Scores of journal articles have criticized USN & WR for the flawed methods behind its annual rankings, yet the magazine ignores the academic backlash.  Medical school deans and academic thought leaders acknowledge the dubious quality of the rankings, despite occasional remarks (usually from highly ranked schools) to the effect that, “. . . the USN & WR rankings are all we have . . . there are no other measures to gauge medical school quality.”  I disagree.  Mission-based metrics that address the values and goals of individual medical schools (e.g., public health, research, health care team training, student and faculty diversity, religious convictions, educational excellence and value) are more valid benchmarks of school success.

More than a decade ago, in 2001, Jason Thompson and I concluded our critique of the USN & WR rankings of American medical schools with this statement:

“The annual U.S. News & World report rankings of U.S. medical schools are ill-conceived; are unscientific; are conducted poorly; ignore the value of school accreditation; judge medical school quality from a narrow, elitist perspective; do not consider social and professional outcomes in program quality calculations; and fail to meet basic standards of journalistic ethics. The U.S. medical education community, higher education scholars, the journalism profession, and the public should ignore this annual marketing shell game.”

My conclusions have not changed, despite the staying power of the USN & WR rankings.  One would think that the report by Drs. Dan Tancredi, Klea Bertakis, and Anthony Jerant, “Short-term stability and spread of the U.S. News & World Report Primary Care Medical School Rankings,” in the August 2013 issue of Academic Medicine, together with the accompanying commentary “From Rankings to Mission” by Drs. Darrell Kirch and John Prescott, would close the door on this sad legacy.  These reports demonstrate once again the fallacy of arbitrary medical school rankings.  They advance the argument for mission-based metrics: medical school outcome measures that recognize that leadership can be expressed in many different ways.

Such metrics acknowledge the uniqueness of individual medical schools while serving the universal goal of educating a highly competent medical workforce to meet the health care needs of the nation.  Illustrations include the “health of the public” emphasis at the University of New Mexico; the education of minority physicians at Howard, Morehouse, and the University of Illinois; and the education of medical missionaries at Loma Linda.  These are all special and valued missions that attract the faculty, students, and resources needed to reach their goals.  These and other missions don’t show up in the USN & WR rankings.

But don’t expect the rankings to go away.  The USN & WR medical school rankings will stick around as long as they sell magazines and make money from a gullible public.  This is not an academic, public policy, or altruistic enterprise; it’s all about commercial profit.  Maybe P.T. Barnum was right when he observed, “There’s a sucker born every minute!”  To me, members of the academic medical community who believe that the USN & WR medical school rankings accurately reflect the quality of medical education offered at each school only strengthen and promote Barnum’s claim.

Original article – Academic Medicine Rounds – July 10, 2013
