One might suspect it was sour grapes by a medical school unhappy with its rankings.
The school poured four years of research and almost $10,000 of its own grant funding into a stinging academic critique of the news magazine’s ranking methodology.
“These findings raise questions regarding the ranking’s validity and usefulness,” said the paper, published this month in the journal Academic Medicine.
A primary problem, the paper said, is that the magazine essentially is conducting an opinion survey. This results in wild swings in a medical school’s year-to-year standings.
Take UC Davis, for instance. From 2006 to 2014, it occupied the following slots, respectively: 17, 21, 30, 26, 35, 20, 41, 24, 19.
Dr. Anthony Jerant is the senior author of the paper and has been a family medicine educator for 20 years. He believes schools cannot possibly change so rapidly and drastically as to merit the wide spread seen in UC Davis’ rankings.
“Would you expect to see that big of a shift?” Jerant said. “It makes no sense how we bounced around so many positions.”
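The volatility Jerant describes can be quantified from the rankings listed above. A quick back-of-the-envelope sketch (not part of the published analysis) shows a 24-position swing over nine years:

```python
# UC Davis primary care rankings, 2006-2014, as listed above
rankings = [17, 21, 30, 26, 35, 20, 41, 24, 19]

spread = max(rankings) - min(rankings)  # widest swing: 41 - 17 = 24 positions
mean = sum(rankings) / len(rankings)
variance = sum((r - mean) ** 2 for r in rankings) / len(rankings)
std_dev = variance ** 0.5

print(f"spread: {spread} positions")
print(f"mean rank: {mean:.1f}, std dev: {std_dev:.1f}")
# → spread: 24 positions
# → mean rank: 25.9, std dev: 7.6
```

In other words, the school's year-to-year rank routinely jumped by more than its own standard deviation, which is the kind of churn the paper argues a stable institution should not exhibit.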
The same has happened to many other medical schools. Part of the problem, the paper said, is that the ranking depends on how many people return questionnaires, and that the survey relies largely on reputation rather than pertinent data.
The UC Davis research paper, titled “Short-Term Stability and Spread of the U.S. News and World Report Primary Care Medical School Rankings,” was highly critical. It called one of the magazine’s assumptions “likely spurious.”
The paper is not the first to question the rankings’ reliability. It turns out the lists – which rate colleges, law schools, medical schools and graduate programs – have attracted widespread skepticism.
As far back as the year 2000, Washington Monthly published an article titled “How U.S. News mismeasures higher education and what we can do about it.”
Joining the debate since then have been Cornell University’s Higher Education Research Institute, The New York Times, the Journal of the American Medical Association and a couple dozen other professional and academic publications.
Even U.S. News & World Report published an article in September 2012 titled: “How U.S. News calculates its best colleges rankings: The stats can’t tell you everything you need to know, but they say a lot.”
Robert Morse, who directs the rankings for U.S. News & World Report, defended the magazine's methodology. “The authors of the (UC Davis) article really misunderstood our rankings and have come to some false conclusions based on that misunderstanding,” Morse said in an email.
Mark Henderson, associate dean for admissions and outreach for the UC Davis School of Medicine, said opinion surveys like the magazine’s result in “a very, very subjective and flawed methodology.”
Henderson was not involved in the research project, but emphasized that its purpose was to evaluate the magazine’s rating tools, not to attempt to elevate UC Davis’ reputation.
“I’m not sure the paper paints UC Davis in a better – or worse – light,” he said. “We’ve always been pretty much in the 20s range.”
Jerant said he hopes the research will spur development of meaningful measurements that take into account, for example, how well a medical school serves its community.
Another questionable portion of the magazine’s methodology, Henderson said, is its reliance on students’ grades and test scores to rank schools.
Often, the best primary care doctors are those from diverse socio-economic backgrounds who are committed to community but may not earn the highest marks, he said.
So why, if the methodology is imperfect, is the U.S. News & World Report numeric ranking so universally accepted by the public?
First, there’s nothing else to challenge its prominence in the market, UC Davis officials said.
Second, it satisfies public demand for a tangible way to rate the landscape of the nation’s 125 to 130 medical schools.
Henderson said it has a lot to do with how the brain perceives rankings and lists.
“People like a number,” he said. “They feel comfortable with a numeric ranking, like it has value.”
Witness the recent explosion online of what’s been termed “listicles,” articles that begin with “Ten things you need to know … ” or “Top five tips for today.”
Jerant said consumers, students and faculty should be skeptical of numeric rankings because an accurate rating of medical schools would involve a lot more complexity.
Still, he said, there’s a cognitive effect that immediately kicks in when people see lists ranked 1 to 10, for example.
“If you’re trying to influence students, numbers may work because there’s a real human tendency to create order in the brain,” he said.