In 1999, Yongjun Huang was graduating from high school in Xindu, China, and looking to study mechanical engineering. Now a Georgia Tech graduate, he says he "compared dozens of schools" in China and the US with the only tools available: "traditional reputation" and the college rankings in US News & World Report and the Princeton Review.
But by the time another young Xindu student was looking to study abroad in 2009, the rankings industry had boomed, says Qiuyun Wang, whose parents expected her "to go to a world-famous college." A dozen rankings focused solely on the US and about 10 compared universities around the globe.
The Huangs and the Wangs aren't the only ones who care about the ratings. When the US lost four slots in the top 200 of the Times Higher Education World University Rankings, and Asia claimed five more spots in 2009, headline writers were quick to declare the US was losing dominance in higher education. Universities and colleges monitor rankings as carefully as TV networks monitor the Nielsens, and when institutions seek partners for joint programs or exchanges, they check to make sure they don't marry down.
But for all the reliance on rankings, there is little consensus on what they mean.
"[T]here is a limited number of measures that are genuinely useful across nations," says Phil Baty, editor of the Times Higher Education rankings, a leader in the field based in London. He says it's nearly impossible to get hard data about quality of research and teaching, but there are "proxies."
For the Academic Ranking of World Universities, out of Shanghai, one "proxy" is the number of graduates who later won prestigious awards, such as Nobel prizes (Harvard University is top). The Webometrics Ranking of World's Universities, compiled by the Spanish National Research Council, considers an institution's Web presence – e-journals, archived documents, Internet-based scholarly exchanges – as a gauge of research performance. (Harvard, again, is top.)
Mr. Baty's group looks at the volume of papers published per faculty member to gauge brainpower, and at how many times an institution's research is cited to gauge its importance. (Overall top: Harvard.) Even this straightforward approach has pitfalls. Not every discipline generates the same volume of research papers and citations, Baty explains, so this favors institutions heavy in the hard sciences and medicine and "really disadvantages" some excellent US liberal arts colleges. His group now weights data differently, and expects places like Stanford (which ranked 16th behind Cornell, Duke, and Johns Hopkins in 2009) to rise.
Assessing teaching quality is trickier: Nobody yet has a way to compare what students in different systems have learned, says Baty, so the "proxy" is to ask faculty to rate institutions. It's controversial because there's typically no oversight of who actually fills out the surveys. To improve reliability, Baty's organization is targeting select senior professors with "action-based questions," he explains. "Rather than saying 'Where is the best English or medicine department,' we're asking 'If you were sending your talented undergraduate for further study, which institution would you choose?'"
Baty believes this will inform future rankings. Still, there will always be those who, like Miss Wang, place higher value on their own priorities than on rankings. She decided early on that she wanted a small women's college and, despite her parents' entreaties, chose Agnes Scott College in Georgia (59th in US News & World Report listings of liberal arts colleges and not noted in world rankings).