The QS World University Ranking, explained

The QS World University Ranking has a reputation for being inaccurate. However, I think there’s some value in understanding it, as it is now the most widely followed ranking among the general Chinese public, as well as among the HR departments of companies in China.

There are several potential reasons to care about this ranking:

  • You need to obtain a hukou (落户, local household registration) in Shanghai or Beijing, and certain hukou policies reference the QS ranking.
  • Your target employers have a “target school” list based on the QS ranking.
  • Your family follows the QS ranking, and cares whether you go to a “Global Top X” university.
  • You want to use the QS ranking to find the best possible school/program to maximize your educational and/or employment outcomes.

I’m going to skip the first three reasons because everyone’s specific situation is different. The last reason, however, should be legitimate for anyone pursuing studies abroad. Thus, I’ll discuss the QS ranking from the perspective of how “accurately” it ranks programs with regard to facilitating your educational and/or employment outcomes.


Ranking criteria

Let’s start by understanding the QS World University Ranking’s methodology. There are six criteria with different weights that total 100%; all other factors have no impact on the ranking. The six criteria are listed below, with a small scoring sketch after the list:

  1. Academic reputation (40%). QS sends out an Academic Survey that collects the “expert opinions of over 130,000 individuals in higher education” each year regarding teaching and research quality at the world’s universities.
  2. Employer reputation (10%). QS also sends out an Employer Survey that collects responses from 75,000 employers who “identify those institutions from which they source the most competent, innovative, effective graduates.”
  3. Faculty/Student ratio (20%). According to QS, this is the best metric for teaching quality.
  4. Citations per faculty (20%). According to QS, research output has the same importance as teaching quality.
  5. International faculty ratio/international student ratio (5% each). Diversity and a global outlook are accounted for in these two metrics.
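
To make the weighting arithmetic concrete, here’s a minimal sketch of how a composite score could be computed from these six criteria. The weights come from the list above; the indicator values are hypothetical placeholders (I’m assuming each indicator has already been normalized to a 0–100 scale before weighting):

```python
# Weights from the QS methodology described above (they sum to 1.0).
WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def qs_score(indicators: dict[str, float]) -> float:
    """Weighted sum of the six indicators, each pre-normalized to 0-100."""
    return sum(WEIGHTS[name] * value for name, value in indicators.items())

# Hypothetical example: a school strong on reputation, weaker on diversity.
example = {
    "academic_reputation": 92.0,
    "employer_reputation": 85.0,
    "faculty_student_ratio": 70.0,
    "citations_per_faculty": 65.0,
    "international_faculty": 40.0,
    "international_students": 35.0,
}
print(f"Composite score: {qs_score(example):.2f}")  # ≈ 76.05
```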

I’ll point out what I believe to be three flaws in the above methodology.

Flaw #1: Regional bias

Half of the total weighting is assigned to the two reputation metrics, which suffer from regional bias.

To give an example of this, let’s say you asked 100 random people in China to list all the Chinese universities they know and all the American universities they know. Then suppose you did the same with 100 random people in the United States. You’d find that the Chinese respondents can name many more Chinese universities than American ones, while the American respondents can name many more American universities than Chinese ones.

Though this bias may be less pronounced among academics, it’s almost certainly still a significant factor. Outside of the undisputed global powerhouses (i.e., Harvard, Stanford, Oxford, Cambridge, and MIT - the top 5 in the QS ranking), opinions diverge widely, especially since there isn’t a clear standard for comparison.

To give an analogy, it’s like asking 100,000 sports professionals to list the top 100 athletes of all time. We might get a very clear top 5 or top 10 that includes the greatest of all time in each major sport - Pelé, Michael Jordan, Tiger Woods, Wayne Gretzky, Usain Bolt, Muhammad Ali, Michael Phelps, etc. However, there are likely 100,000 different lists for #11 through #100. Who’s the better athlete - Kobe Bryant or Carl Lewis? Mickey Mantle or Joe Frazier? Pete Sampras or Joe Montana? Who knows. It probably depends on whether you’re asking a fan of basketball, track and field, baseball, boxing, tennis, or American football.

Flaw #2: Absence of graduate outcomes criteria

I expect that the vast majority of university students and alumni would agree that a school’s graduate outcomes are one of the major ways of determining the quality of its education and resources. What percentage of students begin careers in high-paying jobs? How many go on to graduate school and positions in academia? How many start successful companies?

I’d argue that metrics such as number of Nobel Prizes won by graduates and number of billionaire alumni may be more relevant indicators of a school’s quality, since they’re proxies for assessing a school’s ability to produce outlier success - the kind that drives mankind forward. (To ensure fair comparisons, we could control for variables such as institution age and number of alumni, which if unadjusted would give an advantage to older and larger universities.)
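
As a sketch of what such an adjustment could look like, the snippet below normalizes a raw count of outlier outcomes (Nobel laureates, billionaire alumni, etc.) by alumni pool size and institutional age. The schools and figures are made up, and the specific normalization - outcomes per 100,000 alumni per century of existence - is just one plausible choice:

```python
# Hypothetical adjustment: outlier outcomes per 100,000 alumni per century,
# so older and larger institutions don't win on raw counts alone.
def adjusted_outlier_rate(outlier_count: int, total_alumni: int, age_years: int) -> float:
    per_capita = outlier_count / total_alumni      # outcomes per alumnus
    per_century = per_capita / (age_years / 100)   # normalize by institution age
    return per_century * 100_000                   # scale for readability

# Made-up figures for two schools of very different age and size.
old_large = adjusted_outlier_rate(outlier_count=160, total_alumni=400_000, age_years=380)
young_small = adjusted_outlier_rate(outlier_count=12, total_alumni=30_000, age_years=60)
print(f"Old/large school:   {old_large:.1f}")    # ≈ 10.5
print(f"Young/small school: {young_small:.1f}")  # ≈ 66.7
```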

Flaw #3: Absence of admissions selectivity criteria

How hard is it to get in? This is a good proxy for a school’s desirability in the eyes of applicants. A potential objection is that schools in different regions have different admissions requirements; we could address this by sticking to globally standardized measures. For example, I think it would be fair to compare the IB averages of matriculating undergraduate students or the GRE averages of matriculating graduate students, as sketched below.
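
To illustrate the mechanics, here’s a small sketch that puts hypothetical matriculant GRE averages from schools in different regions on a common scale via z-scores. The test-wide mean and standard deviation below are placeholder assumptions (not official ETS figures), and the schools are invented:

```python
# Assumed test-wide statistics (placeholders, not official ETS figures).
GRE_MEAN, GRE_SD = 155.0, 9.0

def selectivity_z(avg_score: float, mean: float = GRE_MEAN, sd: float = GRE_SD) -> float:
    """Standard deviations a school's matriculant average sits above the test mean."""
    return (avg_score - mean) / sd

# Hypothetical matriculant averages at three schools in different regions.
schools = {"School A (US)": 167.0, "School B (Asia)": 169.0, "School C (Europe)": 163.0}
for name, avg in sorted(schools.items(), key=lambda kv: -selectivity_z(kv[1])):
    print(f"{name}: z = {selectivity_z(avg):+.2f}")
```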


Given the QS ranking’s flaws, how could we still leverage it to determine the approximate “quality” of individual schools? My suggestions are below.

Separate by region for better results

While the overall ranking might seem inaccurate, separating schools by region produces much more useful lists. The reason is that, outside the top 5 global universities, regional (relative) strength matters more than absolute strength.

For example, the National University of Singapore enjoys a reputation in Singapore that’s likely better than that of any global university except the likes of Harvard, Stanford, and Oxbridge. NUS is a “target school” for essentially all employers in Singapore. However, in the United States, an NUS graduate may have little to no advantage over a graduate of a local state university.
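
Mechanically, regional separation is just a re-ranking within each region. Here’s a minimal sketch using pandas on a hypothetical slice of ranking data (schools, regions, and ranks are all illustrative):

```python
import pandas as pd

# Hypothetical slice of a global ranking table (ranks are illustrative).
df = pd.DataFrame({
    "school": ["School A", "School B", "School C", "School D", "School E", "School F"],
    "region": ["US", "UK", "Asia", "US", "Asia", "UK"],
    "global_rank": [8, 15, 21, 34, 47, 62],
})

# Re-rank within each region: a school's regional standing is often more
# informative to local employers than its global rank.
df["regional_rank"] = df.groupby("region")["global_rank"].rank().astype(int)
print(df.sort_values(["region", "regional_rank"]))
```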

Leverage LinkedIn

For your target school/program, you can use LinkedIn to research where graduates go after their studies. What percentage enter lucrative careers or pursue graduate studies at top global universities? Some universities (especially their business schools) even publish comprehensive employment outcomes for certain programs.

Be aware of omissions

The QS World University Ranking does not include standalone business schools such as London Business School and INSEAD, nor France’s grandes écoles (e.g., HEC Paris). These institutions offer only graduate programs, which may be why they’re excluded. If you’re applying to graduate programs, you can refer to more specialized rankings such as the QS Business Masters ranking or the QS Global MBA ranking.

The overall QS World University Ranking also doesn’t include American liberal arts colleges such as Williams, Amherst, and Swarthmore. In some ways, they’re the opposite of the business schools: liberal arts colleges focus on undergraduate education, offering few (if any) graduate programs. If you’re interested in applying to these institutions, you can refer to the liberal arts college rankings published by U.S. News.


“Mistakes” in the QS ranking

As long as we separate the QS World University Ranking by geographic region, I think it’s reasonably accurate. However, there are some notable exceptions:

  • Dartmouth (#191). This is perhaps the most egregious example: an Ivy League institution (even the least well-known one) should not be ranked behind 190 other universities.
  • Almost no one in the US would realistically choose to attend UC San Diego (#48) over Duke (#52) or Brown (#60).
  • Almost no one in the UK would realistically choose to attend Edinburgh (#16), Manchester (#27), or KCL (#35) over LSE (#49).