The different world university rankings are somewhat controversial. Their methodologies are often dissected to expose flaws, and universities often dislike them because of the influence they have come to wield. No single ranking has been hailed as "the" definitive ranking, which is why we have so many. If your institution is well ranked in one, you tend to consider that one good, as opposed to the poorly done ranking that has you at the bottom of the scale, whose selection criteria or point system are just plain wrong.
So how do you make heads or tails of all of this? Don’t worry, we won’t try. We simply decided to take all of them and average out the rankings to see who is where.
How did we do this? First we identified world rankings; a good list can be found on Wikipedia. We only kept the world (not regional) rankings that list at least 400 universities and date from 2011 or later.
The remaining 9 are:
- Academic Ranking of World Universities (ARWU) - commonly known as the Shanghai Ranking – from China - 2012
- G-factor – from the USA - 2011
- HEEACT - Ranking of Scientific Papers – from Taiwan - 2012
- Leiden Ranking – from the Netherlands - 2011/2012
- QS World University Rankings – from the United Kingdom - 2012
- SCImago Institutions Rankings – from Spain - 2012
- Times Higher Education World University Rankings (or THE World University Rankings) – from the United Kingdom - 2012-13
- University Ranking By Academic Performance (UR by AP) – from Turkey - 2012
- Webometrics – from Spain - 2013
We compiled the data, matched up the universities and averaged out the ranks. Only the universities that appear in all 9 rankings are in this list.
For those with a keen eye, a few notes:
- In the G-factor column, the #6 is missing. It is actually ranked 122nd.
- In the Leiden column, the #4 only has 8 rankings (it is absent from the Times Higher Education ranking) and the #10 is ranked 119th.
- In the THE column, there are two #2s. This is not an error; it's a tie in the Times Higher Education ranking.
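The compile-match-average procedure described above can be sketched in a few lines of Python. The university names and rank numbers below are invented purely for illustration; the real list was built from the nine rankings named earlier.

```python
# Sketch of the methodology: keep only universities present in every
# ranking, then average their ranks. All data here is made up.
rankings = {
    "RankingA": {"Univ X": 1, "Univ Y": 2, "Univ Z": 3},
    "RankingB": {"Univ Z": 1, "Univ Y": 2, "Univ X": 3},
    "RankingC": {"Univ Y": 2, "Univ X": 4, "Univ Z": 5},
}

# Match up the universities: keep only those on every ranking.
common = set.intersection(*(set(r) for r in rankings.values()))

# Average each university's rank across all rankings.
averages = {u: sum(r[u] for r in rankings.values()) / len(rankings)
            for u in common}

# Print the combined list, best average first.
for uni, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{uni}: {avg:.2f}")
```

Ties in an individual ranking (like the two #2s noted above) need no special handling here, since each tied university simply contributes the same rank number to its own average.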