Limited numbers: what university rankings can (and can’t) tell us

The Conversation    |    4 October 2012

The release of The Times Higher Education World University Rankings will be welcomed by many people in the Australian university sector.

Australia now has eight universities in the top 200, one more than last year, with the University of Adelaide joining this top grouping at 176.

Six of this group improved their positions, with Melbourne University rising to 28 (up from 37 last year), while ANU moved from 38 to 37.

The other Australian universities were: Sydney University (62), Queensland University (65), UNSW (85), Monash (99) and UWA (190).

Australian institution             2011-12 rank   2012-13 rank
University of Melbourne            37             28
Australian National University     38             37
University of Sydney               58             62
University of Queensland           74             65
University of New South Wales      173            85
Monash University                  117            99
University of Adelaide             201-225        176
University of Western Australia    189            190


But an important story here is not how well individual universities have done, but what the rankings say about the sector overall.


What goes into the rankings ‘sausage’

While this is generally good news for Australian universities, we should view the results carefully and be mindful of the limitations in the story they tell us. The challenge of rankings is recognising their value without using them in perverse ways.

The Times Higher Education (THE) ranking was developed in 2004, based initially on surveys of reputation, staffing ratios, research and other indicators. In recent years, the majority of the ranking (around 70%) has been derived from research-related indicators, predominantly research reputation, citations and funding. The rest of the ranking captures teaching, learning and a few other markers.

This creates limits in what the rankings can tell us. The reliance on surveys means that "perceptions" can distort the story that (more) objective indicators, such as citations, tell us about an institution. The rankings become a delicate mix of hard data and the ingrained prejudices of the world's academics.

A second issue is the (necessarily) retrospective nature of much of the hard data. Citation measures pick up research that often began a decade or longer ago, and so fail to tell us much about important research just building momentum now.

The bigger picture

Despite the limitations in what any ranking can tell us, they still have an important story to tell. It is not what individual institutions have done in the past, or how their peers view them, but rather that the Australian system is doing well, and in this way "punches above its weight" (to use the obligatory phrase since Les Darcy when discussing anything Australia does well in the international arena).

Like any imperfect proxy, the rankings of individual institutions hint at the health of the system overall, even if there are inevitable instances where we can do better. As the Times press release reflects, Australia does well on the average movement of its top 200 institutions, with our universities from this top group rising an average of 15 places.

The other established (though larger) systems in the US and UK saw institutions fall in the rankings.

Australian universities have also joined the growing number of Asia-Pacific institutions appearing in the top rankings, challenging the position of the long-established university systems around the world. This suggests that, on reputation at least, Australian academics are increasingly well regarded.

Using rankings wisely

The trick then is to appreciate that rankings can be a useful proxy, giving us a partial picture of what is going on at universities. But they don’t yet capture all the valuable features of Australian higher education.

We should not fund universities based on their performance in the Times or other major rankings such as the Academic Ranking of World Universities (ARWU) from Shanghai Jiao Tong University. As much as this would be another way to reward excellence, much of what universities accomplish is not (yet) picked up in any ranking system.

For instance, a great achievement of the Australian system is its commitment to broad access higher education on a massive scale. But this is an achievement that probably has not so far won any university a higher placing.

Rankings, then, are useful as the proverbial canary in the coal (research) mine, telling us over a period of years where our performance is lagging, provided we remember that by the time our canary shows signs of serious ill health, the problem (lagging research performance) is probably worse than we think.

So they may tell us where we are doing better in research performance and the popularity contest, but this can only be so useful over the long run. Popularity and reputation may fail to tell us where the research that makes the lives of Australians better, or builds our national prosperity, is occurring today. Or where the best education is on offer here and now.

Gwilym Croucher works for the University of Melbourne as a policy analyst in the Office of the Vice-Chancellor.

This article was originally published at The Conversation.
Read the original article.
