Thursday, January 27, 2011

All Ears

Times Higher Education and Thomson Reuters are considering changes to their ranking methodology. It seems that the research impact indicator (citations) will figure prominently in their considerations.  Phil Baty writes:

In a consultation document circulated to the platform group, Thomson Reuters suggests a range of changes for 2011-12.


A key element of the 2010-11 rankings was a "research influence" indicator, which looked at the number of citations for each paper published by an institution. It drew on some 25 million citations from 5 million articles published over five years, and the data were normalised to reflect variations in citation volume between disciplines.


Thomson Reuters and THE are now consulting on ways to moderate the effect of rare, exceptionally highly cited papers, which could boost the performance of a university with a low publication volume.


One option would be to increase the minimum publication threshold for inclusion in the rankings, which in 2010 was 50 papers a year.


Feedback is also sought on modifications to citation data reflecting different regions' citation behaviour.


Thomson Reuters said that the modifications had allowed "smaller institutions with good but not outstanding impact in low-cited countries" to benefit.

It would be very wise to do something drastic about the citations indicator. According to last year's rankings, Alexandria University is the fourth best university in the world for research impact, Hong Kong Baptist University is second in Asia, the Ecole Normale Superieure Paris is the best in Europe with Royal Holloway, University of London fourth, the University of California Santa Cruz is fourth in the USA and the University of Adelaide is the best in Australia.

If anyone would like to justify these results they are welcome to post a comment.

I would like to make these suggestions for modifying the citations indicator.

Do not count self-citations, citations appearing in the same journal in which a paper is published, or citations from the paper's own university. This would reduce, although not completely eliminate, manipulation of the citation system. If this is not done, there will be massive self-citation and citation of friends and colleagues. It might even be possible to implement a measure of net citation by deducting the citations an institution gives from the citations it receives, thus reducing the effect of tacit citation agreements.
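Here is a minimal sketch of how such an adjusted count might work. It assumes an invented, simplified representation of papers and citation records (authors, institution, journal); real bibliometric data would have to be mapped onto something like this, and problems such as author disambiguation are ignored.

```python
from dataclasses import dataclass

@dataclass
class Citation:
    citing_authors: set      # author identifiers on the citing paper
    citing_institution: str  # institution of the citing paper
    citing_journal: str      # journal in which the citing paper appears

@dataclass
class Paper:
    authors: set
    institution: str
    journal: str
    citations: list

def adjusted_citation_count(paper):
    """Count a paper's citations, excluding self-citations, citations made
    in the same journal, and citations from the same university."""
    count = 0
    for c in paper.citations:
        if c.citing_authors & paper.authors:            # self-citation
            continue
        if c.citing_journal == paper.journal:           # same-journal citation
            continue
        if c.citing_institution == paper.institution:   # same-university citation
            continue
        count += 1
    return count

def net_citations(citations_received, citations_given):
    """The 'net citation' idea: citations an institution receives minus the
    citations it gives out."""
    return citations_received - citations_given

# Example: one self-citation is excluded, one independent citation is counted.
p = Paper(authors={"a1"}, institution="U1", journal="J1", citations=[
    Citation({"a1"}, "U1", "J1"),
    Citation({"a2"}, "U2", "J2"),
])
print(adjusted_citation_count(p))   # 1
```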

Normalisation by subject field is probably going to stay. It is reasonable that some consideration should be given to scholars who work in fields where citations are delayed and infrequent. However, it should be recognised that the purpose of this procedure is to identify pockets of excellence, and research institutions are not built around a few pockets, or even a single one. There are many ways of measuring research impact and this is just one of them. Others that might be used include total citations, citations per faculty member, citations relative to research income and the h-index.

Normalisation by year is especially problematic and should be dropped. It means that a handful of citations to an article classified as being in a low-citation discipline, received in the same year the article was published, can dramatically multiply the score for this indicator. It also introduces an element of potential instability: even if the methodology remains completely unchanged this year, Alexandria, Bilkent and others are going to drop scores of places as their papers go on receiving citations but get less value from them as the benchmark number rises.
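A rough illustration, with invented benchmark figures (the actual field-and-year averages used by Thomson Reuters are not reproduced here):

```python
def normalised_impact(citations, world_average):
    """Citations relative to the world average for papers published in the
    same subject field and year (the general approach described above)."""
    return citations / world_average

# A handful of same-year citations measured against a benchmark that is
# still tiny because few papers in the field have been cited yet:
print(normalised_impact(4, 0.1))   # 40.0 -- forty times the world average

# The same four citations judged against a more mature benchmark a few
# years later, once citations in the field have accumulated:
print(normalised_impact(4, 2.0))   # 2.0 -- only twice the world average
```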

Raising the publication threshold might not be a good idea. It is certainly true that Leiden University has a threshold of 400 publications a year, but Leiden measures only research impact while THE and TR measure a variety of indicators. There are already too many blank spaces in these rankings and their credibility will be further undermined if universities are not assessed on an indicator with such a large weighting.

Tuesday, January 25, 2011

More Rankings

With acknowledgements to Registrarism, here is news about more new rankings.

Universities are being ranked according to the popularity of their Twitter accounts. According to an article in the Chronicle of Higher Education:

Stanford earned a Klout score of 70, with Syracuse University, Harvard University, and the University of Wisconsin at Madison all following with a score of 64.


The top 10 is rounded out by University of California at Berkeley, Butler University, Temple University, Tufts University, the University of Minnesota, the University of Texas at Austin, and Marquette University.

Another is the Green Metric Ranking of World Universities, compiled by Universitas Indonesia. The criteria are green statistics, energy and climate change, waste, water and transportation.

The top five are:

1.  Berkeley (Surprise!)
2.  Nottingham
3.  York (Canada)
4.  Northeastern
5.  Cornell

Universiti Putra Malaysia is 6th and Universitas Indonesia 15th.

Cambridge and Harvard

After the break with THE, QS decided to continue with the old methodology of the 2004-2009 rankings. At least, that is what they said. It was therefore surprising to see that, according to data provided by QS, there were in fact a number of noticeable rises and falls between 2009 and 2010, although nothing like as many as in previous years.


For example, the University of Munich fell from 66th place to 98th, the Free University of Berlin from 70th to 94th and Stockholm University from 168th to 215th, while University College Dublin rose from 114th to 89th and Wurzburg from 309th to 215th.

But perhaps the most remarkable news was that Cambridge replaced Harvard as the world's best university. In every other ranking Harvard is well ahead.

So how did it happen? According to Martin Ince, “Harvard has taken more students since the last rankings were compiled without an equivalent increase in the number of academics.”

In other words, there should have been a lower faculty-student ratio and therefore a lower score for this indicator. This in fact happened: Harvard’s score went from 98 to 97.

Ince also says that there was an “improvement in staffing levels” at Cambridge, presumably meaning that there was an increase in the number of faculty relative to the number of students. Between 2009 and 2010, Cambridge’s score for the student-faculty ratio remained the same at 100, which is consistent with Ince’s claim.

In addition to this, there was a "significant growth in the number of citations per faculty member" for Cambridge. It is not impossible that the number of citations racked up by Cambridge had risen relative to Harvard, but the QS indicator counts citations over a five-year period, so even a substantial increase in publications or citations would take a few years to have an equivalent effect on this indicator. Note also that this indicator is citations per faculty member, and it appears that the number of faculty at Cambridge has gone up relative to Harvard, so we would expect any increase in citations to be cancelled out by a similar increase in faculty.
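As a back-of-the-envelope check, with invented figures:

```python
# Invented figures: if citations and faculty numbers grow at the same rate,
# the citations-per-faculty ratio does not move at all.
citations_2009, faculty_2009 = 200_000, 4_000
citations_2010, faculty_2010 = 220_000, 4_400   # both up 10%

print(citations_2009 / faculty_2009)   # 50.0
print(citations_2010 / faculty_2010)   # 50.0
```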

It looks a little odd, then, that for this indicator the Cambridge score rose from 89 to 93, four points, which is worth 0.8 in the weighted total score. That, by the way, was the difference between Harvard and Cambridge in 2009.

The oddity is compounded when we look at other high-ranking universities. Between 2009 and 2010, Leiden's score for citations per faculty rose from 97 to 99, Emory's from 90 to 95, Oxford's from 80 to 84 and Florida's from 70 to 75.

It would at first sight appear plausible that if Harvard, the top scorer in both years, did worse on this indicator then everybody or nearly everybody else would do better. But if we look at universities further down the table, we find the opposite. Between 2009 and 2010, Bochum's score for this indicator fell from 43 to 34, Ghent's from 43 to 37, Belfast's from 44 to 35 and so on.

Could it be that there was some subtle and unannounced change in the method by which the raw scores were transformed into indicator scores? Is it just a coincidence that the change was sufficient to erase the difference between Harvard and Cambridge?
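One way such a pattern could arise from the scaling alone, rather than from the underlying data, is sketched below. The sketch assumes, purely for illustration, a two-anchor linear rescaling in which the cohort mean maps to 50 and the top raw value maps to 100; QS's actual transformation is not documented here and every raw figure is invented.

```python
# Illustration only: not QS's published method, and all raw figures are invented.

def indicator_score(raw_value, cohort_mean, cohort_max):
    """Two-anchor linear rescaling: the cohort mean maps to 50 and the
    top raw value in the cohort maps to 100."""
    return 50 + 50 * (raw_value - cohort_mean) / (cohort_max - cohort_mean)

# Suppose the cohort mean is unchanged but the top raw value (say, Harvard's
# citations per faculty member) slips slightly between the two years.
mean_2009, max_2009 = 40, 120
mean_2010, max_2010 = 40, 110

for name, raw in [("Above-mean university", 100), ("Below-mean university", 30)]:
    before = indicator_score(raw, mean_2009, max_2009)
    after = indicator_score(raw, mean_2010, max_2010)
    print(f"{name}: {before:.0f} -> {after:.0f}")

# Above-mean university: 88 -> 93  (rises although its own raw value is unchanged)
# Below-mean university: 44 -> 43  (falls for exactly the same reason)
```

Under a rescaling of this kind, a fall in the top raw value lifts everyone above the mean and pushes everyone below it down, which is at least consistent with the direction of the published changes.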

http://www.wiziq.com/tutorial/90743-QS-World-University_Rankings-top-500

Thursday, January 20, 2011

Comparing Rankings

International university rankings are now proliferating in much the same way that American rankings have multiplied over the last few decades although so far there is no global equivalent to top party schools or best colleges for squirrels.

It is now time for a cursory review and comparison of the major international rankings. I will omit recent rankings, those that look as though they may not be repeated or those that provide insufficient information about methodology.

The list is as follows:

Academic Ranking of World Universities

Higher Education Evaluation and Accreditation Council of Taiwan

International Professional Ranking of Higher Education Institutions (Paris Mines Tech)

Leiden Ranking

QS World University rankings

Scimago Institutions Ranking

THE World University Rankings

Webometrics Ranking of World Universities

The first attribute to be considered is simply the number of universities ranked. A ranking might have an impeccable methodology and analyse a score of indicators with the strictest attention to current bibliometric theory and statistical technique. If, however, it only ranks a few hundred universities, it is of no use to those interested in the thousands left outside the elite of the ranked.

I am counting the number of universities in published rankings. Here the winner is clearly Webometrics, followed by Scimago.

Webometrics 12,300

Scimago 1,955

QS WUR 616

ARWU 500

HEEACT 500
 
Leiden 500

THE WUR 400

Paris Mines 376

Monday, January 17, 2011

Shanghai Ranks Macedonia

I am not sure how accurate the following report is. The whole of Macedonia has never had a Nobel or Fields award winner or an ISI highly cited researcher, has published fewer articles than Loughborough University and has no articles in Nature or Science. It is difficult to see just what a team of seven from Shanghai would evaluate, especially since ARWU is reluctant to get involved with teaching quality or publications in the arts and humanities. Still, it is perhaps indicative that a European country has turned to China to evaluate its universities.

"Shanghai Jiao Tong University, which analyzes the top universities in the world on quality of faculty, research output quality of education and performance, has been selected to evaluate the public and private institutions for higher education in Macedonia, Minister of Education and Science Nikola Todorov told reporters on Sunday.
The ranking team included the Shanghai University Director, Executive and six members of the University's Center, Todorov said, pointing out that Macedonia is to be the first country from the region to be part of the the Academic Ranking of World Universities (ARWU), commonly known as the Shanghai ranking.


"The Shanghai ranking list is the most relevant in the world, and being part of it is a matter of prestige. We shall be honored our institutions for higher education to be evaluated by this university. This is going to be a revolution in the education sector, as for the first time we are offered an opportunity to see where we stand in regard to the quality," Todorov said"

Sunday, January 16, 2011

Full QS Rankings 2010

QS have published full details, including indicator scores, of the top 400 universities in their 2010 rankings. In the transparency stakes this brings them level with THE, who have an iPhone/iPad app that provides these details for the main indicators but not the sub-indicators.

Thursday, January 13, 2011

Microsoft Academic Search

Microsoft has developed a computer science research ranking. Organisations, mainly but not entirely universities, are ranked according to the number of citations, and there is also data on publications and the h-index.

The top five in the world are Stanford, MIT, Berkeley, Carnegie-Mellon and Microsoft. Harvard is seventh and Cambridge 18th.

Top regional universities are:
Africa -- Cape Town
Asia and Oceania -- Tel Aviv
Europe -- Cambridge
North America -- Stanford
South America -- Universidade de Sao Paulo

Monday, January 10, 2011

The Disposable Academic

An article in the Economist (print edition, 18-31/12/2010, 146-8) analyses the plight of many of the world's PhDs. Many can expect nothing more than a succession of miserable post-doc fellowships, short-term contracts or part-time jobs teaching remedial or matriculation classes. And those are the lucky ones who actually get their diploma.

It seems that the financial return for a PhD is only marginally higher than that for a master's. Since there are undoubtedly variations by institution and discipline, it follows that for many, joining a doctoral program is a losing proposition in every way.

One wonders whether countries like South Africa and some in Southeast Asia are creating future problems in the drive to boost the production of PhDs.

Friday, January 07, 2011

Value for Money

An article by Richard Vedder describes how the publication of data by Texas A&M University shows enormous variation in the cost of faculty members per student taught.

I recently asked my student research assistant to explore this data by choosing, more or less at random, 40 professors of the university's main campus at College Station — including one highly paid professor with a very modest teaching load in each department and one instructor who is modestly paid but teaches many students. The findings were startling, even for a veteran professor like myself.


The 20 high-paid professors made, on average, over $200,000 each, totaling a little over $5 million annually to the university. These professors collectively taught 125 students last year, or roughly $40,000 per student. Since a typical student takes about 10 courses a year, the average cost of educating a student exclusively with this group of professors would be about $400,000, excluding other costs beyond faculty salaries.

There are, of course, questions to be asked about whether the data included the supervision of dissertations and the difficulty of the courses taught. Even so, the results deserve close scrutiny and might even be a model for some sort of international comparison.
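For what it is worth, the quoted arithmetic is easy to reproduce (figures as given in the quotation):

```python
total_salaries = 5_000_000   # "a little over $5 million" for the 20 professors
students_taught = 125        # students taught by this group last year
courses_per_student = 10     # a typical student's annual course load

cost_per_student_taught = total_salaries / students_taught                  # $40,000
cost_per_full_course_load = cost_per_student_taught * courses_per_student   # $400,000
print(cost_per_student_taught, cost_per_full_course_load)
```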

Tuesday, January 04, 2011

Dumbing Down of University Grades

An article in the London Daily Telegraph shows that the number of first and upper second class degrees awarded by British universities has risen steadily over the last few decades. Their value to employers as an indicator of student quality has accordingly diminished.

David Barrett reports that:


The latest data shows that the criteria for awarding degrees has changed dramatically - despite complaints from many universities that grade inflation at A-level has made it hard for them to select candidates.

Traditionally, first class honours have been awarded sparingly to students who show exceptional depth of knowledge and originality.


But the new figures add further weight to a report by MPs last year which found that "inconsistency in standards is rife" and accused vice-chancellors of "defensive complacency".

We might note that the THE-QS rankings up to 2009 and the QS rankings of last year have probably done quite a lot to encourage complacency by consistently overrating British universities, especially Oxbridge and the London colleges.