Sunday, December 18, 2011

Leiden Ranking: Many Ways to Rate Research

My article on the Leiden Rankings in University World News can be found here.

It looks as though a two-tier international university ranking system is emerging.

At the top we have the 'big three', Shanghai's Academic Ranking of World Universities, the QS World University Rankings and, since 2010, the Times Higher Education World University Rankings.

These receive massive attention from the media, are avidly followed by academics, students and other stakeholders and are often quoted in promotional literature. Graduation from a university included in these has even been proposed as a requirement for immigration.

Then we have the rankings by SCImago and Webometrics, both from Spain, the Performance Ranking of Scientific Papers for World Universities produced by the Higher Education Evaluation and Accreditation Council of Taiwan, and the Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University.

These rankings get less publicity but are technically very competent and in some ways more reliable than the better-known rankings.

Wednesday, December 07, 2011

Update 6 on El Naschie vs Nature

There have been no reports for several days and the trial is now over. There will be a judgement in January.

What to do about the research bust

Mark Bauerlein has an article in the Chronicle of Higher Education on the disparity between the extraordinary effort and intelligence poured into scholarly writing in the humanities and the meager attention such writing receives.

"I devised a study of literary research in four English departments at public universities—the University of Georgia, the University at Buffalo, the University of Vermont, and the University of Illinois at Urbana-Champaign—collecting data on salaries, books and articles published, and the reception of those works. The findings:
  • Those universities pay regular English faculty, on average, around $25,000 a year to produce research. According to the faculty handbooks, although universities don't like to set explicit proportions, research counts as at least one-third of professors' duties, and we may calculate one-third of their salaries as research pay. This figure does not include sabbaticals, travel funds, and internal grants, not to mention benefits, making the one-third formula a conservative estimate.
  • Professors in those departments respond diligently, producing ample numbers of books and articles in recent years. At Georgia, from 2004 to 2009, current faculty members produced 22 authored books, 15 edited books, and 200 research essays. The award of tenure didn't produce any drop-off in publication, either. Senior professors continue their inquiries, making their departments consistently relevant and industrious research centers.
  • Finally, I calculated the impact of those publications by using Google Scholar and my own review of books published in specific areas to count citations. Here the impressive investment and productivity appear in sobering context. Of 13 research articles published by current SUNY-Buffalo professors in 2004, 11 of them received zero to two citations, one had five, one 12. Of 23 articles by Georgia professors in 2004, 16 received zero to two citations, four of them three to six, one eight, one 11, and one 16. "
Bauerlein suggests that these limited citation counts are telling us something: that talented scholars might find better things to do, and that society might direct its resources elsewhere.

The QS World University Rankings would apparently agree. Their citations indicator simply counts the total number of citations and divides it by the total number of faculty. This is a very crude measure, especially since it counts the current number of faculty but then counts the citations to articles written over a five-year period. Any university seeking a boost in the QS rankings could simply axe a few English, history and philosophy specialists and replace them with oncologists and engineers. True, the world would lose studies about Emily Dickinson's Reluctant Ecology of Place, cited once according to Google Scholar, or Negotiations of Homoerotic Tradition in Paradise Regained, but if this were accompanied by even a small advance in cancer treatment, who would really care? There would be an even better effect on the Shanghai rankings, which do not count publications or citations in the humanities but still include the faculty in their productivity indicator.
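Since the mechanics matter here, a minimal sketch of the indicator as described above. All numbers are invented for illustration, not QS data:

```python
# Illustrative arithmetic only: the numbers below are invented, not QS data.

def citations_per_faculty(total_citations: int, current_faculty: int) -> float:
    """QS-style indicator: citations to the last five years' output,
    divided by the *current* faculty headcount."""
    return total_citations / current_faculty

# A university with 1,000 faculty whose papers drew 50,000 citations:
print(citations_per_faculty(50_000, 1_000))  # 50.0

# Axe 100 humanities specialists whose articles drew only 200 citations:
print(citations_per_faculty(49_800, 900))    # 55.3 -- an instant 10% boost
```

The asymmetry is the point: the denominator shrinks immediately, while the numerator barely notices that the low-citation fields are gone.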

But there are those who would argue that while disciplines have different citation practices, they must be regarded as being on the same level in all other respects. Thomson Reuters, who collect the data for the Times Higher Education rankings, now normalise their data so that citations in a specific discipline in a specific country in a specific year are benchmarked against the average for that discipline in that country in that year. That would mean that the article by the Buffalo professors with five citations might look quite good.
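A hedged sketch of the normalisation as the post describes it; the benchmark figures are invented for illustration:

```python
# An article's citations benchmarked against the average for its
# discipline, country and year. Benchmark data invented for illustration.
from statistics import mean

def normalised_impact(citations: int, benchmark: list[int]) -> float:
    """Citations divided by the mean citation count of comparable
    articles (same discipline, same country, same year)."""
    return citations / mean(benchmark)

# If comparable 2004 literary-criticism articles average 1.5 citations,
# the Buffalo article with five citations sits well above its benchmark:
print(normalised_impact(5, [0, 1, 2, 0, 3, 1, 2, 3]))  # 5 / 1.5 = 3.33
```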

I have a suggestion for those professors of English and other disciplines whose work hardly anyone seems to read anymore. Go to some Central Asian or East African republic where the papers in your field get only a few citations: the next article you write, with its miserable handful of citations, will be well above average for that country, and your new university will suddenly perform well in the Times Higher rankings. Just make sure that your employer produces two hundred papers a year altogether.

Friday, December 02, 2011

European Universities and Rankings

Research Trends, the newsletter from Scopus, reports on a conference of European universities, convened by the European University Association (EUA), that discussed international rankings. The participants found positive aspects to rankings but also had criticisms:

Going through the comparison of the various methodologies, the report details what is actually measured, how the scores for indicators are measured, and how the final scores are calculated — and therefore what the results actually mean.
The first criticism of university rankings is that they tend to principally measure research activities and not teaching. Moreover, the ‘unintended consequences’ of the rankings are clear, with more and more institutions tending to modify their strategy in order to improve their position in the rankings instead of focusing on their main missions.
For some ranking systems, lack of transparency is a major concern, and the QS World University Ranking in particular was criticized for not being sufficiently transparent.
The report also reveals the subjectivity in the proxies chosen and in the weight attached to each, which leads to composite scores that reflect the ranking provider’s concept of quality (for example, it may be decided that a given indicator may count for 25% or 50% of overall assessment score, yet this choice reflects a subjective assessment of what is important for a high-quality institute). In addition, indicator scores are not absolute but relative measures, which can complicate comparisons of indicator scores. For example, if the indicator is number of students per faculty, what does a score of, say, 23 mean? That there are 23 students per faculty member? Or does it mean that this institute has 23% of the students per faculty compared with institutes with the highest number of students/faculty? Moreover, considering simple counts or relative values is not neutral. As an example, the Academic Ranking of World Universities ranking does not take into consideration the size of the institutions.
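The students-per-faculty ambiguity is easy to make concrete. A small sketch, with invented figures, of the two readings the report says a published score leaves open:

```python
# The ambiguity in miniature: the same published score of 23 could be a
# raw value or a percentage of the best performer. Data invented.

students_per_faculty = {"A": 23, "B": 46, "C": 100}

raw_score = students_per_faculty["A"]                    # 23 students per faculty member
best = max(students_per_faculty.values())
relative_score = 100 * students_per_faculty["A"] / best  # 23.0 percent of the maximum

# Both readings happen to print 23 here; a published table gives the
# reader no way to tell which one is meant.
print(raw_score, relative_score)
```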

I am not sure these criticisms are entirely fair. It seems that the weighting of the various indicators in the Times Higher Education rankings emerged from a lot of to-ing and fro-ing between various stakeholders and advisers. In the end, far too much weighting was given to citations, but that is not quite the same as assigning arbitrary or subjective values.
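Whether or not the weights were arbitrary, the report is right that they matter. A toy sketch, with invented scores, of how the choice of weights alone can reverse a rank order:

```python
# Two invented universities, two defensible weighting schemes,
# opposite rank orders.

def composite(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of indicator scores (weights sum to 1)."""
    return sum(weights[k] * scores[k] for k in scores)

uni_a = {"research": 90, "teaching": 50}
uni_b = {"research": 60, "teaching": 85}

scheme_1 = {"research": 0.50, "teaching": 0.50}  # B wins: 72.5 vs 70.0
scheme_2 = {"research": 0.75, "teaching": 0.25}  # A wins: 80.0 vs 66.25

for scheme in (scheme_1, scheme_2):
    print(composite(uni_a, scheme), composite(uni_b, scheme))
```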

The Shanghai rankings do have an indicator, productivity per capita, that takes faculty size into account, although it is only ten per cent of the total ranking. The problem here is that faculty in the humanities are counted but not their publications.

I am not sure why QS is being singled out with regard to transparency. The THE rankings are also, perhaps in a different way, quite opaque. Aggregate scores are given for teaching environment, research and international orientation without indicating the scores that make up these criteria.

So what is to be done?


The EUA report makes several recommendations for ranking-makers, including the need to mention what the ranking is for, and for whom it is intended. Among the suggestions to improve the rankings, the following received the greatest attention from the audience:
  1. Include non-journal publications properly, including books, which are especially important for social sciences and the arts and humanities;
  2. Address language issues (is an abstract available in English, as local language versions are often less visible?);
  3. Include more universities: currently the rankings assess only 1–3% of the 17,000 existing universities worldwide;
  4. Take into consideration the teaching mission with relevant indicators.

The first of these may become feasible now that Thomson Reuters has a book citation index. The second and third are uncontroversial. The fourth is very problematic in many ways.

The missing indicator here is student quality. To be very blunt, universities can educate and instruct students but they can do very little to make them brighter.  A big contribution to any university ranking would be a comparison of the relative cognitive ability of its students. That, however, is a goal that requires passing through many minefields.

Thursday, December 01, 2011

Diversity and Rankings

Robert Morse, director of data research at US News and World Report, discusses the question of whether "diversity" should be included in the ranking of American law schools.

"I was one of speakers on the "Closing Plenary: Reforming U.S. News Rankings to Include Diversity" panel, which discussed many of the issues pertaining to whether U.S. News should add a measure of diversity directly into the Best Law Schools rankings. I pointed out that U.S. News believes diversity is important and that is why we all ready publish a separate law school diversity index.

Our current index identifies law schools where law students are most and least likely to encounter classmates from a different racial or ethnic group. However, the current index does not measure how successful each law school is at meeting a diversity goal or benchmark at the school, state, local, or national level. It also gives schools enrolling one ethnic group a low score, though that school's enrollment may match its state's ethnic population or the school may be a Historically Black College or University. It's for these reasons the current index would not be appropriate to add into the rankings".
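Morse describes the index in terms of the chance of encountering classmates from a different group. A minimal sketch of one index with exactly that property, the Gini-Simpson measure; whether US News uses this precise formula is an assumption, and the enrollment shares are invented:

```python
# Gini-Simpson style diversity index: the probability that two randomly
# chosen students belong to different groups. Whether this is exactly
# US News's formula is an assumption; the shares below are invented.

def diversity_index(group_shares: list[float]) -> float:
    """1 minus the sum of squared group shares."""
    return 1 - sum(p * p for p in group_shares)

print(diversity_index([0.25, 0.25, 0.25, 0.25]))  # 0.75  -- evenly mixed campus
print(diversity_index([0.95, 0.05]))              # 0.095 -- nearly uniform campus

# Morse's objection in numbers: a school that is 95% one group scores low
# even if that mirrors its state's population or an HBCU's mission.
```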

Diversity here does not mean diversity of ideology, religion, class, politics or nationality. It simply means the numbers of recognised minorities, mainly African-Americans, Hispanics and Asian Americans.

It is interesting to look at the diversity index and to see the likely effect of including diversity in the law school rankings. The most diverse law school is the University of Hawaii. The University of the District of Columbia and Florida International University also get high scores. Low scorers include Harvard, Yale and UCLA.

Somehow, I do not think that an indicator that benefited Florida International University at the expense of Harvard would add to the credibility of these rankings.

Unless it can be demonstrated that there is something magically transformative about a law school's statistical profile reflecting that of its city, state, nation, or future nation, this proposal does not sound like a very good idea.

The Utility of Rankings

Another advantage of a good performance in the international university rankings is that a university's graduates will be able to get into Russian postgraduate programs, provided the university is in a G8 country.


Russia’s education ministry is currently drawing up a list of foreign universities whose qualifications will be recognized.

The list will include only universities located within the G8 countries that rank in the top 300 of the Academic Ranking of World Universities or the QS World University Rankings. Officials say there will be around 300 institutions meeting the criteria.

The reform is intended to attract more students to take part in Russian MA and PhD programs.