Tuesday, May 31, 2016

UK rises in the U21 system rankings

The comparison of national higher education systems by Universitas 21 shows that the UK has risen from 10th place to 4th since 2012.

These rankings consist of four groups of indicators: resources, connectivity, environment and output. Since 2012 British higher education has risen from 27th to 12th for resources, from 13th to 10th for environment and from 6th to 4th for connectivity. It was in second place for output in both 2012 and 2016, but its score rose from 62.2 to 69.9 over the four years.

Every few months, whenever any sort of ranking is published, there is an outcry from British universities that austerity and government demands and interference and immigration controls are ruining higher education.

If the U21 rankings have any validity then it would seem that British universities have been very generously funded in comparison to other countries.

Perhaps they could return some of the money or at least say thank you to the state that has been so kind to them.


Monday, May 23, 2016

Does a programming competition in Thailand show the future of the world economy?



The ACM ICPC (Association for Computing Machinery International Collegiate Programming Contest) is the "Olympics of Programming Competitions". The competitors are teams of university students who grapple with complex real-world problems. It is a "battle of logic, strategy and mental endurance" and is the apex of a series of local and regional competitions.

Success in the competition requires a high level of intelligence, genuine team formation and rigorous training. It is the antithesis of the intellectual levelling and narcissistic cult of safe spaces that has infected American and, perhaps to a lesser extent, British universities.

The finals have just been completed in Thailand. The top five are:

1.  St. Petersburg State University
2.  Shanghai Jiao Tong University
3.  Harvard University
4.  St. Petersburg Institute of Physics and Technology
5.  University of Warsaw

The list of universities in the top ten, the top fifty and the total number of finalists is interesting. If this competition reflects the current level of intelligence of university students, then the future for China, patches of the rest of Asia, and Russia and Eastern Europe looks bright. The USA may do well if -- a very big if -- it can continue to attract large numbers of Chinese students and immigrants. For Africa and Western Europe, including the UK, the economy of the 21st century may be bleak.

Below, countries are ranked according to the number of universities in the top ten, the top fifty and all finalists.


Rank   Country           Top 10   Top 50   Total Finalists
1      Russia               5       10        12
2      USA                  2        6        23
3      Poland               2        3         3
4      China                1       10        17
5      Brazil               -        2         6
6      Japan                -        2         4
7      Ukraine              -        2         3
8=     Belarus              -        2         2
8=     Taiwan               -        2         2
10     Bangladesh           -        1         3
11=    Canada               -        1         2
11=    Iran                 -        1         2
11=    South Korea          -        1         2
11=    Vietnam              -        1         2
15=    Argentina            -        1         1
15=    Croatia              -        1         1
15=    Finland              -        1         1
15=    Hong Kong            -        1         1
15=    North Korea          -        1         1
15=    Singapore            -        1         1
21     India                -        -         6
22     Egypt                -        -         4
23=    Mexico               -        -         3
23=    Syria                -        -         3
25=    Australia            -        -         2
25=    Colombia             -        -         2
25=    Netherlands          -        -         2
28=    Chile                -        -         1
28=    Cuba                 -        -         1
28=    Czech Republic       -        -         1
28=    Jordan               -        -         1
28=    Macao                -        -         1
28=    Pakistan             -        -         1
28=    Peru                 -        -         1
28=    Philippines          -        -         1
28=    Slovakia             -        -         1
28=    South Africa         -        -         1
28=    Spain                -        -         1
28=    Switzerland          -        -         1
28=    Thailand             -        -         1
28=    UK                   -        -         1
28=    Venezuela            -        -         1

Sunday, May 22, 2016

Don’t rush to conclusions from the THE rankings

My 15th May post has been republished in University World News with a different title.

No need for British universities to worry about their dip in THE reputation rankings

Sunday, May 15, 2016

One more thing about the THE reputation rankings

I have just remembered something about the THE reputation rankings that is worth noting.

THE have broken out the scores for teaching reputation and research reputation for the first fifty universities and this gives us a chance to ask if there is any meaningful difference between teaching and research reputation.

The answer is that there is not. The correlation between the teaching and the research scores is .986. This is so high that for practical purposes they are exactly the same thing. The 15% weighting given for teaching (actually "postgraduate supervision") reputation may be unrelated to undergraduate teaching or even to taught master's teaching. The emphasis on research in the THE world rankings is therefore even higher than THE claims, at least at the top.
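A correlation that high is easy to verify for yourself. The sketch below computes the Pearson correlation between two score lists; the numbers in it are illustrative placeholders, not the actual THE data, so to reproduce the .986 figure you would substitute the published teaching and research scores for the top fifty universities.

```python
# Minimal Pearson correlation check for two lists of reputation scores.
# The score lists below are hypothetical examples, NOT the THE data.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

teaching = [95.1, 88.0, 84.2, 80.5, 77.9]   # placeholder scores
research = [96.0, 89.5, 83.8, 81.0, 78.4]   # placeholder scores
print(round(pearson(teaching, research), 3))
```

A value near 1 means the two indicators are carrying essentially the same information, which is the point being made above.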

This has already been pointed out by Alex Usher of Higher Education Strategy Associates of Canada, who found a correlation of .991 in 2013.

The THE reputation rankings: Much ado about not very much

Every so often, especially in North America and Western Europe, there is a panic about the impact of government policies on higher education, usually the failure to provide as much money as universities want, or sometimes as many overseas students as they need to fill lecture halls or cover budget deficits. Global university rankings have a lot to do with the onset and spread of these panics.

True to form, the British "quality" media have been getting into a tizzy over the latest edition of the Times Higher Education (THE) world reputation ranking. According to Javier Espinoza, education editor of the Telegraph, top UK universities have been under pressure to admit minority and state school students and have also had difficulty in recruiting foreign students. This has somehow caused them to forget about doing research or teaching the most able students. It seems that academics from countries around the world, where such problems are of course unknown, are reacting by withholding their votes from British universities when responding to the THE survey and transferring their approval to the rising stars of Asia.

This supposedly has caused UK institutions to slide down the rankings and two of them, Bristol and Durham, have even dropped out of the top 100 altogether into the great dark pit of the unranked.

The Guardian notes that Oxford and Cambridge are falling and are now only just in the world's top five while the Independent quotes Phil Baty, as saying that "our evidence - from six massive global surveys over six years, including the views of more than 80,000 scholars - proves the balance of power in higher education and research is slowly shifting from the West to the East". 

This, it would seem, is all because of cuts in funding and restrictions on the entry of overseas students and faculty.

All this is rather implausible. First of all, these are reputation rankings. They refer to only one indicator, which accounts for 33 percent of the World University Rankings that will appear later this year. It is not certain that the other indicators will go in the same direction.

Secondly, these rankings have not been standardised as they will be when included in the world rankings, which means that the huge gap between the Big Six -- Harvard, MIT, Berkeley, Stanford, Oxford and Cambridge -- and the rest is laid bare, as it will not be in the autumn, and so we can get a rough idea of how many academics were voting for each university. A crude guess is that by around 50th place the number of votes will be about five hundred, and even fewer by 100th place.

This means that below the 50 mark a shift in the opinion of a few dozen respondents could easily push a university up or down into a new band or even into or out of the top 100.

Another thing we should remember is that the expertise of the researchers in the Scopus database, from which respondents are drawn, is exaggerated. The qualification for receiving a survey form is being the corresponding author of a publication listed in the Scopus database. There is much anecdotal evidence that in some places winning research grants or getting the corresponding author slot has more to do with politics than with merit. The THE survey is better than QS's, which allows anyone with an academic email address to take part, but it does not guarantee that every respondent is an unbiased and senior researcher.

We should also note that, unlike the US News and QS survey indicators, THE takes no measures to damp down year to year fluctuations. Nor does it do anything to prevent academics from supporting their own universities in the survey.

So, do we really need to get excited about a few dozen "senior researchers" withdrawing their support from British universities?

The credibility of these rankings is further undermined by apparent changes in the distribution of responses by subject group. According to the methodology page in Times Higher Education for 2015, 16% of the responses were from the arts and humanities and 19% were from the social sciences, which in that year included business studies and economics. This year, according to the THE methodology page, 9% of the responses were from the arts and humanities, 15% were from the social sciences and 13% were from business and economics, the latter two adding up to 28%.

In other words the responses from the arts and humanities have apparently fallen by 7 percentage points, or around 700 responses, and the combined responses from social sciences and business and economics have apparently risen by nine points, or about 900 responses.
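The arithmetic behind those estimates is simple to check. The sketch below assumes a round figure of 10,000 total survey responses, which is what "7 percentage points, or around 700 responses" implies; the exact response count is not given in the THE methodology.

```python
# Back-of-envelope check of the percentage-point shifts described above.
total_responses = 10_000            # assumed survey size (implied, not stated)

arts_2015, arts_2016 = 0.16, 0.09   # arts and humanities share
soc_2015 = 0.19                     # social sciences incl. business/economics
soc_2016 = 0.15 + 0.13              # social sciences + business/economics, split

arts_drop = (arts_2015 - arts_2016) * total_responses
soc_rise = (soc_2016 - soc_2015) * total_responses
print(round(arts_drop), round(soc_rise))   # roughly 700 and 900
```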

If these numbers are accurate then there has been a very substantial shift among survey respondents from the arts and humanities to the social sciences (inclusive of business and economics), and it is possible that this could be sufficient to cause the recorded decline in the reputation scores of British universities, which usually do much better in the arts and humanities than in the social sciences.

In the THE 2015-16 World University Rankings, for example, Durham was 28th for arts and humanities and 36th for the social sciences; Exeter was 71st for arts and humanities and 81st for the social sciences.

At the same time some of those rising Asian universities were definitely stronger in the social sciences than in the humanities: Peking was 52nd for social sciences and 84th for arts and humanities, Hong Kong 39th for social sciences and 44th for arts and humanities, and Nanyang Technological University 95th for social sciences and outside the top 100 universities for the arts and humanities.

It is possible that such a symmetrical change could be the result of changes in the way disciplines are classified or even a simple transposition of data. So far, THE have given no indication that this was the case.

It is interesting that an exception to the narrative of British decline is the London Business School, which has risen from the 91-100 band to 81-90.

The general claim that the views of 80,000 academics over six years are evidence of a shift from west to east is also somewhat tenuous. There have been several changes in the collection and organisation of data over the last few years that could affect the outcomes of the reputation survey.

Between 2010-2011 and 2016 the percentage of responses from the social sciences (which originally included business and economics) has risen from 19% to 28%, counting social sciences plus business and economics separately. Those for clinical and health sciences and life sciences have fallen somewhat, while there has been a slight rise for the arts and humanities, with a large spike in 2015.

The number of responses from the Asia Pacific region and the Middle East has risen from 25% to 36% while those from the Americas (North and Latin) have fallen from 44% to 25%. The number of languages in which the survey is administered has increased from eight in 2011 to fifteen this year.

The source of respondents has shifted from the Thomson Reuters Web of Science to Scopus, which includes more publications from languages other than English.

The value of these changes is not disputed here but they should make everybody very cautious about using the reputation rankings to make large claims about what is happening to British universities or what the causes of their problems are.