Dr Ismail Aby Jamal

Born in Batu 10, Kg Lubok Bandan, Jementah, Segamat, Johor

Thursday, October 8, 2009

World University Rankings 2009


Rankings 09: Talking points
8 October 2009
By Phil Baty
The World University Rankings are compiled using a mixture of quantitative indicators and informed opinion
What makes a world-class university? When Times Higher Education asked the leaders of top-ranked institutions this question last year, one response stood out for its inspirational qualities.
Robert Zimmer, president of the University of Chicago, said that his institution was "driven by a singular focus on the value of open, rigorous and intense inquiry. Everything about the university that we recognise as distinctive flows from this."
He said that Chicago believed that "argumentation rather than deference is the route to clarity", that "arguments stand or fall on their merits" and that the university recognised that "our contributions to society rest on the power of our ideas and the openness of our environment to developing and testing ideas".
His answer prompted much praise. One Times Higher Education reader said that Zimmer's "glorious affirmation" was "marvellously refreshing" and had "brought joy to my heart, tears to my eyes and a renewed sense of commitment to the life of the mind".
But glorious as Zimmer's statement was, it also served to highlight the problem faced by the increasing number of people and organisations now in the business of ranking higher education institutions: how on earth do you measure such intangible things?
The short answer, of course, is that you cannot. What you can do, however, and what we have sought to do with these rankings, is to try to capture the more tangible and measurable elements that make a modern, world-class university.
When Times Higher Education first conceived its annual World University Rankings with QS in 2004, we identified "four pillars" that supported the foundations of a leading international institution. They are hardly controversial: high-quality research; high-quality teaching; high graduate employability; and an "international outlook".
Much more controversial are the measurements we chose for our rankings, and the balance between quantitative and qualitative measures.
To judge research excellence, we examine citations - how many times an academic's published work is cited.
We calculate this element - worth 20 per cent of the overall score - by taking the total number of citations for all papers published from the institution, and then dividing the figure by the number of full-time equivalent staff at the institution. This gives a sense of the density of research excellence on a campus.
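The calculation described above is a simple ratio. As a minimal sketch (the institution name and all figures below are invented for illustration, not real data):

```python
# Hypothetical citations-per-staff indicator, as described in the article:
# total citations for all papers from the institution, divided by the
# number of full-time-equivalent (FTE) staff.
total_citations = 150_000   # hypothetical citation count
fte_staff = 3_000           # hypothetical FTE staff count

citations_per_staff = total_citations / fte_staff
print(citations_per_staff)  # 50.0
```

The division by staff numbers, rather than using raw citation totals, is what gives the "density" of research excellence the article refers to, so a small institution with highly cited staff can outscore a much larger one.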
Our proxy for teaching excellence is a simple measure of staff-to-student ratio. It is not perfect, but it is based on data that can be collected for all institutions, often via national bodies, and compared fairly. Our assumption is that it tells us something meaningful about the quality of the student experience. At the most basic level, it at least gives a sense as to whether an institution has enough teaching staff to give students the attention they require. This measure is worth 20 per cent of the overall score.
To get a sense of a university's international outlook, we measure the proportion of overseas staff a university has on its books (making up 5 per cent of the total score) and the proportion of international students it has attracted (making up another 5 per cent). This gives an impression of how attractive an institution is around the world, and suggests how much the institution has embraced the globalisation agenda.
But 50 per cent of the final score is made up from qualitative data from surveys of informed people - university academics and graduate employers.
The fundamental tenet of this ranking, as we have said in previous years, is that academics know best when it comes to identifying the best institutions.
So the biggest part of the ranking score - worth 40 per cent - is based on the result of an academic peer review survey. We consult academics around the world, from lecturers to university presidents, and ask them to name up to 30 institutions they regard as being the best in the world in their field.
Responses over the past three years are aggregated, although only the most recent response from anyone who has responded more than once is used. For our 2009 tables, we have drawn on responses from 9,386 people. With each person nominating an average of 13 institutions, this means that we can draw on about 120,000 data points.
The ranking also includes the results of an employer survey of 3,281 major graduate employers, making up 10 per cent of the overall result.
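The six weights described above sum to 100 per cent. A minimal sketch of how such a weighted score could be combined, using the article's stated weights but entirely hypothetical indicator values (the variable names and figures are illustrative assumptions, not the compilers' actual code or data):

```python
# Weights as stated in the article; each indicator is assumed to be
# normalised to a 0-100 scale before weighting.
WEIGHTS = {
    "peer_review": 0.40,            # academic peer review survey
    "employer_review": 0.10,        # graduate employer survey
    "citations_per_staff": 0.20,    # research excellence
    "staff_student_ratio": 0.20,    # teaching proxy
    "international_staff": 0.05,    # international outlook
    "international_students": 0.05, # international outlook
}

def overall_score(indicators: dict) -> float:
    """Weighted sum of normalised (0-100) indicator scores."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights cover 100%
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

# A hypothetical institution's normalised indicator scores:
example = {
    "peer_review": 90.0,
    "employer_review": 80.0,
    "citations_per_staff": 70.0,
    "staff_student_ratio": 60.0,
    "international_staff": 50.0,
    "international_students": 40.0,
}
print(round(overall_score(example), 1))  # 74.5
```

The sketch makes the article's central trade-off concrete: half the weight (0.40 + 0.10) sits on survey opinion, so a modest shift in reputational scores moves the overall result far more than an equal shift in any single measured indicator.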
• Times Higher Education-QS World University Rankings 2009: full coverage and tables
For a university to be considered for the ranking, it must operate in at least two of five major academic fields: natural sciences; life sciences and biomedicine; engineering and information technology; social sciences; and arts and humanities. It must also teach undergraduates, so many specialist schools are excluded.
We do not pretend to be able to capture all of the intangible nuances of what it is that makes a university so special, and we accept that there are some criticisms of our methodology.
These rankings are meant to be the starting point for discussions about institutions' places in the rapidly globalising world - and how that is measured and benchmarked - not the end point. We encourage that discussion.
phil.baty@tsleducation.com.
Readers' comments
Observer 8 October, 2009
Tell me how the University of Toronto beats McGill in every area of expertise and is then ranked lower globally? Is this some kind of twisted logic?
john 8 October, 2009
Sorry, but this ranking is simply inconceivable, particularly when one looks at how US universities are ranked against each other.
OBSERVRESS 8 October, 2009
I can't quite understand how the ANU can beat Sydney, particularly in the arts and humanities.
Scott 8 October, 2009
Hilarious! There was a good one in The Onion today too. Keep it up, I can always use another laugh.
Bitwize 8 October, 2009
One reservation with the whole ranking system is that the rankings are becoming the major focus of universities' being, and part of the wider zeitgeist of 'bums on seats, international bums preferably, as we can screw them for huge tuition fees'. It's a bit like going to the horse racing; it's a gambling game, and playing it well means more money for the university, which translates as a larger car and loads more perks and freebies for the Principal and his/her cronies. The onus also turns to building up huge endowment funds on the US model by whatever means possible, i.e. hedge funds and dodgy multinationals. Nouveau philanthropy rears its ugly head, with private donors and companies driving their vision of what education should be for their own narrow ends. Another aspect of this is the way some unis in the second hundred, desperately trying to break into the top 100, give huge financial incentives to attract prestigious part-time academics to create an impression of 'esteemed scholars'. Lower-level staff, whose toils do little to affect the rankings, are of course still expected to struggle along on their pauper's wages. Of course, the international focus does bring one other major advantage: lots of free 'fact-finding/networking' trips abroad for senior management.
andy B 8 October, 2009
Cue press releases across the world - University of Poppleton confirmed as 130th best university in THE WORLD, up from 134th last year - followed by a quote from the VC. None of this actually has much use for prospective students, as the measures do not include graduate success (the employer review is more related to how much universities lobby their close employer contacts to vote for them, and how many people in the same company you can get to vote for you, as there is no limit), student satisfaction (staff-student ratios are not an indication of course quality) or progression rates/student performance (impossible to compare accurately across the world). Shame that the Times Higher places so much store on them, presumably partly to generate global income from ads to offset falling sales from the UK universities - who use cheaper means like social networking, the internet and jobs.ac.uk to get their message across and advertise jobs. Best of luck to you all - from a university in the top 250... hurrah! We CAN make the top 200 next year, c'mon! PS: it is also about size - being bigger is crucial, even though in the UK it is the smaller unis with fewer than 15,000 students that often have the highest levels of student satisfaction and other positive measures, AND are strong in research.
G. H. 8 October, 2009
This ranking is fundamentally flawed. On the one hand it tries to - in its own words - "capture the more tangible and measurable elements that make a modern, world-class university", only to make up 50% of the score from surveys which are anything but tangible and measurable, and which are prone to emphasize name recognition rather than measurable parameters. It is no wonder the same Anglo-American universities always end up on top, simply because they enjoy much higher name recognition. Further, the article does not reveal WHO is surveyed - surely the selection of those surveyed will also have a direct impact on the results of the ranking.
Ibrahim J. Barrie 8 October, 2009
Please explain to me how Utrecht University and the University of Amsterdam come to beat, respectively, VU University Amsterdam and Delft University of Technology. Knowing what I know, I just cannot see the reasons.
Daniel Dabney 8 October, 2009
Let's see--U.C. Berkeley is: 2nd in engineering and IT; 5th in life sciences; 3rd in natural sciences; 2nd in social sciences; 4th in arts and humanities; and 39th overall. Hmmm. They must have a really lousy football team.
Kaun 8 October, 2009
Science is not a bag of potatoes to be weighed exactly :)
Christina Catana 8 October, 2009
Where exactly do I find the evaluated factors? I have the impression that this ranking combines economic and scientific factors in a way that does not express academic excellence as I would define it. Are the criteria and sources of information evaluated for the ranking available somewhere?
Jonathan Davies 8 October, 2009
These rankings are so flawed. Once again you have incorrectly calculated the citation score for the LSE, and as a result it is languishing in the 60s. How can a world-class institution such as the LSE, widely regarded in the UK as superior to UCL (which is 4th), continue to be so low, especially as the LSE has even pointed out the grave error to you in written correspondence? It was 11th a few years back. Can a university dropping 50-odd places really be possible? Sort it out, as it is a sort of cheating on your part.
a joke 8 October, 2009
This is a joke - 40% of the score comes from a survey with 9,386 responses? Just a joke, a BRITISH JOKE. LOL
Ovi 8 October, 2009
Stanford #16 ? UC Berkeley ranks lower than Sydney, ANU ?? hahah a nice joke indeed.
Nick 8 October, 2009
THES university rankings have always been a joke. When it comes to research, the ranking produced by Shanghai Jiao Tong University is rightly regarded as the most authoritative. They give much more weight to citations and avoid fuzzy data like questionnaires. There are only two UK universities in the top 20 in their rankings. When it comes to student satisfaction, I wonder how THES consistently avoids using the most obvious input - the students themselves. If they don't trust the students to be able to evaluate their universities, they could at least use a meaningful proxy like self-reported starting salaries (as is traditionally done by US MBA schools). Also, there seems to be, over the years, a consistent UK/Commonwealth bias when it comes to the survey among academics. I wonder how the sample is constructed. As a result, the ranking is full of obvious nonsense like UCL's 4th place and Imperial's 5th place vs. Stanford's 17th and Berkeley's 36th (!). It is as if the survey had taken place in Fawlty Towers.
Nick 8 October, 2009
Another reason why the Shanghai Jiao Tong University (SJTU) ranking is better reputed among academics than that of THES is that the Chinese universities fare rather badly in it, so there does not seem to be any behind-the-scenes tweaking of factor weights. The main difference between the SJTU ranking and that of THES is the latter's excessive reliance on fuzzy and easily manipulable data like surveys. You can get almost any result by tweaking the sample of survey respondents. And, incidentally, the UK universities appear to be doing better in this ranking than in any other. I'd suggest that THES simply runs a survey on its page asking one simple question: "Do you think there is a UK bias in this ranking?" I think the result would speak for itself.
Logan 8 October, 2009
In fact, all rankings are very subjective, especially when commercially sponsored. If you opened the rankings to a public poll, I believe all the Chinese universities would be on top. However, I really value this ranking as a parent. Good work done anyway.
Kevin 8 October, 2009
This is a useful ranking, particularly given the global nature of today's tertiary education. The THES rankings are especially meaningful because of their peer-review element. It is odd that Nick promotes the Shanghai ranking as better reputed among academics (it most certainly is not). The journal Scientometrics recently published papers showing that the Shanghai rankings are not even reproducible using the actual raw data and methods of Liu and Cheng. Obviously, all such tables will have limitations, but their utility is quite obvious. By the way: well done, Princeton - back in the top ten!
