Thursday, March 30, 2006

Friendly advice to new UM VC

Since the current hot topic is the appointment of a new UM VC, perhaps a few words of friendly advice are in order. And since the previous VC was (in)famous for bragging that UM was among the top 100, and later top 200, universities in the world, let us revisit those rankings.

I have to thank Dr. Richard Holmes, who sent me a link to his website on university rankings before my prelim exams, and I apologize to him for the delay in publishing this post. He has an excellent article, published in the Asian Journal of University Education, on the shortcomings of the THES ranking system, and I want to highlight some salient points for the benefit of readers, and perhaps for Dr. Sharifah Hapsah, if she ever gets to read this blog. (Dr. Holmes is currently teaching at MARA.)

His article notes that the firm tasked with compiling the rankings, a certain QS Quacquarelli Symonds, "does not seem to have any specialized knowledge of research and teaching in the natural and social sciences or the humanities". Rather, it is a company that specializes in the promotion of MBA programs and executive recruitment. The fact that it has offices in Washington DC, Paris, Beijing, Singapore, Tokyo and Sydney, Dr. Holmes suggests, can partly explain the bias of the rankings towards universities in certain countries.

This company's ignorance of university education worldwide is shown using two examples. The first is its mistake in coding non-Malays in Malaysian universities as foreigners, which this blog highlighted even before QS admitted the error. The second example Dr. Holmes gives is that the company listed "Beijing University" as the top university in Asia even though, strictly speaking, there is no such thing as "Beijing University". A simple Google search for "Beijing University" would reveal that it is actually "Peking University", and that there are a number of specialist "Beijing" universities in different fields which are not associated with Peking University, the premier university in China.

In addition, "QS's managing director, Nunzio Quacquarelli, is on record as telling a meeting in Malaysia that the reason for the contrast between Beijing University's stellar score on the peer review and its score of zero for citations of research was that 'they probably published their work in Mandarin but we just couldn't find the journals' (New Straits Times, 22/11/2005). Had they looked for research from Peking University, which is how researchers describe their affiliation in academic journals, they would have found quite a bit. It looks as though some people in QS were unaware of the university's official name." (Holmes, 2006) This makes me wonder about the ability of QS to conduct a survey of this nature. Did they offer a good 'rate' to the THES? Was it a quid pro quo? Why not ask a survey firm such as AC Nielsen, which has offices in many more countries than QS and would presumably be more experienced in conducting surveys of this nature? (My suspicion is that these larger, global survey firms were probably too expensive.)

Dr. Holmes points out that the peer review category, which constitutes 40% of the overall score, is the most problematic of all the categories used in the THES ranking. It lacks transparency, especially with regard to the selection and sample size of the participants in the peer review survey.
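To make concrete how much leverage one opaque category has, here is a minimal sketch of a weighted composite score of the kind THES publishes. Only the 40% peer-review weight is stated in this post; the other category names and weights below are hypothetical placeholders for illustration, not the actual THES methodology.

```python
# Sketch of a weighted composite ranking score.
# Only the 0.40 peer-review weight is taken from the article; the other
# categories and weights are HYPOTHETICAL placeholders for illustration.

WEIGHTS = {
    "peer_review": 0.40,       # stated in the article
    "recruiter_rating": 0.10,  # hypothetical
    "citations": 0.20,         # hypothetical
    "faculty_student": 0.20,   # hypothetical
    "international": 0.10,     # hypothetical
}

def composite_score(scores):
    """Weighted sum of per-category scores, each on a 0-100 scale."""
    return sum(WEIGHTS[cat] * scores.get(cat, 0) for cat in WEIGHTS)

# A university with a stellar peer review but zero citations (the
# "Beijing University" case above) still earns a hefty composite score:
example = {"peer_review": 100, "citations": 0,
           "recruiter_rating": 50, "faculty_student": 50, "international": 50}
print(composite_score(example))  # → 60.0 (40 points from peer review alone)
```

With a 40% weight, the peer review alone can outweigh every other category combined with citations, which is why its opaque sampling matters so much.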

Dr. Holmes' criticism probably confirms what other people who have examined the rankings in depth feel - "It is difficult to avoid the suspicion that the peer review was based on convenience sampling, with QS simply asking those that they had come across during their consultancy activities. This would explain the presence in the top 200 of several apparently undistinguished universities from France, Australia and China where the consultants have offices and the comparative scarcity of universities from Eastern Europe, Israel, Taiwan and Canada where they do not."

Dr. Holmes also makes a convincing point that the peer review "is not really an international ranking", since academics were asked "to name 'the top universities in the subject areas and the geographical regions in which they have expertise.' In other words Chinese physicists, we can only assume, were not asked to name the best university for physics in the world but to name the best university for, say, nuclear physics in Asia, maybe even just in China. If this is the case, these are not then world rankings."

Dr. Holmes raises many more criticisms of the THES survey, including the dubious methodology of the recruiter ratings, the questionable theoretical basis for using measures of international students and faculty, and the bias against the social sciences and humanities in the citation score. I'd encourage anyone interested in the THES rankings to read his article in depth, and I'd definitely encourage the new UM VC and the new Minister for Higher Education to do the same.

Finally, there are two ways in which Malaysian universities can improve their standing among the top universities in the world. One is easier; the other takes more effort and will entail painful institutional changes.

The easy way includes taking the following measures, assuming that QS continues to conduct the THES survey using a similar methodology:
1) Have a lot of tie-ups with other universities to offer a multitude of MBA courses. Better yet, have tie-ups with universities that are clients of QS. That is a surefire way to feature more prominently and positively on QS's radar screen.
2) Track down the academic 'experts' which QS uses for the peer review and offer them 'incentives' to rank UM highly.
3) Hire a bunch of foreign lecturers regardless of their qualifications.
4) Open up places in local varsities to foreign students regardless of their qualifications.
5) Have the local universities direct QS to employers that will rate the local universities favorably. Better yet, go directly to these employers and offer them 'incentives' to rate the local varsities highly.

The second way involves painful institutional changes but will ensure a genuine improvement in the quality of Malaysian universities in the medium to long term, regardless of the methodology used or the consultant employed to compile these rankings:
1) Hire, fire and promote lecturers based on academic work, using objective criteria such as publications in highly acclaimed journals or widely acknowledged books and research in the field.
2) Make appointments to positions of administrative leadership (VC, deputy VC, heads of departments) based on ability to improve academic standards and other objective criteria linked to academic standards.
3) Based on the above two recommendations, hiring and appointment policies should be race-blind.
4) Create incentives for raising private funds / donations to the local universities so that resources and infrastructure can be improved and better pay can be awarded to distinguished faculty members.
5) Create incentives for members of academia to work with the private sector on research projects, so as to obtain external funding as well as to leverage the expertise available in the private sector.

So, which path do you think is most likely to be taken?
