Content of review 1, reviewed on July 20, 2017
The authors have overlooked the critiques of the QS and THE rankings made by several authors. For instance, Huang (2012) wrote that “QS Rankings might lack validity….regional bias could affect the ranking results greatly…the results of QS Rankings were impacted heavily by the number of return questionnaire from each country. Because of the home bias, the number of return questionnaires from the UK ranked the second highest, and the performance of its universities was also outstanding…In the process of calculating return questionnaire for university reputation, QS Rankings failed to control the number and qualification of questionnaire, thus leading to a selection bias”. On the other hand, Marginson (2014) used eight criteria to judge global rankings with respect to the social sciences. Six criteria relate to social science quality and two concern behavioural effects. The social science criteria are materiality, objectivity, externality, comprehensiveness, particularity and ordinal proportionality. The behavioural criteria are the alignment of the ranking with tendencies towards improved performance of all institutions and countries, and transparency. The QS ranking shows weak performance on five criteria, medium-weak on two and medium on one, making it the worst-performing ranking with respect to the social sciences; the second worst is the THE ranking.
Huang, M. H. (2012). Opening the black box of QS World University Rankings. Research Evaluation, 21(1), 71-78.
Marginson, S. (2014). University rankings and social science. European Journal of Education, 49(1), 45-59.
© 2017 the Reviewer (CC BY 4.0).