Measuring Quality Outcomes in Higher Education
Simon Marginson, Professor of International Higher Education; Director, ESRC/HEFCE Centre for Global Higher Education (from Nov 2015), UCL Institute of Education, University College London, UK
Australian Government / HEFCE / European Commission, 16 October 2015, Singapore
Coverage
- High Participation Systems (HPS) of higher education: participation and system stratification
- Research environment
- What are the systemic drivers of quality operating at global level?
- Present global rankings
- The lacuna: credible comparative metrics on learning in higher education
- Problems of proxies and indirect measures
- The post-AHELO landscape
A time of movement and change in world higher education
Participation is growing at 1% a year
[Chart: Gross Tertiary Enrolment Ratio (GTER), 1970-2012 (UNESCO 2015). The world GTER rose from 10% in 1970 to 32% in 2012; series shown for the World, North America and Western Europe, and Sub-Saharan Africa.]
Same trend across the globe: GTER by world region, 1995 and 2012 (%)

Region                           1995   2012
North America & Western Europe     60     79
Central & Eastern Europe           33     71
Latin America & Caribbean          17     43
East Asia & Pacific                10     31
Arab States                        14     26
Central Asia                       23     25
South & West Asia                   6     23
Sub-Saharan Africa                  4      8
WORLD                              15     32
Growth in participation to come
[Chart: World and Asian middle class, 2009-2030, billions (Brookings/OECD projection, 2010). World: 1.85 (2009), 3.25 (2020), 4.88 (2030). Asia: 0.53 (2009), 1.74 (2020), 3.23 (2030).]
Middle-class persons are defined as persons living on USD 10-100 per day, PPP
Tendency towards bifurcation and stratification of HPS
- Elite HEIs: student-selecting
- Middle sector: subject to an upward push (aspirations) and a downward pull (scarcity of resources and status)
- Non-elite HEIs: demand-absorbing
Stratifying effects in system design
- Competition; the World-Class University movement
- Under-regulation of quality in mass HE sectors
- More fragmented and diverse offerings, cross-border etc.
- Indifference to equity issues (access to elite HEIs, cognitive formation in the mass sector, etc.)
BUT POLICY COUNTER-WEIGHTS:
- Horizontal system design features, system architectures
- Common mission designations ('research university')
- Sector-wide promotion nationally and offshore
- Funding parcels and nuanced specialisations that sustain the middle sector
Students enrolled outside their country of citizenship, millions, 1975-2012 (OECD data, 2014)
1975: 0.8 | 1980: 1.1 | 1985: 1.1 | 1990: 1.3 | 1995: 1.7 | 2000: 2.1 | 2005: 3.0 | 2010: 4.1 | 2012: 4.5
Students enrolled outside their country of citizenship: shares by host country (OECD data, 2014)
USA 16%, UK 13%, Australia 6%, Germany 6%, France 6%, Canada 5%, Russia 4%, Japan 3%, Spain 2%, China 2%, New Zealand 2%, Italy 2%, Austria 2%, South Africa 2%, Netherlands 1%, Korea 1%, Switzerland 1%, Belgium 1%; other OECD 8%, other non-OECD 17%
Leiden Ranking: universities ranked 1-15 in the world

Rank  University                    Country  Papers 2010-13  % in top 10%  Top 1%  Top 10%
1     Harvard U                     USA          31,137          22.1       1,026   6,892
2     Stanford U                    USA          14,102          21.9         442   3,083
3     U Toronto                     Canada       19,948          13.7         289   2,738
4     U Michigan                    USA          17,283          15.1         264   2,616
5     U California, Berkeley        USA          11,804          21.8         360   2,573
6     MIT                           USA          10,040          24.8         400   2,486
7     U California, Los Angeles     USA          14,002          17.4         301   2,438
8     Johns Hopkins U               USA          14,850          15.8         293   2,348
9     U Oxford                      UK           12,935          17.8         293   2,301
10    U Washington, Seattle         USA          13,716          16.6         267   2,276
11    U Pennsylvania                USA          12,649          17.2         269   2,178
12    U California, San Diego       USA          11,707          18.1         276   2,124
13    U Cambridge                   UK           12,170          17.3         279   2,100
14    Columbia U                    USA          11,807          17.5         261   2,064
15    U California, San Francisco   USA          10,199          19.8         264   2,017
Stronger research universities in Europe

European universities in ARWU top 80 in 2004 (14):
27 Fed Instit Tech Zurich SWITZERLAND; 39 U Utrecht NETHERLANDS; 41 Paris 6 P&M Curie FRANCE; 45 TU Munich GERMANY; 46 Karolinska Instit SWEDEN; 48 Paris 11 Sud FRANCE; 51 U Munich GERMANY; 57 U Zurich SWITZERLAND; 59 U Copenhagen DENMARK; 63 Leiden U NETHERLANDS; 68 U Oslo NORWAY; 72 U Helsinki FINLAND; 74 Uppsala U SWEDEN; 79 U Goettingen GERMANY

European universities in ARWU top 80 in 2015 (19):
20 Fed Instit Tech Zurich SWITZERLAND; 35 U Copenhagen DENMARK; 36 Paris 6 P&M Curie FRANCE; 41 Paris 11 Sud FRANCE; 46 Heidelberg U GERMANY; 48 Karolinska Instit SWEDEN; 51 TU Munich GERMANY; 52 U Munich GERMANY; 54 U Zurich SWITZERLAND; 56 U Utrecht NETHERLANDS; 58= U Geneva SWITZERLAND; 58= U Oslo NORWAY; 61= Uppsala U SWEDEN; 67= U Helsinki FINLAND; 71 Ghent U BELGIUM; 72 Ecole Normale Superieure FRANCE; 73 Aarhus U DENMARK; 75 U Groningen NETHERLANDS; Stockholm U SWEDEN
Dynamism in China and Singapore: top 10% papers, 2006-09 to 2010-13 (Leiden)
[Chart: numbers of top-10% papers over rolling four-year windows (2006-09 through 2010-13) for NU Singapore, Nanyang TU, Tsinghua U, Zhejiang U and Peking U; data in the table on the next slide.]
Growth in top 10% papers, 2006-09 to 2010-13

University         System     2006-09  2007-10  2008-11  2009-12  2010-13  Growth
NU Singapore       SINGAPORE     1042     1094     1173     1264     1374   31.9%
Nanyang TU         SINGAPORE      568      640      776      910     1103   94.2%
Tsinghua U         CHINA          819      875      953     1031     1217   48.6%
Zhejiang U         CHINA          730      780      896      994     1182   61.9%
Peking U           CHINA          622      705      773      867     1026   65.0%
Shanghai JT U      CHINA          664      698      771      901     1020   53.6%
Fudan U            CHINA          469      536      638      727      891   90.0%
U S&T China        CHINA          503      509      536      576      675   34.2%
U Hong Kong        HONG KONG      558      578      622      643      661   18.5%
Seoul National U   KOREA          742      768      812      911      984   32.6%
National Taiwan U  TAIWAN         604      613      647      660      691   14.4%
MIT                USA           2091     2142     2260     2391     2486   18.9%
U Cambridge        UK            1796     1867     1975     2080     2100   16.9%
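The growth column above is simply the percentage change between the first (2006-09) and last (2010-13) four-year windows. A minimal sketch of that arithmetic, using a few of the figures from the table (the dictionary and function names are illustrative, not from the slides):

```python
# Illustrative reproduction of the "growth" column: percentage change in
# top-10% papers between the 2006-09 and 2010-13 windows.
counts = {
    "NU Singapore": (1042, 1374),
    "Nanyang TU": (568, 1103),
    "Tsinghua U": (819, 1217),
    "MIT": (2091, 2486),
}

def growth_pct(first: int, last: int) -> float:
    """Percentage growth from the first window to the last, one decimal place."""
    return round((last - first) / first * 100, 1)

for name, (first, last) in counts.items():
    print(f"{name}: {growth_pct(first, last)}%")
```

Running this reproduces the corresponding growth figures in the table (e.g. 31.9% for NU Singapore, 94.2% for Nanyang TU).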
What drives quality?
- Top-end competition for research rankings (which takes in only the top 500, i.e. T1 universities and aspirants) generates a continuous increase in paper outputs and improved citation rates
- Competition for foreign students, plus national QA and management of student satisfaction, generates continuous improvement in services for foreign students
- National competition, QA and management of student satisfaction generate improvement in student servicing in some HPS, but the absence of a global dynamic means this is weaker than the research effects
- Nothing drives continuous improvement in student learning
Global rankings and quality
- ARWU: big science. It works. Leiden and Scimago allow fine-tuning of research management (REAL METRICS MATTER)
- Times Higher: more than two-thirds research-driven
- QS and Times Higher surveys: no link to performance. No inherent quality driver (there's a marketing driver)
- The multi-indicator rankings of QS and Times Higher are incoherent, with no validity and no link to performance (separate indicators would be OK)
- Teaching proxies are staffing ratios (???) and surveys of who is good at teaching (?????? Why do we tolerate this?)
Lacuna in learning measures in HE
- Contrast the impact of PISA as a driver of performance in schooling
- No transparency in the learning function
- No useful information to guide choice
- No performance drivers for T2 and T3 HEIs
- In the context of research-led competition and the absence of learning achievement measures, talk about parity (or primacy) of teaching is vapid
- Proxies create wrong incentives, and we have no way of telling whether they create right incentives
- Real, grounded metrics matter. Nothing else works
Proxies don't get us there
- Resource measures: no necessary relation to quality
- Graduate employment data are important but do not give us learning quality (human capital theory is a metaphor, not an individualisable technology, and it cannot be reverse-engineered)
- Indirect outcomes assessment (student surveys of learning behaviour, student satisfaction, student engagement, etc.) is a small part of what we need, BUT
- Management by satisfaction drives down cognitive formation and standards (Arum & Roksa 2014; Armstrong & Hamilton 2014): most students now believe it is not what you know but who you know
- Cognitive formation does not sit well with consumer sovereignty but is crucial to the bright poor student
Measuring learning outcomes
- Three aspects:
  - general cognitive formation (CLA-type tests)
  - discipline-based learning
  - work-related generic skills, nested in each occupation
- Robust faculty cultures are needed to build discipline measures and work-skill measures on a cross-border basis
- Measures cannot provide holistic/total outcomes
- Use separate indicators and resist the pressure for combined indicator-based tables
- Don't combine with research measures in single tables (unless through user customisation)
Post-AHELO landscape: moving on
- The individual university route can't work: it is like governing on the basis of a UN Security Council with 100 permanent members who all want to say no
- Governments are crucial, but only some can deliver
- Move on CLA-type competency tests in advance of the rest of the package
- Move in selected countries/regions (East Asia/Singapore?), but it is crucial to include WCUs