Testing the validity of indicators in the field of education – The experience of CRELL. Rome, October 3-5, 2012, Improving Education through Accountability and Evaluation: Lessons from Around the World.

Presentation transcript:

Testing the validity of indicators in the field of education – The experience of CRELL
Rome, October 3-5, 2012, Improving Education through Accountability and Evaluation: Lessons from Around the World
Andrea Saltelli, European Commission, Joint Research Centre

CRELL – Centre for Research on Lifelong Learning based on indicators and benchmarks. DG Education and Culture + Joint Research Centre, since …

Foci of econometric research at the JRC, Ispra – CRELL:
- Trajectories to achieving EU 2020 objectives
- Employability and other benchmarks (mobility, multilingualism)
- Labour market outcomes

Foci of econometric research at the JRC, Ispra:
- Counterfactual analysis and other impact assessment methodologies
- Regional studies (competitiveness, innovation, well-being)
- Composite indicators and social choice

Indicators

Context: knowledge in support of policy; evaluation and impact assessment, but also advocacy.
Caveat: validity = plausibility, defensibility … and not ‘proof of truth’.

Sensitivity analysis
When testing the evidence, some reasonable people (and guidelines) suggest that ‘sensitivity analysis would help’. The JRC has fostered the development and uptake of sensitivity analysis (20 years of papers, schools and books). Today we call it sensitivity auditing and teach it within the syllabus for impact assessment run by the Secretariat-General.

Sensitivity analysis
[Figure: ‘How to shake coupled stairs’ versus ‘How coupled stairs are shaken in most of the available literature’]
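To make the contrast in the figure concrete, here is a minimal Python sketch (not from the slides; the toy model and numbers are assumptions) in which a model with an interaction term is explored one factor at a time around a baseline and then globally. The interaction, and the variance it carries, only shows up when the factors are moved together.

```python
# Minimal sketch: one-at-a-time (OAT) versus global exploration of a toy model.
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2):
    # Toy model: the x1*x2 term is invisible when factors are moved one at a
    # time around the baseline (0, 0).
    return x1 + x2 + 5.0 * x1 * x2

# OAT: vary each factor alone over [-1, 1], keeping the other fixed at 0.
grid = np.linspace(-1, 1, 101)
oat_var = np.var(model(grid, 0.0)) + np.var(model(0.0, grid))

# Global: vary both factors together over the whole input space.
x1, x2 = rng.uniform(-1, 1, 10_000), rng.uniform(-1, 1, 10_000)
global_var = np.var(model(x1, x2))

print(f"variance seen by OAT:       {oat_var:.2f}")
print(f"variance seen by global SA: {global_var:.2f}")  # larger: interaction counted
```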

Sensitivity analysis
Testing (composite) indicators: two approaches
- Michaela Saisana, Andrea Saltelli and Stefano Tarantola (2005). Uncertainty and sensitivity analysis techniques as tools for the quality assessment of composite indicators. J. R. Statist. Soc. A 168(2), 307–323.
- Paolo Paruolo, Michaela Saisana and Andrea Saltelli. Ratings and rankings: voodoo or science? J. R. Statist. Soc. A 176(2), 1–26.

Sensitivity analysis
First: the invasive approach
Michaela Saisana, Béatrice d’Hombres and Andrea Saltelli. Rickety numbers: Volatility of university rankings and policy implications. Research Policy (2011), 40.

ROBUSTNESS ANALYSIS OF SJTU AND THES

SJTU: SIMULATED RANKS – TOP 20
- Harvard, Stanford, Berkeley, Cambridge, MIT: top 5 in more than 75% of our simulations.
- Univ. California SF: original rank 18th, but could be ranked anywhere between the 6th and 100th position.
- Impact of assumptions: much stronger for the middle-ranked universities.
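To illustrate how such simulated ranks can be produced, here is a minimal Python sketch of the invasive approach on hypothetical data (not the authors' code and not the actual SJTU methodology): the weights and the aggregation rule of a toy composite indicator are perturbed in a Monte Carlo loop and each unit's plausible rank range is recorded.

```python
# Minimal sketch: rank robustness of a toy composite indicator under
# perturbed weights and alternative aggregation rules (hypothetical data).
import numpy as np

rng = np.random.default_rng(42)

scores = rng.uniform(size=(20, 4))            # 20 units x 4 normalized indicators
nominal_weights = np.array([0.4, 0.3, 0.2, 0.1])

def ranks(composite):
    # Rank 1 = highest composite score.
    return composite.argsort()[::-1].argsort() + 1

simulated = []
for _ in range(5_000):
    # Perturb the weights around their nominal values, then renormalize.
    w = np.abs(nominal_weights + rng.normal(scale=0.1, size=4))
    w /= w.sum()
    # Switch randomly between linear and geometric aggregation.
    composite = scores @ w if rng.random() < 0.5 else np.prod(scores ** w, axis=1)
    simulated.append(ranks(composite))

simulated = np.array(simulated)                      # shape: (runs, units)
lo, hi = np.percentile(simulated, [5, 95], axis=0)   # 90% plausible rank range
for unit, (l, h) in enumerate(zip(lo, hi)):
    print(f"unit {unit:2d}: rank between {int(l)} and {int(h)} in 90% of runs")
```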

THES: SIMULATED RANKS – TOP 20
- Impact of uncertainties on the university ranks is even more apparent.
- MIT: ranked 9th, but confirmed only in 13% of simulations (plausible range [4, 35]).
- Very high volatility also for universities ranked in the 10th–20th positions, e.g. Duke Univ., Johns Hopkins Univ., Cornell Univ.

Sensitivity analysis
Second: the non-invasive approach
Comparing the weights as assigned by developers with ‘effective weights’ derived from sensitivity analysis.

University rankings: comparing the internal coherence of ARWU versus THES by testing the weights declared by developers against ‘effective’ importance measures.
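As an illustration of the idea, the Python sketch below uses hypothetical data and a crude binned estimator (rather than the estimator used in the Paruolo et al. paper) to compute ‘effective weights’ as normalized first-order sensitivity indices S_i = V(E[y|x_i]) / V(y), and compares them with the weights declared by a hypothetical developer.

```python
# Minimal sketch: declared weights versus 'effective weights' estimated as
# first-order sensitivity indices from the published composite score.
import numpy as np

rng = np.random.default_rng(1)

n = 500
x = rng.normal(size=(n, 3))            # hypothetical indicators for 500 units
x[:, 1] += 0.8 * x[:, 0]               # indicators 0 and 1 are correlated
declared_weights = np.array([0.5, 0.3, 0.2])
y = x @ declared_weights               # the published composite score

def first_order_index(xi, y, bins=20):
    # S_i = V(E[y|x_i]) / V(y): variance of the bin-wise conditional means of y.
    edges = np.quantile(xi, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

effective = np.array([first_order_index(x[:, i], y) for i in range(3)])
effective /= effective.sum()           # normalize for comparison with declared weights
print("declared :", declared_weights)
print("effective:", np.round(effective, 2))  # correlation inflates indicator 1's share
```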

The JRC fosters the development of good practices for the construction of aggregated statistical measures (indices, composite indicators). Sixty analyses (Michaela Saisana, JRC). Partnerships with OECD, WEF, INSEAD, WIPO, UN-IFAD, FAO, Transparency International, World Justice Project, Harvard, Yale, Columbia …

Something worth advocating for (1): more use of social choice theory methods, both for building meaningful aggregated indicators … (a pity that methods already available between the end of the XIII and the XV century are neglected by most developers) … and for comparing options in the context of impact assessment studies. → Course at JRC Ispra, October 11-12.
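As a minimal illustration of what a social-choice style aggregation can look like, the Python sketch below (hypothetical scores and options; a plain Borda-type count, one of the rank-based methods alluded to above) ranks options by summing the ranks they obtain on each criterion instead of averaging their scores.

```python
# Minimal sketch: Borda-count aggregation of per-criterion rankings
# (hypothetical data).
import numpy as np

rng = np.random.default_rng(7)

scores = rng.uniform(size=(6, 4))   # 6 options (units, policy options) x 4 criteria

# For each criterion, the worst option gets 0 points and the best gets n-1.
borda_points = scores.argsort(axis=0).argsort(axis=0)
total = borda_points.sum(axis=1)

for rank, option in enumerate(np.argsort(-total), start=1):
    print(f"rank {rank}: option {option} with {total[option]} Borda points")
```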

Useful links – Econometrics and Applied Statistics Unit (sensitivity analysis, composite indicators, impact assessment):
- Sensitivity analysis:
- Sensitivity auditing: Presentations/Saltelli-final-February-1-1.pdf
- Quality of composite indicators: