Davide Azzolini, FBK-IRVAPP; Enrico Rettore, FBK-IRVAPP; Antonio Schizzerotto, FBK-IRVAPP
MENTEP Kick-Off Meeting, Brussels, 13-14 April 2015

The MENTEP evaluation: a quick overview Day 1

The evaluation question Does the Technology-Enhanced Teaching Self-Assessment Tool (TET-SAT) really improve teachers’ TET self-assessment capabilities and, ultimately, increase their TET competencies? NB: The aim is to evaluate the tool, NOT the teachers, nor the schools!

The basics of the MENTEP evaluation design
Counterfactual approach: compare a group of teachers who use the TET-SAT to an equivalent group of teachers who do not use it.
The two groups of teachers will be identified so that:
- the two groups are truly comparable (no «apples vs oranges» comparison!)
- the “no one forced, no one denied” principle is satisfied
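In potential-outcomes notation (a standard way to state this idea, not terminology from the slides), write Y(1) and Y(0) for a teacher’s TET competence with and without use of the tool. The quantity of interest is the average effect

\[
\tau = E[\,Y(1) - Y(0)\,]
\]

and a comparison of users with non-users recovers it only if the two groups are truly comparable, which is exactly what the randomised design described below is meant to guarantee.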

a) Sampling
A random sample of schools will be selected from the relevant population; then a random sample of teachers will be drawn within each school.
Reference population: only schools with adequate ICT equipment levels will be considered; the education level is still to be decided.
Target sample size: approximately 1,000 teachers per country over no fewer than 50 schools (ad hoc solutions for small countries).
Incentives for participation in the experiment will be offered (e.g., a lottery in which one teacher per country wins a trip to Brussels).
Oversampling of schools and teachers will be performed to cope with possible refusals. Moreover, administrative information (whenever available) will be used to check whether schools/teachers refusing to cooperate are systematically different from the others.
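An illustrative sketch of this two-stage draw, assuming a hypothetical school list with teacher rosters; the sizes and the oversampling rate are placeholder values, since the actual figures are still to be fixed:

```python
import random

def draw_two_stage_sample(schools, n_schools=50, teachers_per_school=20,
                          oversampling_rate=0.2, seed=2015):
    """Two-stage random sample: first schools, then teachers within each school.

    `schools` is assumed to map school_id -> list of teacher_ids.
    The oversampling rate inflates both stages to cope with possible refusals.
    """
    rng = random.Random(seed)
    n_schools_over = int(n_schools * (1 + oversampling_rate))
    sampled_schools = rng.sample(list(schools), k=min(n_schools_over, len(schools)))

    sample = {}
    for school_id in sampled_schools:
        roster = schools[school_id]
        k = int(teachers_per_school * (1 + oversampling_rate))
        sample[school_id] = rng.sample(roster, k=min(k, len(roster)))
    return sample

# Hypothetical usage with a fake national school list
schools = {f"school_{i}": [f"teacher_{i}_{j}" for j in range(40)] for i in range(300)}
sample = draw_two_stage_sample(schools)
print(len(sample), "schools;", sum(len(t) for t in sample.values()), "teachers")
```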

b) Benchmark and follow-up surveys
Benchmark survey (within the first two months of school year 2016/2017): before the intervention, all sampled teachers complete an online benchmark survey to (a) assess their TET competences and (b) collect a rich set of information on their educational and professional experience.
Follow-up survey (by the end of school year 2016/2017): after the intervention, sampled teachers re-assess their TET competences. This re-assessment is the outcome variable by which the intervention will be evaluated.

c) Encouragement letters and estimation method
The sampled schools are randomly split into two halves: ‘encouraged’ and ‘not encouraged’.
A random subgroup of teachers teaching in the ‘encouraged’ schools receives a set of encouragement letters explaining how to use the tool and why they should. All other teachers – both in the ‘encouraged’ and in the ‘not encouraged’ schools – receive no information.
Since not all teachers receiving the letters will actually use the tool, while some teachers not receiving a letter presumably will, a simple comparison of the two groups of teachers does not identify the causal effect of the tool. The encouragement letter is therefore used as an instrumental variable: it raises the probability of using the tool for those who receive it and allows the impact of the tool to be estimated.
We can assess the peer effect of the intervention by comparing teachers in the ‘encouraged’ schools who do not receive a letter to teachers in the ‘not encouraged’ schools.
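In standard instrumental-variable terms (a textbook formulation, not a specification taken from the slides), with Z the letter indicator, D actual use of the TET-SAT and Y the follow-up TET competence score, the impact of the tool corresponds to the Wald/IV estimand

\[
\beta_{IV} = \frac{E[\,Y \mid Z=1\,] - E[\,Y \mid Z=0\,]}{E[\,D \mid Z=1\,] - E[\,D \mid Z=0\,]},
\]

i.e., the effect of the letter on the outcome scaled by its effect on take-up. Under the usual IV assumptions, this identifies the effect of the tool for teachers whose use is induced by the letter (the “compliers”).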

First steps from here to June (…and afterwards) Day 2

1. Organizational model
The active involvement of National Authorities (NAs) is crucial to the success of MENTEP:
- to contribute to the building of the TET-SAT
- to obtain the complete list of schools and teachers
- to appoint National Coordinators who will be the direct contact with schools

2. TET-SAT surveys
The surveys will be administered to all sampled teachers at the beginning and at the end of school year 2016/2017.
The items will be identified through a matrix of “content areas” (e.g., digital communication, digital pedagogy, digital production, e-safety, etc.) and “types of skills” (frequency of use, level of awareness, knowledge, etc.). This would possibly allow us to assess the effects on different content and skill dimensions.
The list of content areas shall be suggested by the NAs, and the precise definition of the items by ICT/pedagogy experts.
It might be advisable to hide the ranking of the response options from the teachers. This would reduce the risk of “social desirability bias” in teachers’ answers. It could also make the tool more attractive, since teachers would receive a comprehensive assessment of their “skills” at the end without knowing the scoring in advance.
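Purely as an illustration of the item-matrix idea, using the example content areas and skill types listed above (the real lists will come from the NAs and the ICT/pedagogy experts):

```python
# Illustrative only: these are the example dimensions mentioned in the slide,
# not the final MENTEP item matrix.
content_areas = ["digital communication", "digital pedagogy",
                 "digital production", "e-safety"]
skill_types = ["frequency of use", "level of awareness", "knowledge"]

# Each cell of the matrix corresponds to one or more survey items.
item_matrix = [(area, skill) for area in content_areas for skill in skill_types]

for area, skill in item_matrix:
    print(f"{area} x {skill}")
```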

3. Identifying the reference population
Two main criteria:
1) Schools equipped with a sufficient ICT level (to be defined).
2) Focus on the same grades across the 12 countries (according to a very preliminary scrutiny, grades 6 to 8 could be the solution).

4. The lists of schools
NAs provide the list of all schools at the selected grades satisfying the required ICT level (including the schools’ contact details, e.g. e-mail addresses).
IRVAPP selects the random sample of schools. Within this sample, IRVAPP randomly assigns schools to the ‘encouraged’ and ‘not encouraged’ groups.
IRVAPP/EUN send the sample lists back to the NAs. NAs contact the schools to obtain the teachers’ lists.
A key prerequisite: availability of schools’ and teachers’ e-mail addresses (teachers with no e-mail address might not be eligible…as they presumably have no ICT skills).
The role of National Coordinators will be crucial for establishing contacts and coordinating the MENTEP schools.
NB: The next meeting (June) will be dedicated to all these «sampling» details!
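A minimal sketch of this assignment step, building on the hypothetical sample structure from the earlier sampling sketch; the 50/50 school split follows the encouragement-letter slide, while the share of teachers receiving a letter is a placeholder assumption:

```python
import random

def assign_encouragement(sample, letter_share=0.5, seed=2016):
    """Randomly split sampled schools into 'encouraged' / 'not encouraged' halves,
    then pick a random subgroup of teachers in encouraged schools to receive letters.

    `sample` maps school_id -> list of sampled teacher_ids (as in the sketch above).
    `letter_share` is a placeholder for the fraction of teachers who get a letter.
    """
    rng = random.Random(seed)
    school_ids = list(sample)
    rng.shuffle(school_ids)
    half = len(school_ids) // 2
    encouraged, not_encouraged = school_ids[:half], school_ids[half:]

    letters = {}
    for school_id in encouraged:
        teachers = sample[school_id]
        k = max(1, int(len(teachers) * letter_share))
        letters[school_id] = rng.sample(teachers, k=k)

    return encouraged, not_encouraged, letters
```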

Thank you for your attention Contact: