Evaluation Research and Engineering Education
Lesley Jolly
For AaeE ERM wiki at

What is evaluation research?
- Periodic assessment of results: appropriateness, effectiveness, efficiency, impact, sustainability
- Identifies intended and unintended effects
- Identifies what worked and what didn't
- Provides a level of judgement about the overall worth of an intervention

What has it got to teach us?
- A systematic approach to what we're doing already
- Extracting lessons learned
- A legacy of sustained evaluation frameworks for ongoing data collection
- Arguments against 'popularity contest' course and program evaluation

Step 1: describe the program
- Aims and objectives
  - Need to be clearly articulated
  - Does everyone involved share the same aims?
- Program logic diagram (may be called a logframe)
  - Describes how we think the program produces results

Program logic identifies what we need to evaluate

Evaluating PBL tutor training: program logic
INPUTS: training materials; staff time; staff support workshops; rooms/space; library support [no administrative support]; published research
ACTIVITIES: compiling materials; putting materials online; running workshop; staff research; review and edit materials; staff reflection
OUTPUTS: numbers of new tutors; numbers of returning tutors; level of satisfaction; growing library of examples and materials
OUTCOMES: tutors facilitate rather than inform; tutors understand PBL; tutors develop knowledge independently; tutors make PBL-appropriate contributions to curriculum review
IMPACTS: students take responsibility for learning; students learn to focus research and synthesise results
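One way to make a logic model like this workable is to hold it as a simple data structure, so each outcome can be checked for a matching indicator. A minimal Python sketch; the dict layout and the printed check are illustrative assumptions, not part of the original slides:

```python
# Illustrative sketch: the PBL tutor-training logic model as a plain dict.
# Stage names and entries follow the slide; how the model is stored and
# queried is an assumption about one possible workflow.
pbl_logic_model = {
    "inputs": [
        "Training materials", "Staff time", "Staff support workshops",
        "Rooms/space", "Library support", "Published research",
    ],
    "activities": [
        "Compiling materials", "Putting materials online", "Running workshop",
        "Staff research", "Review and edit materials", "Staff reflection",
    ],
    "outputs": [
        "Numbers of new tutors", "Numbers of returning tutors",
        "Level of satisfaction", "Growing library of examples and materials",
    ],
    "outcomes": [
        "Tutors facilitate rather than inform", "Tutors understand PBL",
        "Tutors develop knowledge independently",
        "Tutors make PBL-appropriate contributions to curriculum review",
    ],
    "impacts": [
        "Students take responsibility for learning",
        "Students learn to focus research and synthesise results",
    ],
}

# Each outcome is a candidate evaluation focus; listing them makes the
# "what do we need to evaluate?" question concrete.
for outcome in pbl_logic_model["outcomes"]:
    print("Needs an indicator:", outcome)
```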

Monitoring facilitates data collection
- Can be process (outputs) or impact (short- to medium-term outcome) monitoring
- Need to develop indicators of progress
- Targets may be included in indicators or kept separate, e.g.
  - "In 2010, 50% of 2nd-years will have used the new facility for more than 20 hrs"
  - OR: percentage of students using the facility (indicator); 50% in 2010 (target)
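The indicator/target split is easy to operationalise. A hedged sketch, assuming per-student usage records are available; only the 20-hour threshold and the 50% target come from the slide, the data and names are invented:

```python
# Sketch: check the slide's example indicator against its target.
# usage_hours maps each 2nd-year student to hours spent in the new
# facility; the values here are made up for illustration.
usage_hours = {"s01": 25.0, "s02": 8.5, "s03": 31.0, "s04": 12.0}

MIN_HOURS = 20.0   # "used the facility for more than 20 hrs"
TARGET = 0.50      # target: 50% of 2nd-years (by 2010, per the slide)

users = sum(1 for hrs in usage_hours.values() if hrs > MIN_HOURS)
indicator = users / len(usage_hours)   # percentage of students using facility

print(f"Indicator: {indicator:.0%} of 2nd-years exceed {MIN_HOURS} hrs")
print("Target met" if indicator >= TARGET else "Target not yet met")
```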

PBL tutor training
GOAL: provide timely, well-placed, supportive guidance to encourage tutors to scaffold and facilitate student learning
OBJECTIVES: at the end of training the successful tutor will be able to
1. articulate a good understanding of the objectives and methods of PBL
2. guide student learning through providing appropriate support and guidance rather than information
3. contribute to curriculum development within a staff team

Monitoring tutor training
Plan columns: FOCUS | PERFORMANCE INDICATORS | DATA SOURCES | DATA COLLECTION METHODS | RESPONSIBILITY FOR COLLECTION | TIME FRAME

Objective 1: tutors articulate a good understanding of the objectives and methods of PBL
- Outcomes: tutors facilitate rather than inform | tutor contributions to discussion | online discussion lists | content analysis of lists | course co-ordinator | twice per semester
- Outputs: level of satisfaction | Likert item "training has clarified purposes of PBL" | exit survey | paper survey post-training | trainer | every semester

Objective 2: tutors guide student learning through providing appropriate support and guidance rather than information
- Outcomes: tutors facilitate rather than inform; students take responsibility for learning | tutor contributions to discussion; student contributions to discussion | online discussion lists | content analysis of lists | course co-ordinator | twice per semester
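The "content analysis of lists" method could begin as simple coding of tutor posts. A toy sketch, assuming posts arrive as plain strings and using an invented keyword scheme to separate facilitating moves (questions, prompts) from informing moves (direct answers); a real study would code against an agreed frame with inter-rater checks:

```python
# Toy content-analysis sketch for the "tutors facilitate rather than
# inform" indicator. The keyword lists and example posts are invented;
# real coding would be done against an agreed coding frame.
FACILITATING_CUES = ("what do you think", "how might", "why", "?")
INFORMING_CUES = ("the answer is", "you should", "the formula is")

def code_post(post: str) -> str:
    """Crudely label a tutor post as 'facilitating', 'informing' or 'other'."""
    text = post.lower()
    if any(cue in text for cue in INFORMING_CUES):
        return "informing"
    if any(cue in text for cue in FACILITATING_CUES):
        return "facilitating"
    return "other"

posts = [
    "What do you think the load path is here?",
    "The answer is 42 kN; you should use that.",
    "How might you test that assumption?",
]

counts = {"facilitating": 0, "informing": 0, "other": 0}
for post in posts:
    counts[code_post(post)] += 1

print(counts)  # {'facilitating': 2, 'informing': 1, 'other': 0}
```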

Evaluation asks formative and summative questions
- Are the questions cohesive and logical?
- Do evaluation questions link to monitoring data?
- Have ethical issues been addressed?
- What mechanisms are in place to gather the learnings generated by evaluation?
- What needs to be retained from this evaluation process in future years?

Vary questions to suit the program
Template columns: QUESTIONS | SOURCES OF INFORMATION FROM MONITORING | SOURCES OF INFORMATION FROM EVALUATION | DATA COLLECTION | DATA ANALYSIS AND REPORTING METHODS
Example questions:
- Is the program as implemented appropriate, and is it being appropriately monitored?
- Has the purpose of the program been achieved?
- Is the program being implemented in the most efficient and cost-effective way possible?
- Has the program produced sustainable results and outcomes?

PBL tutor training
- Appropriateness: did the training model the target behaviour?
- Has the purpose of the training been achieved?
  - Indicator 1: changes in tutor behaviour indicating deeper knowledge of and commitment to PBL
  - Indicator 2: changes in student behaviour
- Efficiency: was the time invested by staff good value?
- Sustainability: what needs to be retained as core material from year to year to retain the benefit of training?
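Indicator 1 implies comparing coded tutor behaviour across training rounds. A small follow-on sketch to the coding example above; all counts are invented for illustration:

```python
# Sketch: track the facilitating/informing ratio across semesters as a
# crude measure of "changes in tutor behaviour". Counts are invented.
coded_counts = {
    "2009-S2": {"facilitating": 14, "informing": 22},
    "2010-S1": {"facilitating": 25, "informing": 15},
}

for semester, counts in sorted(coded_counts.items()):
    total = counts["facilitating"] + counts["informing"]
    share = counts["facilitating"] / total
    print(f"{semester}: {share:.0%} of coded tutor posts were facilitating")
```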

Making use of evaluation
Owen, J. (2006). Program Evaluation. Crows Nest: Allen & Unwin.

Dissemination and reporting
- Findings may be communicated throughout the project to multiple audiences: evidence, conclusions, judgements, recommendations
- Different occasions call for different styles: oral, interactive workshops, posters, reports, summaries, papers, conference presentations
- Must be well timed

Developing capacity
- "Process use of evaluation"
  - Taking part develops skills
  - Taking part sensitises staff to issues
- Improved communication
  - Ongoing 360-degree dissemination
  - Development of local discourse