Elizabeth Godfrey

- Periodic assessment of results: appropriateness, effectiveness, efficiency, impact, sustainability
- Identifies intended and unintended effects
- Identifies what worked and what didn't
- Provides a level of judgement about the overall worth of the intervention
- Key to project improvement
- Influences decision-making or policy formulation through the provision of empirically driven feedback → sustainability

- Why is the evaluation being done?
- How will the information be used?
- What evaluation form and approach will be most suitable for this study?
  - Formative – monitoring progress to improve approaches
  - Summative – overall perspective; focus on the value or worth of the project, designed for accountability or continuation purposes

- Evaluation and assessment have many meanings… one definition:
  - Assessment – gathering evidence
  - Evaluation – interpreting data and making value judgements
- Examples of assessment and evaluation:
  - Individual's performance (grading)
  - Program's effectiveness (accreditation)
  - Project's progress and success (monitoring and validating)

- Effective evaluation starts with carefully defined project goals and expected outcomes
- Goals provide overarching statements of project intention
  - What is your overall ambition? What do you hope to achieve?
- Expected outcomes identify specific observable results for each goal
  - How will achieving your “intention” reflect changes in student behavior? How will it change their learning and their attitudes?

- Goals → Expected outcomes → Evaluation questions
- Questions form the basis of the evaluation process
- The evaluation process collects and interprets data to answer the evaluation questions
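This goal → outcome → question chain can be written down as a simple data structure so that every evaluation question traces back to a goal. The sketch below is only illustrative; the Goal and Outcome classes and the example text are invented for this illustration and are not part of the original material.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    statement: str  # specific, observable result
    questions: list[str] = field(default_factory=list)  # evaluation questions probing this outcome

@dataclass
class Goal:
    intention: str  # overarching statement of project intention
    outcomes: list[Outcome] = field(default_factory=list)

# Hypothetical example: one goal, one expected outcome, and the questions that will drive data collection
plan = Goal(
    intention="Improve students' conceptual understanding of statics",
    outcomes=[
        Outcome(
            statement="Students can draw correct free-body diagrams for 2D truss structures",
            questions=[
                "To what extent has this outcome been achieved (pre/post concept test)?",
                "Were there any unintended effects on problem-solving confidence?",
            ],
        )
    ],
)

for outcome in plan.outcomes:
    print(outcome.statement)
    for q in outcome.questions:
        print("  -", q)
```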

- To what extent has the project been implemented as planned?
- How well has the project been co-ordinated across different institutions?
- To what extent have the intended outcomes been achieved?
- Were there any unintended outcomes?
- How well have the needs of staff been met?
- To what extent have the intended student learning outcomes been achieved?
- What measures, if any, have been put in place to promote sustainability of the project's focus and outcomes?
- Are students better able to describe the effects of changing some feature in a simple problem as a result of the intervention?
- What lessons have been learned from this project, and how might these be of assistance to other institutions?

- What type of data is most appropriate?
- What are the most appropriate methods of data collection?
- How will the data be analysed and presented?
- What ethical issues are involved in the evaluation?

- Surveys – forced-choice or open-ended responses
- Observations – actually monitor and evaluate behavior
- Interviews – structured (fixed questions) or in-depth (free flowing)
- Concept inventories – multiple-choice questions to measure conceptual understanding (see the scoring sketch after this list)
- Rubrics for analyzing student products – guides for scoring student reports, tests, etc.
- Focus groups – similar to interviews but with group interaction
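As a concrete illustration of the concept-inventory approach, pre- and post-test scores are often summarised as a normalized gain, (post − pre) / (100 − pre). A minimal sketch, assuming scores are recorded as percentages; the student data below is made up for the example.

```python
from statistics import mean

def normalized_gain(pre: float, post: float) -> float:
    """Hake-style normalized gain for scores given as percentages (0-100)."""
    if pre >= 100:
        return 0.0
    return (post - pre) / (100 - pre)

# Hypothetical pre/post concept-inventory scores (percent correct) for a small cohort
scores = [
    {"student": "s01", "pre": 40, "post": 70},
    {"student": "s02", "pre": 55, "post": 75},
    {"student": "s03", "pre": 30, "post": 45},
    {"student": "s04", "pre": 60, "post": 80},
]

gains = [normalized_gain(s["pre"], s["post"]) for s in scores]
print(f"n = {len(gains)}, mean normalized gain = {mean(gains):.2f}")
```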

An Intelligent Tutoring System for Engineering Mechanics (Tanja Mitrovic, Charles Fleischmann, Brent Martin, Pramudi Suraweera)
The planned evaluation will focus on the following questions:
- Is the developed ITS effective?
- Does it support learning better than the traditional approach?
- Does the system increase students' motivation?
We will compare the performance of two groups of students. One group will learn the material in the traditional way, via lectures and tutorials. The other group will attend lectures, but the tutorials will be replaced by interaction with the system. Time will be controlled. We will require students to sit pre- and post-tests to measure their knowledge. We will also collect data about their actions while solving problems, and analyze the data.
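The comparison sketched above (traditional tutorials versus the ITS, with pre- and post-tests) would typically be analysed by comparing the two groups' score gains. A minimal illustration, assuming the gains have already been computed; the numbers are invented, and a real study would also check test assumptions and report effect sizes.

```python
from statistics import mean
from scipy import stats

# Hypothetical post-minus-pre score gains (percentage points) for each group
traditional_gains = [5, 12, 8, 15, 7, 10, 9, 11, 6, 13]
its_gains = [14, 18, 9, 20, 16, 12, 17, 15, 19, 11]

# Welch's t-test: does the ITS group's mean gain differ from the traditional group's?
result = stats.ttest_ind(its_gains, traditional_gains, equal_var=False)

print(f"mean gain (traditional) = {mean(traditional_gains):.1f}")
print(f"mean gain (ITS)         = {mean(its_gains):.1f}")
print(f"Welch t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```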

- Judgements will be required for each key evaluation question
- Need to develop indicators or targets – standards and levels regarded as acceptable
- Could use benchmarks – a location or source of best practice for comparison
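One way to make indicators and targets operational is to record, for each indicator, the measured value alongside the level agreed as acceptable and report whether it was met. The indicator names and values below are invented for the sketch; real targets would come from the evaluation brief or from benchmark sources.

```python
# Hypothetical indicators for one evaluation question: (measured value, agreed target)
indicators = {
    "mean normalized gain on concept test": (0.31, 0.25),
    "proportion of students meeting lab rubric standard": (0.68, 0.75),
    "staff reporting their needs were met (survey, 1-5)": (4.1, 4.0),
}

for name, (measured, target) in indicators.items():
    status = "met" if measured >= target else "not met"
    print(f"{name}: measured {measured} vs target {target} -> {status}")
```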

Who? – a key question underpinning the budget
- Should it be an individual or a team?
- Insiders or outsiders?
- Skills – qualitative and/or quantitative?
- Prepare an evaluation brief or terms of reference
- When to involve the evaluator?

- NSF's User Friendly Handbook for Project Evaluation
- Australian Learning and Teaching Council Grants Scheme – Evaluating Projects
- Student Assessment of Their Learning Gains (SALG)

- Think about evaluation, its purpose and scope, from the beginning of the project
- Ensure appropriate data is collected, e.g. pre/post intervention
- Evaluation provides supporting evidence to influence decision-making or policy formulation through the provision of empirically driven feedback → sustainability

Elizabeth Godfrey, Engineering Education Research and Project Management