Planning Engineering Education Research
Facilitator: Matthew W. Ohland
www.ces.clemson.edu/ge/

Workshop Objective
- Describe steps in the planning and assessment of engineering education research
- Give examples of each step
- Small-group discussion to implement the step using a common example

Workshop Agenda
4:00-4:05  Introduction
4:05-4:20  Defining the Purpose of the Evaluation (What questions are we asking?)
4:20-4:40  Clarify Project Objectives (What do we expect?)
4:40-5:00  Create a Model of Change (How will our efforts lead to the objectives?)
5:00-5:15  Select Criteria and Indicators (What data do we need to measure our progress?)
5:15-5:25  Identify Data Sources (Where will we get that data?)
5:25-5:45  Design Evaluation Research (How will we analyze that data?)
5:45-5:50  Monitor and Evaluate (What actually happened?) / Use and Report Results (Who needs to know?)
5:50-6:00  Wrap-up and evaluations

Step 1: Purpose of the Evaluation
- Formative
- Summative
- Efficacy
- Effectiveness
- Cost
Examples:
- Is there a benefit to doing this?
- Can we improve it?
- Does that benefit lead to better retention / grades?
- Do the benefits justify the program cost?
- Can we achieve the same benefits on a larger scale?

Step 1 Action Steps
- Decide purpose(s)
- Primary questions? Order?
- Primary audience(s)?
- How will the results be used?

Step 1: Purpose of the Evaluation
- Do we have a learning community?
- Is it leading to better course retention / grades?
- Can we achieve the same benefits cost-effectively on a larger scale?

Step 2: Project Objectives
- Impact
- Outcome
- Process
Examples:
- Improve graduation rate by some %
- Increase course passing rate by some %
- Student or faculty opinions / behaviors change
- Reserve any facilities needed

Step 2 Action Steps
- Write down project objectives:
  - Impact
  - Outcome
  - Process

Step 2: Project Objectives
- Increase graduation rate from 50% to 75%
- Increase first-time Calculus passing rate from 50% to 70%
- Students have positive opinion of group work
- Secure dormitory and classroom space

Step 3: Create a Model of Change
- Identify assumptions you can assess
- Choose relationships to test based on:
  - Resources
  - Where you anticipate problems
  - Where you have control / can make improvements
- Link what you do to what you expect to happen
Example: Common residence / classes → (in between, we need a sound theory of why this will happen) → more will graduate

Step 3 Action Steps
- Make the model of change as specific / complete as needed
- Review model assumptions
- Use criteria to prioritize:
  - Resources
  - Relevance
  - Control

Step 3: Create a Model of Change
- Common residence / classes → affiliation
- Affiliation → support
- Support → performance
- Performance → future performance
- Future performance → more will graduate
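To make Step 3's prioritization concrete, the chain above can be written down as data and ranked by the workshop's criteria (resources, relevance, control). This is a minimal Python sketch, not part of the original workshop; the 1-5 ratings are invented placeholders a project team would replace with its own scores.

```python
# Minimal sketch: the model of change above written as testable links,
# ranked by the Step 3 criteria. The 1-5 ratings for resources,
# relevance, and control are invented placeholders, not workshop data.

links = [
    # (cause, effect, resources, relevance, control)
    ("common residence / classes", "affiliation",        4, 3, 5),
    ("affiliation",                "support",            2, 4, 2),
    ("support",                    "performance",        3, 5, 2),
    ("performance",                "future performance", 1, 4, 1),
    ("future performance",         "more will graduate", 1, 5, 1),
]

# Test first the assumptions we can afford to study, that matter most,
# and that we can actually influence.
for cause, effect, res, rel, ctl in sorted(links, key=lambda l: -(l[2] + l[3] + l[4])):
    print(f"priority {res + rel + ctl:2d}: {cause} -> {effect}")
```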

Step 4: Criteria and Indicators
- Validity
- Reliability
- Sensitivity
- Ease of interpretation
- Usefulness
Example: define
- Graduation rate
- Course pass rate
- Affiliation
- Life-long learning
- Enrollment

Step 4 Action Steps
- Define a set of indicators and criteria:
  - Impact
  - Outcome
  - Process

Step 4: Criteria and Indicators
- Graduation rate: # graduated / # in original cohort
- Calculus pass rate: # A, B, C / # in original cohort (D's are no good; the course must be retaken)
- Affiliation: # in study group at end of sophomore year / # students
- Number of students enrolled in program
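As an illustration (not from the slides), these indicator definitions translate directly into computation over cohort records. The record fields below ("grade", "graduated", "in_study_group") are hypothetical stand-ins for whatever the registrar actually provides.

```python
# Minimal sketch: computing the Step 4 indicators from cohort records.
# The record layout is a hypothetical stand-in for real registrar data.

cohort = [
    {"grade": "A", "graduated": True,  "in_study_group": True},
    {"grade": "D", "graduated": False, "in_study_group": False},
    {"grade": "B", "graduated": True,  "in_study_group": True},
    {"grade": "C", "graduated": False, "in_study_group": False},
]

n = len(cohort)  # size of the original cohort: the denominator throughout

graduation_rate = sum(s["graduated"] for s in cohort) / n

# D's don't count as passing: the course must be retaken.
calculus_pass_rate = sum(s["grade"] in ("A", "B", "C") for s in cohort) / n

affiliation = sum(s["in_study_group"] for s in cohort) / n

print(f"graduation rate:    {graduation_rate:.0%}")
print(f"calculus pass rate: {calculus_pass_rate:.0%}")
print(f"affiliation:        {affiliation:.0%}")
```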

Step 5: Data Sources
- Exams
- Surveys
- Observations
- Student records
- SAT scores
Frequency of measurement depends on: resources, guidance needed, expected rate of change
Example:
- Institutional Research office (annual)
- Course records (each semester)
- Registrar
- Admissions
- Assessment office
- Process objectives: monitor until achieved

Step 5 Action Steps
- Define data sources
- Define frequency of measurement

Step 5: Data Sources
- Grad / retention rate – Inst. Res. – annual
- Calculus performance – Math dept. (includes course grades and common exam results) – end of semester
- Dorm space allocated – Housing – monitor until achieved
- Survey of program participants
- Observer evaluations of class interaction
- SAT scores – Admissions

Step 6: Design Evaluation Research
Threats to validity: selection, mortality (placebo is not an issue)
Qualitative:
- Interviews
- Focus groups
- Systematic observation
Quantitative:
- Non-experimental: posttest only; pretest-posttest
- Quasi-experimental: time series; nonequivalent control
- Experimental: pretest-posttest control; multiple intervention
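For concreteness, here is a minimal sketch, not from the workshop, of the arithmetic behind one quasi-experimental option above: a pretest-posttest design with a nonequivalent control group, analyzed as a difference-in-differences. All pass-rate numbers are invented.

```python
# Minimal sketch: pretest-posttest with a nonequivalent control group,
# one of the quasi-experimental designs listed above. Numbers invented.

# Mean outcome (e.g., calculus pass rate) before and after the intervention.
treated_pre, treated_post = 0.50, 0.70   # learning-community students
control_pre, control_post = 0.52, 0.55   # comparable non-participants

# Because the groups are nonequivalent, compare *changes*, not raw posttests:
# the difference-in-differences nets out the shared baseline trend.
effect = (treated_post - treated_pre) - (control_post - control_pre)
print(f"estimated program effect: {effect:+.0%}")   # +17%
```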

Step 6 Action Steps
- Design evaluation research studies for key questions

Step 6: Example Designs
Example – Interviews:
- Student expectations – invitation process modified
- Resource utilization – approaches to motivate attendance
Example – Efficacy:
- Non-equivalent control group
- Test for selection bias using baseline measures
- Matched pairs / groups
- Calculus grades and overall GPR
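The selection-bias check mentioned above can be as simple as comparing baseline measures between participants and the nonequivalent comparison group before attributing outcome differences to the program. A minimal sketch, with invented SAT scores and assuming SciPy is available:

```python
# Minimal sketch: testing for selection bias with baseline measures,
# as suggested above for the non-equivalent control group design.
# SAT scores are invented; real ones would come from Admissions.
from statistics import mean
from scipy.stats import ttest_ind  # two-sample t-test

participants = [1180, 1250, 1100, 1210, 1190, 1230]
comparison   = [1160, 1240, 1120, 1200, 1150, 1220]

t, p = ttest_ind(participants, comparison, equal_var=False)  # Welch's t-test
print(f"baseline SAT means: {mean(participants):.0f} vs {mean(comparison):.0f}")
print(f"t = {t:.2f}, p = {p:.2f}")  # a small p would signal selection bias
```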

Step 7: Monitor and Evaluate
- Establish project information system
- Budget for evaluation
- Evaluation meetings
- Review and revise evaluation plan
- Carry out studies

Step 8: Use and Report Results
- Report results – to everyone
- Use results to make improvements

Conclusions
- Planning and assessment are essential
- Start small if resources are limited
- Develop a plan before starting
- NSF and other agencies support well-designed educational research (CCLI–EMD / A&I, ASA, and other programs)
- Seek appropriate partners from education, psychology, sociology, statistics, etc.