
DATA TRACKING AND EVALUATION

Goal of the STEP program
To increase the number of STEM graduates within the five-year period of the grant.
– You have to keep this goal in mind as you design your project and evaluation activities; this may mean changing your way of thinking about project evaluation.

For example: REUs
The goal of the NSF REU program is to increase the number and quality of STEM undergraduates who pursue advanced degrees in STEM.
– This is NOT the goal of STEP.
Your project activities should focus REUs on students not fully committed to STEM undergraduate degrees, or students who may leave STEM programs.
– NOT on students who will pursue a STEM degree “no matter what.”

Evaluation – REUs
Evaluation should focus on demonstrating that you have convinced students to stay in STEM or attracted new students to STEM:
– Pre/post-surveys
– Tracking (with a comparison cohort; see the sketch after this list)
– Focus groups
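
A minimal sketch of the cohort-tracking idea above, assuming a hypothetical students.csv export with step_participant and enrolled_stem_next_year columns (the file layout and field names are illustrative, not from the slides):

```python
# Sketch: compare one-year STEM persistence between STEP participants
# and a comparison cohort. Column names are hypothetical; adapt them
# to your institution's records.
import csv

def persistence_rate(rows):
    """Fraction of students still enrolled in a STEM major a year later."""
    retained = sum(1 for r in rows if r["enrolled_stem_next_year"] == "yes")
    return retained / len(rows) if rows else 0.0

with open("students.csv", newline="") as f:
    students = list(csv.DictReader(f))

participants = [r for r in students if r["step_participant"] == "yes"]
comparison = [r for r in students if r["step_participant"] == "no"]

print(f"STEP cohort persistence:       {persistence_rate(participants):.1%}")
print(f"Comparison cohort persistence: {persistence_rate(comparison):.1%}")
```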

Another example: Calculus Reform
The goal of many TUES projects would be to increase student learning in calculus. The goal of STEP is that students stay in STEM because you have reformed calculus.
– Evaluation for TUES might include changes in standardized test scores.
– Evaluation for STEP would include changes in pass rates, changes in the rate at which students go on to take Calc II, persistence rates, etc. (a sketch of these metrics follows).
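
As a rough illustration of the STEP-style metrics named above, here is a sketch computing a Calc I pass rate and a Calc I-to-Calc II continuation rate from a hypothetical registrar export (column names and the passing-grade rule are assumptions):

```python
# Sketch: Calc I pass rate and the rate at which passing students
# go on to enroll in Calc II. Field names are hypothetical.
import csv

with open("calc1_roster.csv", newline="") as f:
    roster = list(csv.DictReader(f))

# Assumed convention: A/B/C counts as passing.
passed = [r for r in roster if r["calc1_grade"] in {"A", "B", "C"}]
took_calc2 = [r for r in passed if r["enrolled_calc2"] == "yes"]

pass_rate = len(passed) / len(roster) if roster else 0.0
continuation_rate = len(took_calc2) / len(passed) if passed else 0.0

print(f"Calc I pass rate:               {pass_rate:.1%}")
print(f"Calc I -> Calc II continuation: {continuation_rate:.1%}")
```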

Evaluation of STEP projects
One of the most difficult challenges is to identify the “impact” of the STEP funding.
– You have to think about this creatively.
– Try to separate the impact of the project from the impact of other things that may be going on at your institution (one possible approach is sketched below).
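
One way to attempt that separation, offered as an illustration rather than an NSF-prescribed method, is a simple difference-in-differences comparison: measure the change in persistence for the group touched by STEP activities against the change in an untouched group over the same period. All numbers below are invented:

```python
# Illustrative difference-in-differences estimate of a "STEP effect".
# The persistence rates below are made up for illustration only.
before_after = {
    #                   (before grant, after grant)
    "step_group":       (0.62, 0.71),
    "comparison_group": (0.60, 0.63),
}

step_change = before_after["step_group"][1] - before_after["step_group"][0]
comp_change = before_after["comparison_group"][1] - before_after["comparison_group"][0]

# The comparison group's change proxies for institution-wide trends;
# subtracting it leaves a rough estimate of the project's own effect.
print(f"Estimated STEP effect: {step_change - comp_change:+.2f}")
```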

Data Tracking and Evaluation
For the key activity described in the previous exercise, state an expected outcome for the activity and give an example of data to be tracked.
Think, Share, Report

Data Tracking and Evaluation (cont’d)
Outcomes
– Importance of goals, outcomes, and questions in the evaluation process
– Cognitive and affective outcomes; for STEP, you want to address the impact of these outcomes on persistence
Types of evaluation tools
– Advantages, limitations, and appropriateness
Data interpretation issues
– Variability, alternate explanations

Data Tracking and Evaluation (cont’d)
Project evaluation
– Formative: monitoring progress to improve the approach
– Summative: characterizing final accomplishments

Data Tracking and Evaluation (cont’d)
Effective evaluation starts with carefully defined project goals and expected outcomes.
– Goals and expected outcomes relate to:
  ◦ Project management – initiating or completing an activity
  ◦ Student behavior – modifying an attitude or a perception
    » In the case of STEP, this means persistence and graduation

Data Tracking and Evaluation (cont’d)
Goals → Expected outcomes
Expected outcomes → Evaluation questions
Questions form the basis of the evaluation process. The evaluation process collects and interprets data to answer the evaluation questions.

Data Tracking and Evaluation (cont’d)
Write a question for the expected outcome from the previous exercise.
– For example: Did the survey show a change in the students’ attitude about …?
Think, Share, Report

Tools for Evaluating Student Outcomes
• Surveys
  ◦ Forced-choice or open-ended responses
• Concept inventories
  ◦ Multiple-choice questions to measure conceptual understanding (a scoring sketch follows below)
• Rubrics for analyzing student work products
  ◦ Guides for scoring student reports, tests, etc.
• Interviews
  ◦ Structured (fixed questions) or in-depth (free-flowing)
• Focus groups
  ◦ Like interviews, but with group interaction
• Observations
  ◦ Directly monitor and evaluate behavior
Olds et al., JEE 94:13, 2005; User-Friendly Handbook for Project Evaluation, NSF, 2002
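
For concept inventories in particular, pre/post scores are often summarized with the normalized gain g = (post − pre) / (100 − pre), a widely used convention following Hake (1998); this is an illustration, not a STEP requirement, and the scores below are invented:

```python
# Sketch: mean normalized gain over pre/post concept-inventory scores.
def normalized_gain(pre: float, post: float) -> float:
    """Normalized gain for percentage scores (0-100), per Hake (1998)."""
    if pre >= 100:
        return 0.0
    return (post - pre) / (100 - pre)

pairs = [(45, 70), (60, 75), (30, 55)]  # (pre %, post %) per student, invented
gains = [normalized_gain(pre, post) for pre, post in pairs]
print(f"Mean normalized gain: {sum(gains) / len(gains):.2f}")
```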

Example: Interviews
Use interviews to answer questions such as:
• What does the program look and feel like?
• What do stakeholders know about the project?
• What are stakeholders’ and participants’ expectations?
• What features are most salient?
• What changes do participants perceive in themselves?
• For STEP: ultimately, did your activity lead to increased enrollment or persistence?
User-Friendly Handbook for Project Evaluation, NSF, 2002

Choosing a Tool
– Relevance and design of the tool
– Prior testing and validation of the tool
– Experience of others with the tool

Learn: Summary of Best Practices in STEP Data Tracking and Evaluation
– Key roles for institutional research and IT services
– Roles/responsibilities of external evaluators and/or evaluation expert(s)

Data Tracking and Evaluation (cont’d)
Measuring the “STEP effect”
– Tools and databases to support the evaluation process
– Intermediate metrics and effects
– Disaggregated data (see the sketch after this list)
– Corrective action
– Institutional impact
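
A minimal sketch of disaggregation, assuming the same hypothetical students.csv as earlier with demographic_group and persisted_in_stem columns, so that gains for one population do not mask losses for another:

```python
# Sketch: persistence rate broken out by subgroup. Field names are
# hypothetical; use whatever grouping your evaluation plan specifies.
import csv
from collections import defaultdict

groups = defaultdict(list)
with open("students.csv", newline="") as f:
    for row in csv.DictReader(f):
        groups[row["demographic_group"]].append(row)

for name, rows in sorted(groups.items()):
    retained = sum(1 for r in rows if r["persisted_in_stem"] == "yes")
    print(f"{name}: {retained / len(rows):.1%} persistence (n={len(rows)})")
```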

Data Tracking and Evaluation (cont’d)
Recall that expected outcomes are typically related to:
– Project management
– Student behavior
Types of data:
– About the project
– About students
– Of special interest to NSF

Data Tracking and Evaluation (cont’d)
Types of data – about the project (the “STEP effect”):
– Not all metrics may be known in advance.
– Are the data relevant and meaningful?
– Do the data inform next steps?

Data Tracking and Evaluation (cont’d)
Types of data (cont’d) – about students:
– Definitions of metrics should be clear and consistent. Some may be problematic (e.g., “majors” at a community college).
– Some data are difficult to track (e.g., transfer student data may be incomplete, or FERPA rules may limit sharing across institutions).
– Surveys help to confirm other data and observations.
(A sketch of pinning down one metric definition in code follows.)
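
One way to keep a metric definition clear and consistent is to encode it once and reuse it in every report. A sketch, with hypothetical inclusion rules; the CIP code families listed are examples, not NSF requirements:

```python
# Sketch: a single shared definition of "STEM major" so every report
# counts students the same way. Rules below are illustrative.
STEM_CIP_PREFIXES = ("11", "14", "26", "27", "40")  # example CIP families

def is_stem_major(record: dict) -> bool:
    """Counts only declared majors whose CIP code falls in the listed
    families; undeclared and pre-major students are excluded."""
    return (record.get("major_declared") == "yes"
            and str(record.get("cip_code", "")).startswith(STEM_CIP_PREFIXES))
```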

Data Tracking and Evaluation (cont’d)
Types of data (cont’d) – of special interest to NSF:
– Talk to your NSF program officer.
– Highlights, stories, successes, and shortcomings
“People change because of the story.”
– Karan Watson, “Can we accelerate the rate of change in engineering education,” Main Plenary, 2010 ASEE Annual Conference, June 21, 2010

Data Tracking and Evaluation (cont’d)
Your questions?