Looking for Results: Principles for Evaluating Student Success Initiatives
Presenter: Rick Voorhees


Goals for Today's Work Together
You'll be able to:
– Identify key questions for evaluating interventions
– Distinguish between different types of evaluation
– Make the link between evaluation and Continuous Quality Improvement
– Identify four components of evaluation
– Visualize how a logic model can be a powerful tool for understanding interventions

The Big Question
What types of learners change in what evident ways, with which influences and resources?

Chances Are...
Everything else equal, what are the chances (probability) that:
– Males and females progress and graduate at the same rate?
– Racial and ethnic groups progress and graduate at the same rate?
– Financial aid recipients and non-recipients progress and graduate at the same rate?
– Students referred to developmental education and those who aren't progress and graduate at the same rate?
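One way to put numbers on these chances is to ask whether graduation differs across groups more than chance alone would predict. The sketch below is a minimal illustration, not part of the presentation: the DataFrame and column names (gender, graduated) are hypothetical stand-ins for an institution's cohort file.

```python
# Minimal sketch: does graduation vary by group more than chance would predict?
# The cohort data and column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

cohort = pd.DataFrame({
    "gender":    ["F", "F", "M", "M", "F", "M", "F", "M", "F", "M"],
    "graduated": [1,   0,   0,   1,   1,   0,   1,   0,   1,   0],
})

# Cross-tabulate group membership against graduation.
table = pd.crosstab(cohort["gender"], cohort["graduated"])

# Chi-square test of independence: a small p-value suggests the groups
# do not progress and graduate at the same rate.
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```

The same comparison can be repeated for any of the group pairings above, using real (and much larger) cohort counts.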

Formative Program Evaluation
Formative evaluation (sometimes referred to as internal) is a method for judging the worth of a program while the program activities are forming (in progress). This part of the evaluation focuses on the process. It permits faculty, staff, and students to monitor how well goals are being met. The main purpose is to catch deficiencies so that the program can readjust while it is in progress.
Source: Performance, Learning, Leadership, & Knowledge (n.d.). Retrieved March 25, 2013.

Summative Program Evaluation
Summative evaluation (sometimes referred to as external) is a method of judging the worth of a program at the end of the program activities (summation). The focus is on the outcome.
Note: All evaluations can be summative (i.e., have the potential to serve a summative function), but only some have the additional capability of serving formative functions (Scriven, 1967).
Source: Performance, Learning, Leadership, & Knowledge (n.d.). Retrieved March 25, 2013.

The Basics of Continuous Quality Improvement
The Plan-Do-Check-Act (PDCA) cycle: Plan, Do, Check, Act, repeated continuously.
Graphic source: ww.anzca.edu.au/fpm/resources/educational-documents/guidelines-on-continuous-quality-improvement.html

Four Components of a Culture of Inquiry
Component One, "What's Wrong": Use disaggregated longitudinal cohort data to determine (1) which student groups are less successful than others (i.e., identify gaps in student success) and (2) which high-enrollment courses have the lowest success rates.
Component Two, "Why": Collect, analyze, and use data from other sources (focus groups, surveys, literature reviews) to identify the underlying factors (barriers or challenges) impeding student success.
Component Three, "Intervention": Use data from Component Two to design new interventions, or revise current ones, to effectively address the underlying factors impeding student success. Review and consider changes to existing practices and policies that affect those factors.
Component Four, "Evaluation and Modification": Collect, analyze, and use evaluation data to answer (1) To what extent did the intervention (including policy changes) effectively address the underlying factors? and (2) To what extent did the interventions increase student success?
Source: K.P. Gonzalez. Using Data to Increase Student Success: A Focus on Diagnosis. Retrieved March 12, 2013.
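Component One lends itself to straightforward disaggregation. The sketch below is illustrative only; the record layout and column names (student_group, course, succeeded) are assumptions standing in for a real course-level file, and in practice you would first filter to high-enrollment courses.

```python
# Minimal sketch of Component One: disaggregate success rates by student group
# and rank courses by success rate. Data and column names are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "student_group": ["A", "A", "A", "B", "B", "B", "A", "B"],
    "course":        ["MATH-090", "ENGL-101", "MATH-090", "MATH-090",
                      "ENGL-101", "MATH-090", "ENGL-101", "ENGL-101"],
    "succeeded":     [1, 1, 0, 0, 1, 0, 1, 1],
})

# (1) Which student groups are less successful than others?
group_rates = records.groupby("student_group")["succeeded"].mean().sort_values()
print("Success rate by group:\n", group_rates)

# (2) Which courses have the lowest success rates (and how large are they)?
course_summary = records.groupby("course").agg(
    enrollment=("succeeded", "size"),
    success_rate=("succeeded", "mean"),
)
print(course_summary.sort_values(["success_rate", "enrollment"],
                                 ascending=[True, False]))
```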

True Experimental Design: Pretest-Posttest Control Group Design
Students are randomly assigned to Group 1, which receives the intervention, or Group 2, which does not. Both groups take a post-test measurement; with random assignment, differences between the groups can be read as true differences attributable to the intervention.
Source: Campbell, D.T. & Stanley, J. (1963). Experimental and Quasi-Experimental Designs for Research. Wadsworth Publishing.
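As a loose illustration of how such a randomized design might be analyzed, the sketch below simulates random assignment and compares post-test scores between the two groups. All values are simulated and none of this comes from the presentation.

```python
# Minimal sketch: random assignment with a post-test comparison.
# All data are simulated for illustration.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n = 200

# Random assignment to intervention (1) or control (0).
assignment = rng.integers(0, 2, size=n)

# Simulated pretest and post-test scores with a modest intervention effect.
pretest = rng.normal(70, 10, size=n)
posttest = pretest + rng.normal(2, 5, size=n) + 3 * assignment

# With random assignment, a simple comparison of post-test means estimates
# the "true difference" attributable to the intervention.
t_stat, p_value = ttest_ind(posttest[assignment == 1], posttest[assignment == 0])
print(f"intervention mean = {posttest[assignment == 1].mean():.1f}, "
      f"control mean = {posttest[assignment == 0].mean():.1f}, p = {p_value:.3f}")
```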

Quasi-Experimental Design: Posttest-Only Control Group Design
Group 1 receives the intervention and Group 2 serves as a comparison group without random assignment; the comparison group could be a "historical cohort." Both groups take a post-test measurement, and the difference between them is used to estimate the intervention's effect.
Source: Campbell, D.T. & Stanley, J. (1963). Experimental and Quasi-Experimental Designs for Research. Wadsworth Publishing.
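A posttest-only comparison against a historical cohort might be summarized as in the hedged sketch below. The counts are invented, and because there is no random assignment, any difference could also reflect other changes between the two cohorts.

```python
# Minimal sketch: posttest-only comparison against a historical cohort.
# Counts are invented; without random assignment, differences may reflect
# other changes between cohorts rather than the intervention alone.
from scipy.stats import chi2_contingency

# [persisted, did_not_persist] for each cohort
current_cohort    = [180, 70]   # received the intervention
historical_cohort = [150, 100]  # prior-year cohort, no intervention

chi2, p_value, dof, expected = chi2_contingency([current_cohort, historical_cohort])
rate_now  = current_cohort[0] / sum(current_cohort)
rate_then = historical_cohort[0] / sum(historical_cohort)
print(f"persistence now = {rate_now:.1%}, historical = {rate_then:.1%}, p = {p_value:.3f}")
```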

Percentage of Students Persisting by Enrollment in a Student Success Class
[Chart]
Source: Voorhees & Lee (n.d.). Basics of Longitudinal Cohort Analysis. Retrieved April 15, 2012 at /default/files/resources/ATD_Longitudinal-Cohort-Analysis.pdf

Cohort levels: macro-level cohort, micro-level cohort, SSBTN template, intervention level.

Kirkpatrick's Four-Level Evaluation Model
Step 1: Reaction – How well did the learners like the learning process?
Step 2: Learning – What did they learn? (the extent to which the learners gain knowledge and skills)
Step 3: Behavior – What changes in performance resulted from the learning process? (capability to perform the newly learned skills while on the job)
Step 4: Results – What are the tangible results of the learning process in reduced cost, improved quality, increased production, efficiency, etc.?
Kirkpatrick, D. (1998). Evaluating Training Programs: The Four Levels (Second Edition). San Francisco: Berrett-Koehler Publishers, Inc.
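Purely as an organizing aid, and not something drawn from Kirkpatrick or the presentation, the four levels can be captured as a small data structure for tagging each piece of evaluation evidence. The enum and sample records below are hypothetical.

```python
# Illustrative sketch: tag evaluation evidence by Kirkpatrick level.
# The enum and the sample records are hypothetical organizing devices.
from enum import Enum

class KirkpatrickLevel(Enum):
    REACTION = 1   # How well did learners like the learning process?
    LEARNING = 2   # What knowledge and skills did they gain?
    BEHAVIOR = 3   # What changes in performance resulted?
    RESULTS  = 4   # What tangible results followed (cost, quality, efficiency)?

evidence = [
    {"item": "End-of-workshop satisfaction survey", "level": KirkpatrickLevel.REACTION},
    {"item": "Pre/post skills assessment",          "level": KirkpatrickLevel.LEARNING},
    {"item": "Advising notes on study habits",      "level": KirkpatrickLevel.BEHAVIOR},
    {"item": "Term-to-term persistence rate",       "level": KirkpatrickLevel.RESULTS},
]

for record in evidence:
    print(f"Level {record['level'].value} ({record['level'].name}): {record['item']}")
```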

Cause or Correlated?
Attribution of cause and effect is difficult when working with a highly complex institution with multiple programs and initiatives, as well as with students from a wide range of backgrounds and current environmental influences.

Sources of Evaluative Data
– Administrative data systems
– Focus groups
– Faculty journaling
– Student journaling
– External surveys
– Institutionally developed surveys
– Interactions with college services
– Matching external databases
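Several of these sources only become evaluative once they are matched to the same students. The sketch below shows one way administrative records might be joined to an external survey; the IDs and column names are invented for illustration and are not part of the presentation.

```python
# Minimal sketch: match administrative records to an external survey by student ID.
# IDs and column names are hypothetical.
import pandas as pd

admin = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "gpa":        [3.2, 2.1, 3.8, 2.9],
})
survey = pd.DataFrame({
    "student_id": [102, 103, 104, 105],
    "engagement": [4, 5, 3, 2],   # self-reported, 1-5 scale
})

# A left join keeps every student in the administrative system and flags
# those who never answered the external survey (NaN engagement).
matched = admin.merge(survey, on="student_id", how="left")
print(matched)
```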

Academic Terms, Administrative Data Systems, and What We Know About Students
The academic term runs from the census date to the end of term. Most of what we do know about students comes from the administrative snapshots taken at those two points; most of what we don't know about students happens in between, and that stretch of the term is where we could do so very much better.

Logic Modeling: Informing Planning and Evaluation
Assumptions underpin the whole model. The planned work consists of resources and inputs plus activities; the intended results are outputs, outcomes, and impact.

Logic Model Elements
– Assumptions: the underlying assumptions that influence the program's design, implementation, or goals.
– Resources: human, financial, and organizational resources needed to achieve the program's objectives.
– Activities: things the program does with the resources to meet its objectives.
– Outputs: direct products of the program's activities; evidence that the program was actually implemented.
– Outcomes: changes in participants' knowledge, behavior, skills, status, and level of functioning as a result of the program.
– Impact: systemic, long-term change as a result of the program (over as long as 7–10 years).
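To make the element definitions concrete, a logic model can be held as a simple data structure. The sketch below is only an illustration; the example entries for a student success course are invented and do not come from the presentation.

```python
# Illustrative sketch: a logic model as a plain data structure.
# The example entries for a student success course are invented.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    assumptions: list = field(default_factory=list)
    resources:   list = field(default_factory=list)
    activities:  list = field(default_factory=list)
    outputs:     list = field(default_factory=list)
    outcomes:    list = field(default_factory=list)
    impact:      list = field(default_factory=list)

success_course = LogicModel(
    assumptions=["First-term students benefit from structured study skills"],
    resources=["Two advisors", "Course release time", "Classroom space"],
    activities=["Teach a one-credit student success course each fall"],
    outputs=["Sections offered", "Number of students completing the course"],
    outcomes=["Higher term-to-term persistence among completers"],
    impact=["Sustained gains in graduation rates"],
)

for element, items in vars(success_course).items():
    print(f"{element}: {items}")
```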

Why Use a Logic Model?
1. Program Design and Planning: serves as a planning tool to develop program strategy and enhances the ability to clearly explain and illustrate program concepts and approach to all college stakeholders.
2. Program Implementation: forms the core for a focused management plan that helps identify and collect the data needed to monitor and improve programming.
3. Program Evaluation and Strategic Reporting: presents program information and progress toward goals in ways that inform, advocate for a particular program approach, and teach program stakeholders.

Logic Models Aren't Automatically Linear
Inputs (program investments: what we invest) lead to outputs (activities and participation: what we do and who we reach), which lead to outcomes (short-term, medium-term, and long-term: what results), but the connections rarely run in a straight line.

Logic Model Worksheet
Your beginnings: assumptions and inputs. Your work: activities and outputs. Your results: outcomes and impact.