
How to Evaluate a Basic Skills Program
Bridging Research, Information and Culture
An Initiative of the Research and Planning Group for California Community Colleges
Your Name | Your Institution | Date

Overview

Outcomes
1. Describe the important elements of a program evaluation model, including different forms of evaluation and measurement
2. Identify and describe different methodologies and measures that can be used with common intervention strategies in basic skills programs

Program Evaluation
Evaluation is "the systematic collection of information to make judgments, improve program effectiveness and/or generate knowledge to inform decisions about future programs." (Patton, 1997)

Importance of Evaluation
- Demonstrates whether the program/project is having the desired impact on students
- Identifies what is working and what needs improvement
- Measures the effect of any changes made within the program/project
- Enables ongoing internal and external sharing/reporting of evaluative results
- Helps justify continued support and funding

Striking a Balance between Reality and Rigor
- First identify data already being collected
- Data collection should not place an undue burden on the program/project
- Use direct measures whenever possible and reasonable
- Ensure that the data being collected actually measure what you intended to assess
- Requires conversation between program/project leaders and the researcher to achieve a suitable balance

Overview of Common Research Designs

Pre/Post-Test Research Design
Best way to measure improvement over time.
  O  X  O
O = Observation (Pre-test)
X = Treatment (Classroom Intervention)
O = Observation (Post-test)
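The O X O design reduces to a simple paired computation: measure each student before and after the treatment, then average the per-student change. A minimal stdlib sketch with hypothetical scores:

```python
from statistics import mean

def mean_gain(pre, post):
    """Average per-student improvement between paired pre- and post-tests (O X O)."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical pre-test and post-test scores for the same four students.
pre_scores = [55, 60, 48, 70]
post_scores = [68, 72, 61, 75]
print(mean_gain(pre_scores, post_scores))  # 10.75 points of average improvement
```

Pairing matters: the lists must line up student-by-student, which is why the design tracks the same individuals across both observations.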

Group Comparison Research Design
Best way to compare treated and untreated groups.
Group 1:  O  X  O
Group 2:  O     O
O = Observation (Pre-test)
X = Treatment (Classroom Intervention)
O = Observation (Post-test)
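With a comparison group, the estimate of the treatment's effect is the treated group's average gain minus the untreated group's average gain (a difference-in-differences style calculation). A sketch with hypothetical data:

```python
from statistics import mean

def gain(pre, post):
    """Average paired pre/post improvement for one group."""
    return mean(b - a for a, b in zip(pre, post))

def treatment_effect(t_pre, t_post, c_pre, c_post):
    """Treated group's average gain minus the comparison group's average gain."""
    return gain(t_pre, t_post) - gain(c_pre, c_post)

# Hypothetical scores: Group 1 received the intervention, Group 2 did not.
effect = treatment_effect([50, 62], [70, 80], [52, 60], [58, 64])
print(effect)  # 14: treated students improved 14 points more on average
```

Subtracting the comparison group's gain helps isolate the intervention from improvement that would have happened anyway, which is the point of including the untreated group.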

Surveys
- Best way to measure students' attitudes, beliefs, and/or perceptions
- Can be used to enhance quantitative data (helps get at the HOW to resolve a problem)
- Can be pre/post-test or post-test only
- Can be group comparisons

Qualitative Methods
- Often ask the questions of "how" and "why" instead of "what"
- Focus on the details; more holistic
- Look at the quality of relationships, activities, experiences, situations, or materials
Types of methods:
- Participant observation
- Direct observation
- Interviews
- Focus groups
- Case studies

How to Do Evaluation
What Data to Collect and Analyze to Demonstrate Program Effectiveness

How to Do Evaluation
- Specific Intervention
- Combined Effect
- Overall Program
- Professional Development

Specific Intervention: Questions
- What would you want students who receive the intervention to gain?
- How would you know they achieved these desired outcomes?
- How do you think you could measure these desired outcomes?
- What else do you want to know about students' experience with tutoring, and how can you obtain this information?

Specific Intervention: Participation/Usage 1
Compare the number of students and/or interactions before and after increased resources in like semesters to demonstrate that the increased resources resulted in increased usage.
Example: Compare usage figures from Fall 2008 to Fall 2009 and Spring 2009 to Spring 2010.

Specific Intervention: Participation/Usage 2
Track students' participation in various activities to demonstrate what proportion of freshmen are participating.
Example: Examine participation numbers to identify high- and low-participation activities, determine in what combinations students are participating, and track this information over time to determine whether participation is changing.

Specific Intervention: Group Comparison 1
Compare success rates of students who received the intervention to students who did not, to demonstrate that the intervention helped students achieve greater success.
Example: Compare success rates of students in Math 70 who received tutoring to students in the same sections who did not.
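A course success rate is simply the share of students earning a passing grade. A minimal sketch of the tutored vs. non-tutored comparison, using hypothetical Math 70 grades and the common convention that A through C count as success:

```python
def success_rate(grades, passing=("A", "B", "C")):
    """Share of students earning a passing grade."""
    return sum(g in passing for g in grades) / len(grades)

# Hypothetical final grades for students in the same Math 70 sections.
tutored = ["A", "B", "C", "D", "B", "C"]
not_tutored = ["C", "D", "F", "B", "D", "F"]

print(success_rate(tutored))      # ~0.83
print(success_rate(not_tutored))  # ~0.33
```

Comparing students within the same sections, as the slide suggests, holds the instructor and course constant, which strengthens the comparison.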

Specific Intervention: Group Comparison 2
Compare persistence rates of students who received the intervention to students who did not, to demonstrate that the intervention helped students persist at a higher rate.
Example: Compare persistence rates from Math 70 to Math 80 of students who received tutoring in Math 70 to those who did not.

Specific Intervention: Group Comparison 3
Compare success rates of students by number of visits to examine whether there is a relationship between the number of visits and course success.
Example: Among students who received tutoring for Math 70, compare the success rates of those who had 1, 2-3, 4-6, 7-10, 11-15, 16-20, or more than 20 visits.
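This "dosage" analysis bins students by visit count and computes a success rate per bin. A sketch using the slide's bands and hypothetical tutoring records:

```python
from collections import defaultdict

# Visit bands from the slide; bounds are inclusive, None = no upper bound.
BANDS = [(1, 1), (2, 3), (4, 6), (7, 10), (11, 15), (16, 20), (21, None)]

def band_label(visits):
    """Return the label of the band a visit count falls into, or None for 0 visits."""
    for low, high in BANDS:
        if visits >= low and (high is None or visits <= high):
            if high is None:
                return f"{low}+"
            return f"{low}" if low == high else f"{low}-{high}"
    return None

def success_by_band(records):
    """records: (visits, passed) pairs -> {band label: success rate}."""
    totals, passes = defaultdict(int), defaultdict(int)
    for visits, passed in records:
        band = band_label(visits)
        if band is None:          # skip students who never visited
            continue
        totals[band] += 1
        passes[band] += passed
    return {band: passes[band] / totals[band] for band in totals}

# Hypothetical records: (number of tutoring visits, passed Math 70?)
data = [(1, False), (2, True), (3, False), (5, True), (5, True), (25, True)]
print(success_by_band(data))  # {'1': 0.0, '2-3': 0.5, '4-6': 1.0, '21+': 1.0}
```

A rising rate across bands suggests (but does not prove) that more tutoring is associated with greater success; the isolation cautions later in the deck apply here.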

Specific Intervention: Pre-/Post-Tests
Assess students' skills, knowledge, and/or abilities at the beginning and end of their participation in longer-term activities to demonstrate that participation resulted in improvement.
Example: Give students a math skills test at the beginning and end of Summer Bridge.
Example: Have students self-assess their knowledge and skills related to college readiness at the beginning and end of the Counseling 50 course.

Specific Intervention: Formative Evaluation 1 (Surveys, Interviews, and Focus Groups)
- Survey students at the conclusion of the interaction to assess their satisfaction and their perceptions of how helpful the intervention was
- Survey, interview, or hold focus groups with students at the end of their first and possibly second semesters to assess the longer-term impact of the intervention on their experience
- Include questions on institutional surveys to assess overall usage of and satisfaction with the intervention

Specific Intervention: Formative Evaluation 2 (Surveys, Interviews, and Focus Groups)
- Survey mentors/mentees or tutors/tutees to assess their experience with peer-based interventions
- Interview or hold focus groups with faculty/staff who lead different programs to assess the effectiveness of the intervention from their perspective
- Survey/interview faculty at the end of the semester to assess their impressions of how helpful the intervention has been to the success of students in their classes

Combined Effect: Question
What do you want to know about students' participation in multiple activities, and how can you obtain this information?
The Ideal: Multiple regression analysis is the best method to examine and compare the effects of multiple interventions because it allows for statistical isolation of variables related to the student and the interventions.
The Problem: You will probably need assistance from Institutional Research (IR) to conduct this analysis.
Alternative Analysis: A less statistically based method that can provide some insight into the effects of multiple interventions.

Combined Effect: Group Comparison
- Identify a targeted student population
- Document and track students' participation in various activities
- Examine differences in success rates based on students' participation to determine the individual and combined impacts of the interventions on student success

Example: Combined Effect
In the population of freshman students in Fall 2010, document which students participated in Summer Bridge, Peer Mentoring, and/or Freshman Seminars. Divide these students into all the possible combinations of participation in the three activities:
- None
- Only bridge
- Only mentoring
- Only seminars
- Bridge and mentoring
- Bridge and seminars
- Mentoring and seminars
- All three
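The eight groups above are just the power set of the three activities. A sketch that enumerates all 2^3 = 8 combinations and counts how many students (here, hypothetical records) fall into each:

```python
from itertools import combinations

ACTIVITIES = ("bridge", "mentoring", "seminars")

# Hypothetical participation records: student id -> set of activities attended.
participation = {
    "s1": set(),
    "s2": {"bridge"},
    "s3": {"bridge", "mentoring"},
    "s4": {"bridge", "mentoring", "seminars"},
}

def combo_counts(participation):
    """Count students in each of the 8 possible participation combinations."""
    counts = {frozenset(c): 0
              for r in range(len(ACTIVITIES) + 1)
              for c in combinations(ACTIVITIES, r)}
    for attended in participation.values():
        counts[frozenset(attended)] += 1
    return counts

counts = combo_counts(participation)
print(len(counts))  # 8 combinations, from "none" to "all three"
```

Once students are sorted into these buckets, computing a success rate per bucket (as in the earlier group comparisons) gives the "alternative analysis" the slide describes.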

Combined Effect: Formative Evaluation
Survey or hold focus groups with participants and non-participants to assess their reasons for participating or not, and determine how the two groups differ or are the same.

Overall Program: Purposes and Uses
- These assessments are designed to provide an overall examination that helps identify possible areas in need of further investigation
- A number of assessments can be done for the entire population or for subsets of the population, such as specific disciplines, course sequences, or individual courses

Overall Program: Trend Analysis
- Examine success rates in each term at the course, discipline, or overall level to track student performance over time
- Examine persistence rates through each level in the course sequence to the transfer level to track student performance over time
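A term-by-term trend is a success rate grouped by term. A minimal sketch with hypothetical course records:

```python
from collections import defaultdict

def success_trend(records):
    """records: (term, passed) pairs -> {term: success rate}, for tracking over time."""
    totals, passes = defaultdict(int), defaultdict(int)
    for term, passed in records:
        totals[term] += 1
        passes[term] += passed
    return {term: passes[term] / totals[term] for term in totals}

# Hypothetical enrollment records: (term, passed the course?)
records = [
    ("Fall 2008", True), ("Fall 2008", False),
    ("Fall 2009", True), ("Fall 2009", True),
]
print(success_trend(records))  # {'Fall 2008': 0.5, 'Fall 2009': 1.0}
```

Running the same grouping at the course, discipline, or college level just means filtering the records before computing the rates.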

Overall Program: Longitudinal Analysis
- Track students to see what percentage persist to earn an associate degree within six years
- Track students to see what percentage persist to transfer to a four-year institution within six years
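A longitudinal completion rate is the share of an entering cohort reaching a milestone within a fixed horizon. A sketch with hypothetical time-to-degree data (the six-year horizon matches the slide):

```python
def completion_rate(years_to_outcome, horizon=6):
    """Share of a cohort reaching the outcome within the horizon (in years).

    None means the student has not (yet) reached the outcome.
    """
    completed = sum(1 for y in years_to_outcome if y is not None and y <= horizon)
    return completed / len(years_to_outcome)

# Hypothetical cohort: years from entry to associate degree (None = no degree).
years_to_degree = [2.5, 4, 5.5, None, 8, 3]
print(completion_rate(years_to_degree))  # 4 of 6 finished within six years
```

The same function works for transfer by swapping in years-to-transfer data; students still enrolled but past the horizon count as not completed, which is the standard convention for fixed-horizon cohort rates.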

Overall Program: Student Learning Outcomes (SLOs)
- Examine SLO assessment data to determine how well students are achieving course SLOs
- Examine students' performance related to program- and/or institutional-level SLOs
Note: It is important to remember context here.

Overall Program: Formative Evaluation
- Survey students in the program to assess their participation in interventions, their reasons for participating or not, and their perceptions of the helpfulness of the interventions
- Survey faculty/staff in the program to assess their perceptions of the helpfulness of the interventions to their students

Professional Development: Questions
- What would you want faculty/staff who participate in professional development to gain?
- How would you know they achieved these desired outcomes?
- How do you think you could measure these desired outcomes?
- What else do you want to know about faculty/staff participation in professional development, and how can you obtain this information?

Professional Development: Pre-/Post-Tests
For individual instructors, compare success rates in sections of the same course before and after PD participation to demonstrate that the PD helped the instructor improve student success.
Example: Compare success rates in Mr. Faculty's English 90 sections from Fall 2008 and Spring 2009 (before his participation in PD) to the rates in his English 90 sections in Fall 2009 and Spring 2010 (after his participation in PD).
Caution: While this approach addresses inter-instructor reliability, it does not account for other factors that can influence results, because the assessments occur at different times.

Professional Development: Formative Evaluation
- Survey faculty and staff at the conclusion of each individual activity to assess their satisfaction
- Survey or hold focus groups with faculty and staff some time after their PD experience to determine the applicability of what they learned and any improvements they would suggest for the PD
- Survey or hold focus groups with students in class about newly implemented strategies to assess their experience and satisfaction

Importance of Isolation
How do you know whether the differences you found are a result of the intervention?
- The concept of isolation is one of the more important factors to consider when designing a study
- It is important to isolate the effect of the intervention as much as possible
What to consider when doing:
- Group comparisons
- Pre-/Post-tests
- Surveys
- Trend analysis

Two Common Questions
1. How do we know whether we are helping D/F students improve to earn a C, or whether students would have done the same without the interventions?
Pre-/Post-Tests and Group Comparisons: Administer pre- and post-tests in class to students who are receiving the intervention and to those who are not, and compare their improvement between the pre- and post-test. This method focuses on improvement instead of final grades.
Example: Compare improvement scores between students in Math 70 who used the Tutoring Center and those who did not.

Two Common Questions
2. Are we simply attracting students to our programs who would succeed anyway?
Group Comparisons: Examine differences between participants and non-participants in:
- Demographics (e.g., age, gender, ethnicity, others?)
- Prior GPA
- Prior course success
- Placement level
Survey/Focus Groups: Survey or hold focus groups with participants and non-participants to assess their reasons for participating or not, and determine how the two groups differ and/or are comparable.

Reality Check: Other Considerations
- Consider student inputs when evaluating results
- Consider the context within which the program operates
- Consider the potentially political nature of your data
- Consider the audience for your data

The BRIC Initiative
BRIC:
The RP Group:
Contact: Rob Johnstone, Project Director; Priyadarshini Chaplot, Project Coordinator