
1 Evaluation is a Partnership
Dr. Darla M. Cooper, Associate Director, Center for Student Success
Basic Skills Coordinators Workshop, September 17, 2009

2 Overview
Importance of Evaluation
Balance between Reality and Rigor
Direct and Indirect Measures
Relationship between Faculty and Researchers
How Researchers Can Help Faculty
What Researchers Need from Faculty
Lessons Learned

3 Importance of Evaluation
Demonstrates whether the program/project is having the desired impact on students
Identifies what is working and what needs improvement
Measures the effect of any changes made within the program/project
Enables ongoing internal and external sharing/reporting of evaluative results
Helps justify continued support and funding

4 Balance between Reality and Rigor
First identify data already being collected
Data collection should not place an undue burden on the program/project
Use direct measures whenever possible and reasonable
Ensure that the data being collected actually measure what you intended to assess
Achieving a suitable balance requires conversation between program/project leaders and the researcher

5 Direct and Indirect Measures
Direct measures
Provide evidence of cognitive (knowledge) or behavioral (skills) learning that directly corresponds to specific intended learning outcomes
Examples: exams, papers, grades, portfolios
Indirect measures
Assess whether learning has been meaningful by gathering information related to perceptions, opinions, experiences, and achievements
Examples: surveys, journals, graduation rates

6 Direct and Indirect Measures

                                    Direct             Indirect
Measures actual learning            Yes                No
Gathering information               More challenging   More accessible
Objectivity                         More objective     More subjective
Reflects factual information        More facts         More beliefs
Insight into learning experience    Less likely        More likely
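To make the distinction concrete in data terms, here is a minimal sketch, not part of the original slides, tallying one direct and one indirect measure for the same course; the field names and values are hypothetical.

# Hypothetical records pairing a direct measure (final exam score) with an
# indirect measure (self-reported confidence on a 1-5 survey item).
students = [
    {"exam_score": 82, "survey_confidence": 4},
    {"exam_score": 67, "survey_confidence": 5},
    {"exam_score": 91, "survey_confidence": 3},
]

# Direct measure: evidence of actual learning (what students can do).
mean_score = sum(s["exam_score"] for s in students) / len(students)

# Indirect measure: perceptions of the learning experience (what students believe).
confident = sum(1 for s in students if s["survey_confidence"] >= 4)

print(f"Mean exam score (direct): {mean_score:.1f}")
print(f"Students reporting confidence (indirect): {confident / len(students):.0%}")

Note how the two numbers can diverge: a student can score well yet report low confidence, which is exactly why the table above treats the two kinds of measures as complementary rather than interchangeable.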

7 Relationship between Faculty and Researchers
Researchers need to understand the program/project
Program/project leaders need to understand the demands on the researcher
Develop the evaluation plan together
The researcher is seen as a member of the team
An ongoing relationship is key to the ongoing success of the evaluation

8 How Researchers Can Help Faculty
Provide options for assessment methods
Share knowledge of data already available
Facilitate accurate data interpretation
Listen
NOT dictate the data to be used
NOT advocate changing the program/project to fit the data

9 What Researchers Need from Faculty
To be invited and included in the conversation early
Details of the project/program (e.g., goals, data, interventions, intended outcomes)
Consideration of the options the researcher provides, with constructive feedback
To be kept involved and informed about ongoing progress and changes in the program/project
Collaboration to make the data meaningful and useful
Patience and understanding

10 Exercise
Common types of programs/projects:
Learning Communities
Tutoring
Supplemental Instruction
First-Year Experience

11 Exercise: Preparing to Meet with the Researcher
Have answers to the following questions:
1. Goal(s) – What effect do you intend the program/project to have?
2. Outcomes – What tangible results do you expect to see from students?
3. Intervention – What treatment(s) will students receive? Where and when are the points of contact with students?
4. Data – What data will demonstrate the intended outcomes? What data is the program/project already collecting or planning to collect?

12 Exercise: Preparing to Meet with the Researcher
Examples:
1. Goal(s) – Improve students’ success in basic skills courses through …?
2. Outcomes – Users have higher success rates than non-users; increased rate of success to transfer over time
3. Intervention – One-on-one tutoring; phone calls during the first three weeks of the term
4. Data – Course grades, test results, portfolio assessments, surveys, ?
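As a sketch of how the first example outcome above might be checked, here is a minimal comparison of course success rates for tutoring users versus non-users; the records and field names are hypothetical, not from the workshop.

# Hypothetical student records: did they use tutoring, and did they pass?
records = [
    {"student": "A", "used_tutoring": True,  "passed_course": True},
    {"student": "B", "used_tutoring": True,  "passed_course": True},
    {"student": "C", "used_tutoring": True,  "passed_course": False},
    {"student": "D", "used_tutoring": False, "passed_course": True},
    {"student": "E", "used_tutoring": False, "passed_course": False},
]

def success_rate(group):
    # Share of students in the group who passed the course.
    return sum(r["passed_course"] for r in group) / len(group)

users = [r for r in records if r["used_tutoring"]]
non_users = [r for r in records if not r["used_tutoring"]]

print(f"Tutoring users: {success_rate(users):.0%}")
print(f"Non-users:      {success_rate(non_users):.0%}")

Comparing raw rates like this is only a starting point; selection effects (students who seek tutoring may differ from those who do not) are one reason the slides stress working through the design with a researcher.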

13 Case Study
Santa Barbara City College’s (SBCC) Partnership for Student Success
(What really happens when faculty and researchers get together)

14 Lessons Learned
Start the discussion with what you want to know about the program/project
Decide on data collection BEFORE implementation, if possible
Be flexible, open, and available
Be involved, invested, and stay informed
Work together as partners
Data and research are your friends!!!

15 Questions?

16 Thank You!!