Evaluation Workshop – 10/17/2014
Presenters:
Jim Whittaker – KCP/Evaluation
Dr. Larry M. Gant – SSW Evaluation
Christiane Edwards – SSW Evaluation

Session Overview
• Goal: Continuous Improvement
• Objective 1: Identifying Evaluation Strategies to Establish Evidence-Based Practices
• Objective 2: Moving Activities from Emerging to Best Practice through Evaluation
• Objective 3: Communication – Sites, State, and SSW
• Wrap-Up
• Evaluation Roundtable
• Walk away with evaluation tools to take your work to the next level.

Evaluation – Data Collection – Assessments – Best Practices – Methodology?

Evaluation Goals
Short-Term Goals. Within the next six months, the Evaluation Team will:
• Provide technical assistance to University Program Partners in capturing service data within MI TRACKR.
• Provide intensive regional MI TRACKR training sessions on a quarterly basis.
• Format data and/or data files:
- Create a unified roster to prepare data for analysis.
- Conduct an initial analysis of program service data from the 8th and 9th grade cohorts.
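The roster-building step can be scripted. Below is a minimal sketch, not the Evaluation Team's actual workflow: it assumes each site exports a CSV of service records from MI TRACKR with hypothetical columns (student_id, site, grade, service_hours); the real export schema may differ.

```python
# Minimal sketch: build a unified roster from per-site MI TRACKR exports.
# Assumes hypothetical CSV columns (student_id, site, grade, service_hours);
# the actual export schema may differ.
from pathlib import Path

import pandas as pd


def build_unified_roster(export_dir: str) -> pd.DataFrame:
    """Concatenate all site exports and collapse to one row per student/site/grade."""
    frames = [pd.read_csv(path) for path in Path(export_dir).glob("*.csv")]
    roster = pd.concat(frames, ignore_index=True)
    # Sum service hours across duplicate records for the same student.
    return roster.groupby(
        ["student_id", "site", "grade"], as_index=False
    )["service_hours"].sum()


if __name__ == "__main__":
    roster = build_unified_roster("mi_trackr_exports")  # hypothetical folder
    # Initial cohort analysis: service-hour distribution for grades 8 and 9.
    print(roster[roster["grade"].isin([8, 9])]
          .groupby("grade")["service_hours"].describe())
```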

Short-Term Goals (continued)
• Continuously update the MI GEAR UP website with materials that support MI GEAR UP Program Partner evaluation strategies, such as toolkits, templates, and workshop resources.
• Provide technical assistance regarding:
- Identifying evaluation strategies to establish evidence-based practices.
- Moving activities from Emerging to Best Practice through evaluation.
- Communicating best practices.
• Assist MI GEAR UP Program Partners in developing evaluation and survey instruments directly aligned to service delivery goals and objectives.

Longer-Term Evaluation Goals
• Develop a detailed evaluation plan that directly reflects the state evaluation research questions.
• Complete a detailed analysis of MI GEAR UP services in relation to MI GEAR UP goals, objectives, and outcomes.
• Develop a strategic plan for continuous program improvement through evaluation strategies.
• Publish academic articles to promote MI GEAR UP best practices.

Objective 1: Identifying Evaluation Strategies to Establish Evidence-Based Practices
• Review the Emerging-to-Best-Practice components.
• Identify real-life evaluation strategies.
• Evaluate those strategies and apply them to your MI GEAR UP programs.
• Classify one of your signature practices as an Emerging, Promising, or Best Practice.
• Identify missing components.

Objective 1: Identifying Evaluation Strategies (continued)
• Is your practice replicable?
- Is there a lesson plan or template?
• Is your practice supported by strong qualitative and quantitative data showing positive outcomes?
- Is there a survey or other evaluation instrument to provide the data?

Emerging → Promising → Best Practices

Emerging Practice:
- Incorporates qualities of other positive/effective interventions.
- Is based on patterns that have been proven to lead to effective outcomes.
- Incorporates a process of continual quality improvement.
- Has an evaluation plan in place.

Promising Practice:
- In addition to the qualities of an Emerging Practice, has strong qualitative and quantitative data showing positive outcomes.

Best Practice:
- Has undergone a rigorous process of peer review and evaluation that indicates effectiveness.

Source: "Emerging, Promising, and Best Practices Definitions," Kentucky Cabinet for Health and Family Services

Identify Real-Life Evaluation Strategies
• Learn to bake a delicious Whipping Cream Pound Cake.
• Can you replicate this pound cake with the information given in the clip? If not, what is missing?

Identify Real-Life Evaluation Strategies
• Have you ever booked a hotel room, bought a house, or bought a car?
• What evaluation strategies would you use to decide which hotel room, house, or car to select?

Identify Real-Life Evaluation Strategies (continued)
• Are you using a cost-benefit analysis or a criterion-based matrix to assess whether a purchase is worth your money?
a) Cost vs. location
b) Cost vs. size
c) Cost vs. benefits
d) Cost vs. amenities
e) Cost vs. appearance
By using these strategies, you are actually conducting a small-scale evaluation! A worked sketch of a criterion-based matrix follows below.
NOW, LET'S TAKE THIS EXAMPLE AND APPLY IT TO MI GEAR UP.
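A criterion-based matrix is just weighted scoring, so it fits in a few lines of code. This is an illustrative sketch only: the options, criterion scores, and weights below are invented values for the hotel example, not data from the workshop.

```python
# Minimal sketch of a criterion-based matrix (weighted scoring).
# Options, criterion scores (1-5 scale), and weights are all invented.
criteria_weights = {"cost": 0.4, "location": 0.3, "size": 0.2, "amenities": 0.1}

options = {
    "Hotel A": {"cost": 4, "location": 5, "size": 3, "amenities": 2},
    "Hotel B": {"cost": 3, "location": 4, "size": 4, "amenities": 5},
    "Hotel C": {"cost": 5, "location": 2, "size": 3, "amenities": 3},
}


def weighted_score(scores: dict) -> float:
    """Multiply each criterion score by its weight and sum the results."""
    return sum(criteria_weights[c] * s for c, s in scores.items())


# Rank the options from best to worst total score.
for name in sorted(options, key=lambda n: weighted_score(options[n]), reverse=True):
    print(f"{name}: {weighted_score(options[name]):.2f}")
```

The weights make the trade-offs explicit: changing them changes the ranking, which is exactly the conversation a program team should have about which goals matter most.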

Objective 1: Identify Evaluation Strategies to Establish Evidence-Based Practices
• Like a recipe, your lesson plan serves as the tool, or blueprint, for replication.
• A lesson plan is one of the components for establishing a Best Practice.
• If your practice is replicable, it can reliably produce strong qualitative and quantitative data supporting positive outcomes.

Objective 1: Identify Evaluation Strategies to Establish Evidence-Based Practices (continued)
Using a criterion-based matrix or cost-benefit analysis for your practice can demonstrate significant positive outcomes. Compare Program Services Delivery Time & Money (PSDT&M) against the following program goals:
• Increased understanding of objectives and goals
• Increased GPA
• Increased standardized test scores
• Increased understanding of college affordability
• Increased FAFSA completion rates
• Promotion to the next grade level
• Increased graduation rates
Information from the analysis will allow you to continuously adjust these variables to achieve better outcomes; a sketch of this kind of comparison appears below.
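As one way to operationalize the PSDT&M comparison, the sketch below correlates delivery inputs with outcome measures. The column names (service_hours, service_cost, gpa_change, fafsa_completed) are hypothetical stand-ins for whatever your unified roster actually contains, and the rows are fabricated for illustration; a correlation table is a first-pass diagnostic, not proof of impact.

```python
# Minimal sketch: relate service delivery time and money to outcomes.
# Column names (service_hours, service_cost, gpa_change, fafsa_completed)
# are hypothetical stand-ins; the rows are fabricated for illustration.
import pandas as pd


def outcome_summary(df: pd.DataFrame) -> pd.DataFrame:
    """Correlate delivery inputs with outcome measures across students."""
    inputs = ["service_hours", "service_cost"]
    outcomes = ["gpa_change", "fafsa_completed"]
    return df[inputs + outcomes].corr().loc[inputs, outcomes]


df = pd.DataFrame({
    "service_hours":   [10, 25, 40, 5, 30],
    "service_cost":    [200, 450, 700, 120, 500],
    "gpa_change":      [0.1, 0.3, 0.5, 0.0, 0.4],
    "fafsa_completed": [0, 1, 1, 0, 1],
})
print(outcome_summary(df))
```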

Evaluation Program Workshop: Emerging/Promising/Best Practice
• Place your practice on the Emerging/Promising/Best Practice spectrum.
1. Identify which of the practice components you currently have in place.
2. Identify which of the practice components you would like to establish.
3. During the roundtable discussion, you may share which evaluation strategies you are currently using for continuous program improvement.

Objective 1 Worksheet: Identifying Evaluation Strategies to Establish Evidence-Based Practices
Program Partner: __________________________________________
Signature Practice: ________________________________________
Directions: Place your practice on the Emerging/Promising/Best Practice spectrum. Use information from your previously completed Characteristics of Promising/Best Practices document to complete the following:
1. Identify which of the seven Promising/Best Practice components you currently have in place and describe how those components apply to your practice.
2. Identify which of the seven Promising/Best Practice components you would like to establish and describe the action steps necessary to implement them.

Take Your Selected Practice and Use It for Continuous Program Improvement
1. A Promising Practice requires strong qualitative and quantitative data showing positive outcomes. Acquiring those data requires both a lesson plan for replicability and an evaluation tool to demonstrate positive outcomes.
2. Identify the components of a lesson plan to support the replicability of your practice.
3. Identify the components of an evaluation instrument that will provide the qualitative and quantitative data needed to establish the activity as a Promising Practice.

Objective 2: Moving Activities from Emerging to Best Practice through Evaluation
• Moving from Emerging to Best Practice is an ongoing activity. The Evaluation Team is available to assist with:
- Developing lesson plan templates
- Drafting survey instruments and other evaluation tools

Objective 2 Worksheet: Moving Activities from Emerging to Best Practice through Evaluation
Program Partner: ________________________________
Signature Practice: _______________________________
Directions: A lesson plan is one of the components for establishing a Best Practice. If your practice is replicable, it can reliably produce strong qualitative and quantitative data supporting positive outcomes. To effectively replicate a practice, a lesson plan should include the following components (a structured sketch follows below):
• Goal: What the student will have learned by the end of the activity
• Objective: Measurable criteria for whether the goal has been achieved
• Activity: A step-by-step description of the practice
• Assessment: How achievement of the goal is measured
1. Identify the lesson components for your signature practice:
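For partners who keep lesson plans electronically, the four components map naturally onto a small structured record. This is a minimal sketch, not an official template; all field contents are illustrative placeholders.

```python
# Minimal sketch: a structured record for the four lesson-plan components.
# Field contents are illustrative placeholders, not an official template.
from dataclasses import dataclass, field


@dataclass
class LessonPlan:
    goal: str                # What the student will have learned by the end.
    objective: str           # Measurable criteria for whether the goal is met.
    activity: list = field(default_factory=list)  # Step-by-step description.
    assessment: str = ""     # How achievement of the goal is measured.

    def is_complete(self) -> bool:
        """A replicable plan needs all four components filled in."""
        return all([self.goal, self.objective, self.activity, self.assessment])


plan = LessonPlan(
    goal="Students understand the basics of college affordability",
    objective="80% of students score 4/5 or better on the exit quiz",
    activity=["Introduce net-price calculators", "Guided practice", "Exit quiz"],
    assessment="Five-question exit quiz scored by the facilitator",
)
print(plan.is_complete())  # True only when every component is present
```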

Objective 2 Worksheet: Moving Activities from Emerging to Best Practice through Evaluation (continued)
Program Partner: ________________________________
Signature Practice: _______________________________
Directions: Once you have a replicable practice, develop and implement an evaluation tool to capture qualitative and quantitative data supporting positive program outcomes.
2. Identify an evaluation tool to capture data from your practice and describe how the evaluation would be implemented:
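One common evaluation tool is a pre/post survey administered around the activity. The sketch below assumes paired pre- and post-activity scores for the same students (the numbers are fabricated) and uses a paired t-test as one way to check whether the average gain is larger than chance; it is a sketch, not the team's prescribed analysis.

```python
# Minimal sketch: analyze a pre/post survey given around a program activity.
# Scores are fabricated; each index pairs one student's pre and post response.
from scipy import stats

pre  = [2.5, 3.0, 2.0, 3.5, 2.8, 3.1, 2.2, 2.9]  # pre-activity survey scores
post = [3.2, 3.4, 2.9, 3.6, 3.5, 3.3, 3.0, 3.4]  # post-activity survey scores

mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
t_stat, p_value = stats.ttest_rel(post, pre)  # paired t-test on the change
print(f"mean gain: {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```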

Objective 3: Communication of Evaluation Results
• Sites
- Ongoing program improvement
- Share program success with local stakeholders
• State
- Share program success with state and national stakeholders
- Inform programmatic decisions and resource allocation
• SSW
- Publish program outcomes

Action Steps
1. Identify one or two signature program services to develop into Best Practices.
2. Continuously improve those services through evaluation strategies.
3. Develop and implement a lesson plan for replicability.

Action Steps (continued)
4. Develop evaluation tools, in cooperation with the Evaluation Team, to support your practice with quantitative and qualitative data.
5. Share your improvements and signature work with the Evaluation Team so that the work can be published.

Wrap-Up

Evaluation Roundtable Discussion
• A handout with discussion questions is provided.