Continuous Improvement: Collecting, Analyzing, and Sharing Data


At the end of this presentation, you will know:
1) All the wonderful ways that you can use data for important purposes
2) What “formative evaluation” is, and how it relates to continuous improvement
3) How to organize yourselves for formative evaluation
4) How to collect data and use it to fuel continuous improvement
5) Different approaches to analyzing, reporting on, reflecting on, and sharing your data
6) How to work with C-PAD to accomplish data-related tasks

The Joys of Data!

Data are not just for grant requirements anymore!
– Continuous improvement
– Marketing your program
– Positioning your program for funding

By using data for continuous improvement, you will increase:
– Participation
– Satisfaction
– Tailoring
– Effectiveness

Using Formative Evaluation for Continuous Improvement

What is formative evaluation?
– Evaluation = using data for assessment and learning
– Formative = for the purposes of creating, developing, and improving
– Formative (program) evaluation = using data to inform how you can continually pursue program excellence

Formative Evaluation as a Tool for Continuous Improvement

Formative evaluation asks | So that your team can further investigate | In order to achieve
Is the program reaching its participation targets? | How can we attract more people to the program? | Increased participation
Are participants satisfied with the offerings? | How can we increase satisfaction levels? | Increased satisfaction
Does the design of services meet the needs of its participants? | How can we better align service offerings with participant needs? | Increased tailoring
Do participants succeed in desired outcomes? | How can we increase the rate of positive outcomes? | Increased effectiveness

We are busy delivering a program – how can we be evaluators too?
– Forget: “evaluation”
– Remember: “leveraging data for program excellence!”

Story from the Field: Leveraging data for program excellence
Julie Seeley, Spoon River College

The Formative Evaluation Toolkit: A Guided Tour

What you will find in the FET:
– Many pages (but not to worry!)
– Step-by-step guidance through the phases of evaluation
– Tools and templates for conducting your evaluation
– Templates for working meetings to accomplish evaluation tasks
– Checklists at the end of each chapter to keep you on track
– Blank pages for notes, questions, and insights
– Support for:
  – Completing deliverables due to Champion Colleges
  – Fulfilling grantee reporting requirements

Step-by-step guidance for:
– Data collection (chapters 3 & 4)
– Organizing for evaluation (chapter 2)
– Reporting on findings (chapter 5)
– Reflecting on findings (chapter 5)

Data Collection (Chapters 3 and 4)

Data Types
– Process data
– Outcome data
– Stakeholder feedback

Process Data (Chapter 3)
– Counts of program participants
  – Students completing workforce program courses
  – Students participating in support services
– Information about program components, describing:
  – Workforce programs and courses
  – Math, English, and computer courses

Outcome Data (Chapter 3)
Number of students who have:
– Received credits for prior learning
– Received a degree
– Received a certificate
– Received a non-credit certificate
– Become employed
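
If your records live in a spreadsheet or CSV export, a minimal Python sketch of how process and outcome counts like these might be tallied is shown below. The file name student_records.csv and its column names are hypothetical, invented for illustration; they are not part of the FET or C-PAD.

    import csv
    from collections import Counter

    # Hypothetical column names -- adjust to match your institution's own export.
    PROCESS_FIELDS = ["completed_workforce_course", "used_support_services"]
    OUTCOME_FIELDS = ["credit_for_prior_learning", "earned_degree",
                      "earned_certificate", "earned_noncredit_certificate",
                      "became_employed"]

    def tally(path="student_records.csv"):
        """Count how many students have a 'yes' in each process/outcome field."""
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                for field in PROCESS_FIELDS + OUTCOME_FIELDS:
                    if row.get(field, "").strip().lower() == "yes":
                        counts[field] += 1
        return counts

    if __name__ == "__main__":
        for field, n in tally().items():
            print(f"{field}: {n}")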

Important Considerations for Collecting Process & Outcome Data
– “Extractable” v. “Real-Time” Data
– Make a data collection plan
  – Identify the data you need to collect
  – Figure out how you’ll collect it
– Work with an IR Partner
– Plan early for real-time data collection
– Stay organized with data storage

FET Tools for Process and Outcome Data
– Guidance for a working meeting to develop process and outcome data collection plans
– Tools:
  – Tool 2: Data Choices Table (p. 42)
  – Tool 3: Process Data Collection Plan Worksheet (p. 45)
  – Tool 4: Outcome Data Collection Plan Worksheet (p. 54)
  – Tool 5: Options for Collecting Employment Data (p. 57)
  – Data Storage Excel Workbook (URL on p. 17)

Stakeholder Feedback (Chapter 4)
– Hearing from students, internal partners, and community partners
– Looking for feedback that will help you to:
  – Attract more students to the program
  – Tailor services better
  – Raise satisfaction levels
  – Support students better for completion and employment

Important Considerations for Collecting Stakeholder Feedback
– Decide what you want to learn
– Make a data collection plan
  – Decide which stakeholders you want to hear from
  – Identify the data collection methods you want to use
  – Identify people responsible for collecting feedback
– Make data collection instruments
– Don’t collect too much data!

FET Tools for Stakeholder Feedback
– Guidance for two working meetings to develop:
  – A stakeholder feedback data collection plan
  – Data collection instruments
– Tools:
  – Tool 6: Stakeholder Feedback Evaluation Questions Worksheet (p. 61)
  – Tool 7: Stakeholder Feedback Data Collection Plan Worksheet (p. 63)
  – Tools 8 & 9: Survey, Interview, and Focus Group Protocol Templates (pp. 66, 82)
  – Tools 10–12: Guidance for Conducting Interviews and Focus Groups (pp. 88, 90, 91)
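
To give a concrete sense of how survey feedback gathered with instruments like these might be summarized, here is a small Python sketch. The file survey_responses.csv, its column names, and the 1–5 satisfaction scale are assumptions made for the example, not features of the FET templates.

    import csv
    from collections import defaultdict

    def satisfaction_by_group(path="survey_responses.csv"):
        """Average a hypothetical 1-5 satisfaction rating for each stakeholder group."""
        ratings = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                group = row["stakeholder_group"]        # e.g., student, internal partner
                ratings[group].append(int(row["satisfaction_1_to_5"]))
        return {g: sum(vals) / len(vals) for g, vals in ratings.items()}

    if __name__ == "__main__":
        for group, avg in satisfaction_by_group().items():
            print(f"{group}: average satisfaction {avg:.1f} / 5")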

Organizing for Evaluation (Chapter 2)

Assembling an Evaluation Team
– Evaluation Team Manager
– IR Partner
– Process and Outcome (P&O) Data Lead
– Stakeholder Feedback Lead

Convene the Team for an Evaluation Launch Meeting
– Review how the Plus 50 data tasks fit together
– Customize your evaluation timeline

Reflection and Reporting (Chapter 5)

Analysis, reflection, and reporting as an iterative process
– Preliminary reporting: pulling together data with initial analysis
– Reflecting on findings
– Writing up memos:
  – Findings/learnings
  – Recommendations (based on learnings and reflection)
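
As one illustration of the “preliminary reporting” step, the short Python sketch below counts how often analyst-assigned theme codes appear in stakeholder comments. The file coded_feedback.csv and its column names are invented for the example and are not part of Tool 14.

    import csv
    from collections import Counter

    def theme_frequencies(path="coded_feedback.csv"):
        """Count how often each analyst-assigned theme code appears in the feedback."""
        themes = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                themes[row["theme"]] += 1           # e.g., "scheduling", "advising"
        return themes

    if __name__ == "__main__":
        print("Preliminary findings: most frequent feedback themes")
        for theme, n in theme_frequencies().most_common(5):
            print(f"  {theme}: {n} comments")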

FET Tools for Reflecting on and Writing up Evaluation Findings
– Guidance for working meetings to reflect:
  – With the Plus 50 Team on key findings from the process and outcome data
  – With the Plus 50 Team on key findings from the stakeholder feedback
  – With the Plus 50 Advisory Committee on all key evaluation findings
– Tools:
  – Tool 13: Evaluation Questions Tool (p. 92)
  – Tool 14: Qualitative Data Analysis Template (p. 95)
  – Tool 15: Memo Write-up Tool (including a memo outline) (p. 99)

Working with C-PAD to Support Continuous Improvement

Phase 5
1. Collect Process and Outcome Data
2. Gather Feedback from Program Participants and Partners
3. Share Results with Key Stakeholders
4. Plan for Program Improvement

Feeding two birds with one seed: Process and Outcome Data in C-PAD
P&O data used for continuous improvement = P&O data used for grant-required reporting to AACC

Questions?