Introduction to Evaluation (January 26, 2005)

Slide 2. Who We Are: Innovation Network
- National nonprofit organization
- Committed to evaluation as a tool for empowerment
- Build evaluation capacity of nonprofits and funders so they can better serve their communities
- Practice a participatory approach

Slide 3. Objectives of This Session
- Identify how evaluation can be useful in your work
- Understand both implementation and outcome evaluation
- Understand how evaluation planning feeds into data collection

Slide 4. What is Evaluation? The systematic collection of information about a program that enables stakeholders to better understand the program, improve program effectiveness, and/or make decisions about future programming.

Slide 5. What's in it for you?
- Understand and improve your program
- Test your theory of change/program theory
- Tell your program's story
- Be accountable
- Inform the field
- Support fundraising efforts

Slide 6. Evaluation Principles. Evaluation is most effective when it:
- Is connected to program planning and delivery
- Involves the participation of stakeholders
- Supports an organization's capacity to learn and reflect
- Respects the community served by the program
- Enables the collection of the most information with the least effort

Slide 7. Implementation and Outcomes
Evaluating Implementation/Process: What did you do? How well did you do it?
Evaluating Outcomes: What difference did you make through your work? Or, what changes occurred because of your work?

Slide 8. Evaluating Outcomes
Outcomes: the changes you expect to see as a result of your work.
Indicators: the specific, measurable characteristics or changes that represent achievement of an outcome. They answer the question: How will I know it?

Slide 9. Evaluating Outcomes: Common Types of Change
- New knowledge
- Increased skills
- Changed attitudes, opinions, or values
- Changed motivation or aspirations
- Modified behavior
- Changed decisions
- Changed policies
- Changed conditions

Slide 10. Evaluating Outcomes: Sample Plan (columns: Outcome | Indicator | Data Collection Method | Data Collection Effort: have, low, med, high)
- Participants improve job-seeking skills | #/% of participants who meet criteria in mock interview | Observation of mock interview at end of training session, using a checklist | Low
- Participants improve job-seeking skills | #/% of participants who develop a quality resume | Job counselors review resumes based on a quality checklist | Have
- Participants improve money management skills | #/% of participants who balance their checkbooks | Review participants' check registers & work sheets | Low
- Participants improve money management skills | #/% of participants who pay credit card bills in full & on time | Review participants' credit card statements | Low
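To make the plan format concrete, here is a minimal, hypothetical Python sketch (not part of the original slides) of how one row of such a plan could be recorded and how a "#/% of participants" indicator might be tallied from collected observations. The PlanRow class, participant IDs, and results are illustrative assumptions.

```python
# Hypothetical sketch: one row of an evaluation plan, plus a tally of a
# "#/% of participants" indicator from collected observation data.
from dataclasses import dataclass

@dataclass
class PlanRow:
    outcome: str    # the change you expect to see
    indicator: str  # how you will know it
    method: str     # data collection method
    effort: str     # have / low / med / high

row = PlanRow(
    outcome="Participants improve job-seeking skills",
    indicator="#/% of participants who meet criteria in mock interview",
    method="Observation of mock interview at end of training, using a checklist",
    effort="low",
)

# Illustrative observation data: participant ID -> met checklist criteria?
observations = {"P01": True, "P02": False, "P03": True, "P04": True}

met = sum(observations.values())
total = len(observations)
print(f"{row.indicator}: {met} of {total} ({met / total:.0%})")
```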

Slide 11. Evaluating Implementation
Activities and Outputs: the "what" (the work you did, and the tangible products of that work).
Additional Questions: the "why" (understanding how well you did, and why).

Slide 12. Evaluating Implementation: What Did You Do?
Examine Activities and Outputs:
- Did you conduct activities as planned?
- Did those activities produce the outputs you envisioned?
How to measure? Program documents.

Slide 13. Evaluating Implementation: How Well Did You Do It?
What information will help you understand your program implementation? Think about:
- Participation
- Quality
- Satisfaction
- Context
How to measure? Program documents, surveys, interviews, comment functions, focus groups, and other methods.

Slide 14. Evaluating Implementation: Sample Plan (columns: Activities | Outputs & Implementation Questions | Data Collection Method | Data Collection Effort: have, low, med, high)
Activities (TRAINING): Develop/revise curriculum for training series; meet with potential program clients; coordinate logistics; provide training series to two groups of clients
- Outputs: Curriculum (developed/revised); 2 training series held; completion by 30 of 33 | Program records; program records/attendance logs | Have
- Questions: Are we reaching the clients we expected? | Review of participant intake data | Low
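As a purely illustrative aside (the slide does not define how completion is counted), here is a short Python sketch of how attendance logs might be summarized into an output such as "completion by 30 of 33"; the 80% completion threshold, participant IDs, and session data are assumptions.

```python
# Hypothetical sketch: summarizing attendance logs into a training output.
# Assumes a participant "completes" by attending at least 80% of sessions.
attendance = {
    "P01": [True, True, True, True],   # one entry per session attended
    "P02": [True, False, True, True],
    "P03": [False, False, True, False],
}

COMPLETION_THRESHOLD = 0.8  # assumed definition of completion

completed = sum(
    1
    for sessions in attendance.values()
    if sum(sessions) / len(sessions) >= COMPLETION_THRESHOLD
)
print(f"Completion by {completed} of {len(attendance)} participants")
```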

Slide 15. Data Collection: 3 Steps
1. Choose the method.
2. Decide which people or records will be the source of the information.
3. Determine the level of effort involved in using that method with that population.

Slide 16. Data Collection Steps 1 & 2: Choose Method, Identify Source
- Review documents
- Observe
- Talk to people
- Collect written responses
- Pictorial/multimedia

Slide 17. Data Collection Step 3: Level of Effort
- Instrument development
- Cost/practicality of actually collecting data
- Cost of analyzing and presenting data
Also consider: Communication Power; Proxy Power.

Slide 18. Good Data Collection. Characteristics:
- Culturally appropriate (survey, focus group, who asks, manner of asking)
- Ethical
- Respectful of participants

Slide 19. Continuous Learning Cycle (diagram): Program Plan → Evaluation Planning → Data Collection → Analysis, Reflection & Improvement → back to Program Plan

Slide 20. Next Steps: We can do more!
- Data Collection and Analysis (Jan 9 am)
- Online, instructor-led trainings
- Online, self-paced modules
- Individual technical assistance

Slide 21. Thanks for Your Participation!
Measure results. Make informed decisions. Create lasting change.
Innovation Network, Inc., K St. NW, 11th Floor, Washington, DC, (202), Ehren Reed: