Promoting a Culture of Evidence and Use of Data through Program Evaluation (Session Theme 2). Presentation to: OSEP Project Directors' Conference, July 21, 2009

Promoting a Culture of Evidence and Use of Data through Program Evaluation
Session Theme 2
Presentation to: OSEP Project Directors' Conference, July 21, 2009
Presentation by: Thomas Fiore, Ph.D., Project Director, Center to Improve Project Performance (CIPP), Westat

Slide 2: Session Theme 2
Evaluation tools and processes can help project directors do the best job with project planning, implementation, and accountability.

Slide 3: Presentation Outline
- Keeping a project focused
- Logic model as the starting point
- Evaluation plan
- Evaluation design
- Evaluation implementation
- CIPP's role providing general TA
- CIPP contact information

Slide 4: Keeping a Project Focused
- The project's purpose is to do a good job of the right thing.
- Need data to document the activities (outputs) of a project.
- Need data to document the overall success or lack of success (outcomes) of a project.
- Most important, need data to determine whether innovations should be scaled up, changed, or abandoned.
- Need a tool to understand which data are important; that tool is a logic model.

Slide 5: Logic Model as the Starting Point
A logic model...
- Portrays a project's overall plan;
- Clarifies the relationships among a project's goals, activities, outputs, and outcomes; and
- Displays the connections between those defining features of a project.

Slide 6: Logic Model Structure
Inputs: OSEP funding; project staff; prior experience; research-based policy and practices.
Goal: Create a coordination hub where the various TA&D centers funded by OSEP and other federal agencies can find and share information and resources, collaborate, and problem-solve together.
Strategies/Activities: Provide logistical support for coordination, communication, and collaboration by maintaining/expanding workgroups, developing listservs, and maintaining and updating an integrated events calendar.
Outputs: Number/types of workgroups created and maintained; number/types of listservs and listserv participants; up-to-date events calendar.
Outcomes (direct, intermediate, and long-term): An effective single point of entry for network resources is implemented/maintained; beneficial connections exist among Network participants; successful ways of locating and sharing information and resources among network members and others are implemented/maintained; the capacity of the TA&D Network members to serve clients increases steadily.
External Factors/Context: Other federal initiatives; OSEP policy environment; grantee's accumulated experience and visibility.
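To make the chain from goals to outcomes concrete, here is a minimal sketch (in Python; not part of the original slides) of how a project might record its logic model as structured data so that each element can later be linked to evaluation questions and measures. The field names and entries are illustrative assumptions adapted from the example above, not a prescribed format.

```python
# Illustrative sketch only: a logic model captured as plain data so that
# outputs and outcomes can later be tied to evaluation questions and measures.
# All field names and entries are hypothetical, adapted from the slide's example.
logic_model = {
    "inputs": ["OSEP funding", "project staff", "prior experience",
               "research-based policy and practices"],
    "goal": "Create a coordination hub for OSEP-funded TA&D centers",
    "strategies_activities": [
        "Maintain/expand workgroups",
        "Develop listservs",
        "Maintain an integrated events calendar",
    ],
    "outputs": [
        "Number/types of workgroups created and maintained",
        "Number/types of listservs and listserv participants",
        "Up-to-date events calendar",
    ],
    "outcomes": [
        "Single point of entry for network resources is implemented/maintained",
        "Beneficial connections exist among Network participants",
        "Information and resources are located and shared successfully",
        "TA&D Network capacity to serve clients increases",
    ],
    "external_factors": ["Other federal initiatives", "OSEP policy environment"],
}

# A later evaluation plan can reference these elements directly,
# e.g., pairing each outcome with a data source and an instrument.
for outcome in logic_model["outcomes"]:
    print(f"Outcome to measure: {outcome}")
```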

Slide 7: Logic Model
- Thus, logic models can be used as a starting point to plan data collection and analysis aimed at measuring project processes and performance.
- Systematically measuring project processes and performance is evaluation.
- A logic model implies a causal relationship that flows from goals to outcomes.
- Evaluation can be viewed as a test of the hypotheses implied by that causal relationship.

Slide 8: Evaluation Plan
- From the logic model, develop a plan for collecting and analyzing data.
- Focus on outputs and outcomes.
- Useful for formative purposes (improving the project while it is under way), summative purposes (judging the project's overall effectiveness), or both.

Slide 9: Evaluation Plan
Develop the specific plan by answering these questions:
- What data are required to demonstrate project effectiveness or to provide information on overall program effectiveness?
- What strategies/activities should be given priority (that is, which ones should be evaluated because they are important)?
- Who are the targeted recipients of interest, and in what settings?

Slide 10: Evaluation Plan (continued)
Develop the specific plan by answering these questions:
- What are the evaluation questions?
- What data collection activities are needed?
- How will the data be analyzed?
- What are the necessary timelines, staff assignments, and cost allocations across years?
- How will the evaluation be documented and reported?

Slide 11: Evaluation Plan
The logic model leads to evaluation questions:
→ Relevant goals (not necessarily all)
→ Salient strategies/activities related to those goals
→ Outputs associated with the strategies/activities
→ Outcomes (the most consequential ones)
→ Evaluation questions

Slide 12: Evaluation Plan (example)
Goal: Create a coordination hub…
Strategies/activities: Provide logistical support…
Outputs: Number/types of workgroups, etc.
Outcomes: Successful ways of locating and sharing information and resources…
Evaluation question: To what extent have activities supporting coordination, communication, and collaboration been effective in enabling TA&D Network members to do their work efficiently and without duplication?

Slide 13: Evaluation Design
The evaluation plan leads to an evaluation design:
→ Evaluation questions
→ Measurable outputs or outcomes
→ Methods that capture change
→ Types of data collection
→ Instruments

Slide 14: Evaluation Design (example)
Evaluation question: To what extent have activities…been effective in enabling TA&D Network members to do their work efficiently and without duplication?
Measurable outcome: An integrated technology system is implemented and used.
Types of data collection: Record review, survey
Instruments: Record Review and Web Statistics Protocol, Annual TA&D Network Survey
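The same mapping can be kept as a simple design matrix. The sketch below (Python, illustrative only) shows one way to keep each evaluation question tied to its outcome, data collection methods, and instruments; the question, outcome, and instrument names are taken from the example above, while the data structure itself is an assumption rather than a CIPP-prescribed format.

```python
# Illustrative evaluation design matrix: each row ties an evaluation question
# to its measurable outcome, data collection methods, and instruments.
# The structure is a sketch, not a prescribed CIPP format.
design_matrix = [
    {
        "evaluation_question": (
            "To what extent have activities been effective in enabling "
            "TA&D Network members to work efficiently and without duplication?"
        ),
        "measurable_outcome": "An integrated technology system is implemented and used",
        "data_collection": ["record review", "survey"],
        "instruments": [
            "Record Review and Web Statistics Protocol",
            "Annual TA&D Network Survey",
        ],
    },
]

# A matrix like this can be checked for gaps, e.g., questions with no instrument yet.
for row in design_matrix:
    if not row["instruments"]:
        print("Missing instrument for:", row["evaluation_question"])
```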

Slide 15: Evaluation Design
The evaluation design continues with:
- Instrument development
- Sampling
- Data collection scheduling

Slide 16: Evaluation Implementation
Evaluation implementation requires:
- Data collection
- Analysis
- Reporting
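As a minimal illustration of the analysis and reporting steps, the sketch below tabulates hypothetical responses to one item on an annual network survey. The file name, column name, and rating scale are assumptions made for illustration; they do not represent actual CIPP instruments or data.

```python
# Minimal analysis/reporting sketch using hypothetical survey data.
# The file name and column name below are illustrative assumptions.
import csv
from collections import Counter

def summarize_survey(path: str) -> None:
    """Count responses to a single rating item and report the distribution."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    ratings = Counter(row["resource_sharing_rating"] for row in rows)
    total = sum(ratings.values())
    print(f"Responses: {total}")
    for rating, count in sorted(ratings.items()):
        print(f"  {rating}: {count} ({count / total:.0%})")

# Example usage (assumes a CSV file with a 'resource_sharing_rating' column):
# summarize_survey("annual_network_survey_2009.csv")
```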

Slide 17: Summary
- This sounds complicated, but projects are doing much of it already.
- Evaluation works best when integrated into the overall implementation of the project.
- It works best when formative and summative evaluations are integrated.
- It doesn't need to be comprehensive: you don't need to measure everything in the most rigorous way to have information that can be useful.
- Overall, evaluation answers the question of great interest to funders and to clients: What good is this doing?

Slide 18: CIPP's Role
CIPP's role is twofold:
- To guide, coordinate, and oversee the summative evaluations of 12 large grant-funded projects selected by OSEP.
- To provide technical assistance to current OSEP grantees in conducting formative and summative evaluations of their projects funded through the following programs: Parent Information Centers, Technical Assistance and Dissemination, Personnel Development, and Technology and Media Services.

Slide 19: CIPP's Role Providing General TA
Vehicles for accessing TA:
- The CIPP website provides an opportunity for staff of OSEP-funded projects to pose questions or ask for assistance.
  - The website contains information and reference materials on formative and summative evaluation.
  - Evaluation Briefs addressing frequently requested information will be available for download from the website.
- A toll-free telephone line also enables project staff to pose questions or ask for assistance.

Slide 20: CIPP Contact Information
Website:
Toll-free telephone: