Program Development & Logic Model

Designing, developing, and evaluating non-profit programs
Office of Faith-Based and Community Initiatives

Program Development/Logic Model

When developing a new program, it is especially important that an organization address a few key points when applying for funds to finance the program. The following guidelines cover these crucial steps:
- Assessment of community needs
- Program design
- Performance measurement
- Program evaluation

Community Needs Assessment

Before designing a program, it is recommended that an organization conduct a community needs assessment to confirm that the program addresses a need that exists in the community. To get an accurate picture, this assessment should involve people representing various parts of the community. In gathering research, the following questions may serve as a guide:
- Who does the problem affect? How many people are affected?
- How is this problem addressed in other locations affected by it? What has and has not worked?
- Is this problem already being addressed in this community? How?

Community Needs Assessment (continued)

A new program should not replicate services already provided in the community. If the needs assessment determines that the problem is already being addressed adequately by other organizations in the community, it may be best to narrow or broaden the program's approach so that it addresses the problem in a unique manner. Visit other organizations addressing the problem, whether in the community or elsewhere, to assess which approaches do and do not work and how services can be improved upon.

Program Design

Some basic qualities of a good program are that it:
- Describes and gives evidence of a community need that can be addressed.
- Identifies a gap in the services available in the community that address the need.
- Explains why this program is an appropriate strategy to meet the need.
- Outlines the activities of program members in addressing the need.
- Establishes community partnerships in the process of addressing the community need.
- Anticipates the positive outcomes of the program in the community.
- Defines the method of measuring results (see the following section on Performance Measurement).

The Logic Model

A recommended method of program design is to use a logic model to plan and evaluate the proposed program. A logic model is a depiction of the processes and targeted outcomes of the program. This should help the organization to specify goals, identify what resources are needed, identify indicators of progress and measurements of success, and communicate the program's potential value.

The Logic Model – Components

The components of a basic logic model include:
- Need: The community need as identified in the Community Needs Assessment.
- Inputs: The resources needed to complete activities.
- Activities: What staff, volunteers, etc. actually do.
- Outputs: A measurement of the actual amount of service completed.
- Intermediate Outcomes: Measurable changes and improvements in the program's constituents and/or community.
- End Outcomes: The target changes that the organization hopes to achieve in the program's constituents and/or community.

Sample Logic Model

- Need: Low third-grade reading performance
- Inputs: Staff, books, space, volunteers
- Activities: 20 volunteers will tutor children one-on-one in reading 3 times a week
- Outputs: (x) number of children tutored
- Intermediate Outcomes: Increased number of books read; increased reading on their own
- End Outcomes: Increased reading performance; improved grades
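To make the flow from need to end outcomes concrete, the sample above can also be written down as a simple data structure. The sketch below is only an illustration under assumed names (the LogicModel class and its fields are hypothetical, not part of the original toolkit); it mirrors the components listed in the previous slide and is populated with the third-grade reading example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LogicModel:
    # Hypothetical structure mirroring the components described above.
    need: str                          # community need from the needs assessment
    inputs: List[str]                  # resources needed to complete the activities
    activities: List[str]              # what staff and volunteers actually do
    outputs: List[str]                 # measurements of the amount of service completed
    intermediate_outcomes: List[str]   # measurable changes in constituents/community
    end_outcomes: List[str]            # target changes the organization hopes to achieve

# The sample tutoring program expressed in this structure.
reading_program = LogicModel(
    need="Low third-grade reading performance",
    inputs=["Staff", "Books", "Space", "Volunteers"],
    activities=["20 volunteers tutor children one-on-one in reading 3 times a week"],
    outputs=["Number of children tutored"],
    intermediate_outcomes=["Increased number of books read", "Increased reading on their own"],
    end_outcomes=["Increased reading performance", "Improved grades"],
)
```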

Performance Measurement

The purpose of performance measures is to capture the ongoing progress the program is making. They should provide a snapshot of the impact of the work that the organization is doing. This is an especially important step in establishing accountability to any funder. Identify all of the services the organization provides and the likely impact these services will have on the community.

Performance Measurement – Tracking Tools

Determine tracking tools for use in measuring the results of services offered. These measures should focus on the target outputs and outcomes, as identified by the community needs assessment and defined by the program's logic model. The measures should be in quantifiable terms and clearly defined. A good guide for determining performance measures is the acronym SMART. They should be:
- Specific
- Measurable
- Attainable
- Realistic
- Timely
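As a rough illustration of what "quantifiable and clearly defined" can look like in a tracking tool, the sketch below records one measure against its target and produces a one-line progress summary. The PerformanceMeasure class and progress_report function are hypothetical names used only for this example; they are not drawn from the original materials.

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    # Hypothetical tracking record for one output or outcome measure.
    name: str      # e.g., "Children tutored"
    target: float  # the attainable, realistic target set during planning
    actual: float  # the value recorded so far by the tracking tool
    period: str    # the reporting period that keeps the measure timely

def progress_report(measure: PerformanceMeasure) -> str:
    """Return a one-line progress summary suitable for a funder report."""
    pct = 100 * measure.actual / measure.target if measure.target else 0
    return (f"{measure.name} ({measure.period}): "
            f"{measure.actual:g} of {measure.target:g} ({pct:.0f}% of target)")

# Example: the tutoring program's output measure partway through the quarter.
print(progress_report(PerformanceMeasure(
    name="Children tutored", target=60, actual=45, period="Q1")))
# -> Children tutored (Q1): 45 of 60 (75% of target)
```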

Program Evaluation

The program evaluation should be an analytical study that measures the progress and impact the program has made. Evaluations should be scheduled and carried out throughout the course of the program. They should consist of thorough and objective research conducted by an experienced evaluator, starting at the beginning of the program year. Resources for finding a program evaluator include colleges and universities, research firms, and community organizations. The evaluator should collect data relevant to the program's activities and evaluate the organization's impact. The organization should stay consistently involved in the evaluation process and should adjust its programs according to performance measures and program evaluations in order to best serve the community.

Final Note

Program development is a complicated process, and this resource offers only a brief overview. For more information on program development and the logic model, please refer to the resources listed on the following slide.

Additional Resources

- CNCS Program Toolkit: http://nationalserviceresources.org/filemanager/download/online/sustainability_toolkit.pdf
- Logic Models: http://www.uwex.edu/ces/pdande/evaluation/powerpt/nutritionconf05.ppt
- Logic Model Development Guide: http://www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf
- Logic Model - University of Idaho: http://www.uidaho.edu/extension/LogicModel.pdf