Logic Model, Rubrics & Data collection tools


Evaluation: Logic Model, Rubrics & Data Collection Tools

WHY IS IT GREAT TO HAVE A LOGIC MODEL? If developed and reviewed regularly, a logic model gets everyone on the same page and working together. It also provides a roadmap for designing evaluation activities that can help you assess your progress and success and inform plans for increasing effectiveness.

Logic Model Components:
- Resources/inputs
- Activities
- Outputs
- Outcomes (short & long term)
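These four components map naturally onto a simple data structure, which can make a logic model easier to review and keep current. Here is a minimal Python sketch, illustrative only and not from the presentation; the class and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Hypothetical container for the four logic model components."""
    resources: list = field(default_factory=list)   # inputs: people, funding, space
    activities: list = field(default_factory=list)  # what the program does
    outputs: list = field(default_factory=list)     # countable evidence of activity
    outcomes: dict = field(default_factory=dict)    # "short_term" / "long_term" lists

model = LogicModel(
    resources=["gallery space", "grants and donations", "dedicated staff"],
    activities=["classes", "counseling", "home visits", "events"],
    outputs=["# of participants", "# of sessions held", "# of meals served"],
    outcomes={
        "short_term": ["participants increase diabetes management knowledge"],
        "long_term": ["participants sustain healthier behaviors"],
    },
)
print(model.outcomes["short_term"])
```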

Resources & Inputs
The things and people that you need to operate your program:
- Gallery space
- Grants and donations
- Dedicated staff
- Trainings, etc.

Activities
What you are doing when your program is happening:
- Classes
- Counseling
- Home visits
- Events, etc.

Outputs
Observable things that the activities produce:
- # of participants
- # of events
- # of sessions held
- # of meals served
- # of risk assessments, etc.
*Outputs are the evidence that you did what you had planned to do.
A good way to know whether something is an output or an outcome is to ask: "Can I prove that this happened because of an activity we did, and not because of something else?" If the answer is yes (for example, you are holding survey results from the survey you administered), it is an output. If the answer is no (for example, your coworkers reported better job morale in a follow-up survey), it is an outcome.

Outcomes
Changes in participants':
- Behavior
- Knowledge
- Skills
- Level of functioning
- Outlook
- Life trajectories
Outcomes are the evidence that your program is having the hoped-for success in making a difference. Consider what changes in behavior or decision making you expect the targeted community/audience to make after participation.

Short-term Outcomes: What you want to see for a participant on the day he or she completes your program.
Long-term Outcomes: What you hope participants take with them into the future.
*Create outcomes that are in your sphere of influence; you need to be willing to be held accountable for your outcomes.
Make sure you are collecting the appropriate data to measure what matters to your program.

Assumptions & External Factors
Assumptions: Anything that needs to be true for your program to work, and why you think your program will lead to the desired outcomes.
*Assumptions are strongest when they are backed by evidence in the academic literature. Logic model assumptions should have support: usually, when building a logic model for a program, you will have researched evidence-based programs to use or researched the topic itself.
Example: Maybe your program rests on the assumption that people are capable of change once they recognize their triggers and learn self-regulation skills. Assumptions for a program trying to decrease childhood obesity might include:
- Nutrition education impacts obesity levels
- Residents want information about healthy eating
External Factors: The setting in which your program operates. Some external factors may make the hoped-for outcomes more likely; others may impede your program's success.

Logic Model

Rubrics
Rubrics describe what success looks like by describing levels of performance in relation to criteria, along a spectrum from poor to excellent.
- Create a rubric for each short-term outcome
- Name the levels of achievement
- Describe each level
*Rubrics take time, dedication, and commitment. Time spent developing a rubric increases commitment and ownership by staff.

Name levels of achievement toward that outcome. Example naming schemes:
- Emerging / Developing / Achieving / Extending
- Novice / Apprentice / Master / Expert
- Failed / Survived / Succeeded / Thrived
- Dormant / Activated / Energized / Leader
- Beginning / Acceptable / Accomplished / Stellar
- Below standard / At standard / Above standard
- Not happening at all / Happening a little / Happening pretty darn good / Awesomely happening
- Weak / Decent / Strong / Exemplary
The highlighted options fit with our quarterly department evaluation, so they might be easier to understand.

Nuestra Vida SAMPLE Rubric
Outcome: By completion of the program, participants increase knowledge of how to manage their diabetes.

Level 1 (Basic):
- Participants attend classes, but not consistently
- Data does not reflect the purpose of the intended outcome
- Facilitator does not follow the fidelity of the curriculum
- No new registered participants

Level 2 (Decent):
- Participants attend classes regularly, but still not weekly
- Data is inconsistent in demonstrating an increase on knowledge-based questions
- Facilitator is inconsistent with the curriculum material
- Graduated participants start a support group

Level 3 (Strong):
- Participants attend classes consistently (attend all classes)
- Data is complete and represents a consistent increase on knowledge-based questions
- Facilitator strictly follows the fidelity of the curriculum
- Support group meets regularly; participants keep food diaries/logs

Level 4 (Excellent):
- Participants attend classes weekly and bring family/friends
- Data is accurate and represents a statistically significant increase on knowledge-based questions
- Facilitator regularly increases their own knowledge to improve the curriculum
- Support group facilitates classes for new participants
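A rubric like this can also be summarized programmatically once ratings are collected. Below is a minimal sketch, assuming a 1-4 scale averaged across criteria; the criterion keys, ratings, and averaging scheme are illustrative assumptions, not part of the sample rubric:

```python
# Minimal sketch: summarize per-criterion ratings against a 4-level rubric.
# Level names follow the Nuestra Vida sample; everything else is hypothetical.
RUBRIC_LEVELS = {1: "Basic", 2: "Decent", 3: "Strong", 4: "Excellent"}

ratings = {
    "attendance": 3,              # participants attend all classes
    "data_quality": 2,            # inconsistent increase on knowledge questions
    "curriculum_fidelity": 3,     # facilitator follows the curriculum
    "participant_engagement": 4,  # support group facilitates new classes
}

average = sum(ratings.values()) / len(ratings)
print(f"Average score: {average:.1f} -> overall level: {RUBRIC_LEVELS[round(average)]}")
# Average score: 3.0 -> overall level: Strong
```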

Who Should Be Involved?
The more people involved, the better: everyone with a contribution to the project's outcome. Working in groups:
- Creates a better rubric (more ideas)
- Builds enthusiasm for evaluation
- Builds enthusiasm for the work of the organization

Data
Always consider the types of data and the ease of collection for each data source:
- Administrative records
- Clinical assessments and tests already being used
- Surveys
- Interviews & focus groups
*Data tells us to what degree we are achieving meaningful and measurable outcomes.
If you want to make claims about outcomes, the data needs to be representative.
No need to reinvent the wheel! Look for validated instruments that get at the outcome you want to measure: established validated instruments, national surveys, or research studies.
Always consider how easy or hard it will be to collect the data you need, and beware of low response rates: you want response rates of 80% or higher.
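The 80% rule of thumb is easy to check as surveys come back. A small sketch follows; the threshold comes from the slide above, while the function name and numbers are hypothetical:

```python
def response_rate(responses: int, surveys_sent: int) -> float:
    """Response rate as a percentage of surveys distributed."""
    return 100 * responses / surveys_sent

# Hypothetical example: 34 completed surveys out of 40 distributed.
rate = response_rate(34, 40)
print(f"Response rate: {rate:.0f}%")        # Response rate: 85%
print("Meets 80% threshold:", rate >= 80)   # Meets 80% threshold: True
```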

Data collection tools

Best Practices
- Agree on and set a regularly scheduled time for staff to dedicate to the mission
- Set aside time for development of the Logic Model & Rubric
- Review and refine your Logic Model as needed

Questions