Program Design: Analyzing Program Models

Presentation transcript:

Program Design: Analyzing Program Models PS 430 Unit 7

Unit 7 Project
View the data for Pathway High School, determine the results, and analyze them:
- Minimum of 2 pages, APA format, including in-text citations and references
- Minimum of 2 academic sources, which can include your textbook
- Determine the purpose of each data requirement above by discussing how it can be used in the program planning process.
- Identify each type of data collected as either quantitative or qualitative.

Unit 7 Project
Figure 1: The occurrence of aggressive behavior incidents reported before and after the program began.
Using the data in Figure 1, discuss the level of challenging behavior incidents reported before and after the program began (include discussion of the need for the program and the general impact the program had on challenging behavior).

Figure 1 (chart): Aggressive behavior incidents reported before and after the program began.

Figure 2
Using the data in Figure 2, calculate the mean (average) score on the observational checklist before the program was implemented and after. Use these data to discuss the impact of the program on behavior correction procedures in the classroom.
To calculate the mean:
- Add all the numbers.
- Divide by how many numbers there are.
- Or use Excel's AVERAGE function (fx > AVERAGE).

Calculate the Mean
Scores on the observational checklist were compiled to measure correct implementation of behavior correction procedures before and after program implementation. The scores are as follows. What is the mean score for each condition?
Scores on Observational Checklist (out of 100), before and after program implementation:
- Teacher A: 65 (before), 95 (after)
- Teacher B: 85 (before), 100 (after)
- Teacher C: 40 (before), 75 (after)
- Teacher D: 60
- Teacher E: 50 (before), 80 (after)
- Teacher F:
- Teacher G:
- Teacher H: 90
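As a rough illustration of the mean calculation described on the Figure 2 slide, the short Python sketch below averages a set of checklist scores. The score lists are hypothetical placeholders rather than the full dataset from the table above, so the printed means are for illustration only.

```python
# Minimal sketch: computing the mean observational-checklist score
# for each condition. The score lists are illustrative placeholders,
# not the complete dataset from the table above.

def mean(scores):
    """Add all the numbers, then divide by how many numbers there are."""
    return sum(scores) / len(scores)

before_scores = [65, 85, 40, 60, 50]    # hypothetical pre-program scores
after_scores = [95, 100, 75, 80, 90]    # hypothetical post-program scores

print(f"Mean before program: {mean(before_scores):.1f}")
print(f"Mean after program:  {mean(after_scores):.1f}")
```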

Figure 3
Using the data in Figure 3, calculate the percentage of students who rated their school experience at the highest level (5) and at the lowest level (1). Discuss these results as they relate to student satisfaction after the SWPBS program was implemented.
How to calculate a percentage: (X / Y) * 100
For example: (# of students who rated 1 / total # of students) * 100

Calculate the %
Rating scale score: 1 (Worst) to 5 (Best)
Number of students at each rating:
- Before program start: 70, 85, 40, 30
- 3 months after program start: 25, 45, 95, 15
- 6 months after program start: 10, 35, 115, 50
Total # of students surveyed = 225
What % of students rated their school experience at the highest versus the lowest level during each condition (before program, 3 months after program start, 6 months after program start)?
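As a rough sketch of the percentage formula from the Figure 3 slide, the Python example below computes the share of students at the lowest (1) and highest (5) ratings from a set of counts. The counts are hypothetical placeholders, not the survey data above.

```python
# Minimal sketch of the percentage formula: X / Y * 100.
# The counts below are illustrative placeholders, not the survey data.

def percentage(part, whole):
    """Percentage = part divided by whole, times 100."""
    return part / whole * 100

# Hypothetical counts of students giving each rating (1 = worst, 5 = best)
counts = {1: 20, 2: 30, 3: 60, 4: 70, 5: 45}
total_students = sum(counts.values())

lowest = percentage(counts[1], total_students)
highest = percentage(counts[5], total_students)
print(f"Rated 1 (lowest):  {lowest:.1f}% of {total_students} students")
print(f"Rated 5 (highest): {highest:.1f}% of {total_students} students")
```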

Elements of a Program

Program Design
Program design refers to identifying and defining the elements that go into the delivery of a service.

Inputs
Inputs to a program include five elements representing an agency's resources and raw materials:
- Clients and consumers
- Staff
- Physical resources: materials, facilities, and equipment

Throughputs
Procedures that will be implemented to carry out the program. The more traditional systems theory language uses the term throughputs in place of activities.
Service definition
- A brief definition of the services to be provided
- Narrows down the service
Service tasks
- Help to define the activities that go into the provision of the service
- Bring a clearer definition of who does what with clients, for what purpose, and under what conditions

Throughputs Cont'd
Methods of intervention
- Explain the way that services will be delivered
- Include words such as teaching, facilitating, and enabling
- It is important to identify best practices

Outputs
The direct products of program activities, for example:
- Number of sessions provided
- Number of seminars taught
The purpose of measuring output is to determine how much of an available service a client actually received and whether the client completed treatment or received the full complement of services as specified in the program design.
Units of service:
- Contact units: one contact between worker and client
- Material units: a tangible resource provided to the client
- Time units: can refer to direct client contact or paperwork completion
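As a loose illustration of how output measurement could be organized, the sketch below records hypothetical units of service (contact, material, and time units) and service completion for two clients, then aggregates them. The record layout, field names, and values are assumptions for illustration, not a structure from the textbook.

```python
# Minimal sketch, assuming a simple per-client record of service units.
# Field names and values are illustrative, not from the textbook.

from dataclasses import dataclass

@dataclass
class ServiceRecord:
    client_id: str
    contact_units: int       # number of worker-client contacts
    material_units: int      # tangible resources provided (e.g., bus passes)
    time_units_hours: float  # hours of direct contact or paperwork
    completed_service: bool  # received the full complement of services?

records = [
    ServiceRecord("client-001", contact_units=12, material_units=3,
                  time_units_hours=9.5, completed_service=True),
    ServiceRecord("client-002", contact_units=4, material_units=1,
                  time_units_hours=3.0, completed_service=False),
]

# Aggregate outputs: how much service was delivered, and how many
# clients completed it (versus dropping out).
total_contacts = sum(r.contact_units for r in records)
completers = sum(r.completed_service for r in records)
print(f"Total contact units delivered: {total_contacts}")
print(f"Service completers: {completers} of {len(records)} clients")
```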

Outputs Cont'd
Service completion
- The final output
- When will the client be finished with the service?
- Defined at the time the program is designed
- Can be difficult to define, depending on the service
Defining output prior to implementation of a program also enables evaluators to distinguish between someone who completes the program and someone who drops out.

More on Outputs
Quality
- Frequently addressed with the use of standards
- Standards must be well defined
- Quality is melded with units of service and tracked

Outcomes
Do clients improve due to services? How do you define or measure improvement?
Standardized measures
- Have been validated
Level of functioning scales
- Specific to the program or service
- Require that practitioners rate their clients on several aspects of functioning
Client satisfaction
- Client satisfaction scores provide a one-time statement of a client's perception of the usefulness of the services provided

Outcomes Cont'd
Intermediate outcomes
- Changes in the quality of life of the client, measured at the completion of service
Final outcomes
- Changes in the client's life, as measured at follow-up

Impact
Measurable change occurring in organizations, communities, or systems as a result of services. It is always possible to aggregate data, but it is not possible to disaggregate it.
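A tiny sketch of the aggregation point above, using hypothetical outcome scores: individual client results can always be rolled up into one program-level figure, but that figure cannot be broken back down into the original individual scores.

```python
# Minimal sketch of aggregation vs. disaggregation.
# The scores below are illustrative placeholders.

client_outcome_scores = {"client-001": 72, "client-002": 85, "client-003": 64}

# Aggregating is straightforward: one program-level number.
program_average = sum(client_outcome_scores.values()) / len(client_outcome_scores)
print(f"Program-level average outcome: {program_average:.1f}")

# Disaggregating is not: knowing only the average (73.7) and the number
# of clients (3), there is no way to recover the original three scores.
```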

Program Design Evaluation
Look at the example in the textbook, p. 154.
- What are the strengths and weaknesses of each?
- With consideration of all the different aspects of programs we have just discussed, which program would you choose? Why?