Presentation transcript:

Program Evaluation: Luke’s Perspective

Ideal Framework
Context: What needs to be done? (before)
Input: How should it be done? (before)
Process: Is it being done? (during)
Product: Is it succeeding? (during/after)
Real world: We focus on the Product first, and then work backward. Example – NCLB / AYP

Case Study: New York City
Purpose of the program evaluation:
– Identify why schools are failing (NCLB/AYP background)
Expected program evaluation outcome:
– A list of findings and recommendations to remedy the situation
– Organized by curriculum, assessment, and professional development (arrived at through covisioning)

Mixed Method
Qualitative
– Stakeholders identified as district level, principals, teachers, and aides
– Stratified random interviews
– Complete stakeholder focus groups
– Key document review
– Recorded, coded, emergent themes identified, data analyzed, data summarized
Quantitative
– Teacher/classroom observations (random mts)
– Student performance data (tests)
– Surveys of Enacted Curriculum (SEC) (next)
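To make the qualitative "coded, emergent themes identified" step concrete, here is a minimal Python sketch that tallies theme codes attached to interview segments. The stakeholder groups and theme codes below are invented for illustration, not taken from the NYC evaluation.

```python
from collections import Counter

# Hypothetical coded interview segments: (stakeholder group, theme code).
# In practice these would come from the recorded and coded interviews above.
coded_segments = [
    ("teacher", "curriculum_gaps"),
    ("teacher", "assessment_overload"),
    ("principal", "curriculum_gaps"),
    ("aide", "professional_development"),
    ("teacher", "curriculum_gaps"),
]

# How often each emergent theme appears overall...
theme_counts = Counter(code for _, code in coded_segments)

# ...and how the themes break down by stakeholder group.
by_group = Counter(coded_segments)

print(theme_counts.most_common())
print(by_group)
```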

Review: Two Major Approaches
Qualitative: attempts to figure out why things are the way they are; describes cultures, groups, and events. A microscopic look at phenomena.
Quantitative: attempts to figure out the extent to which something is related to something else, and to generalize to a larger population. A wide-angle look at phenomena.
In program evaluation we often want both: we want the specific, but we also need the big picture.

New York Exemplar
– SEC instrument and results (next slides)
– Sample methodology
– Sample report
– Sample data collection
– Sample data analysis
– Final report

SEC

Use the links on the website for more examples

What is Program Evaluation?
In a program evaluation we are interested in evaluating outcomes at the program level. We are not as interested in individual student performance as we are in how well a program drives student performance. In schools, student outcomes are usually aligned with goals, objectives, and performance standards. There are many paths to evaluating the relative worth of a program, but a useful way to conceptualize these alternative methods is on a continuum from purely qualitative to purely quantitative.

Program Evaluation for You
A critical component of program evaluation is data triangulation toward some purpose. You might choose to get information from…
– Classroom curriculum (akin to action research)
– A schoolwide program (behavior? technology?)
– District initiatives

Brainstorm (this isn’t a commitment, just a start)
Take 5 minutes and make a list…
– What program might you be interested in evaluating? (can be class, school, or district)
– What are the key variables?
– Share in groups of 2-3

Review
Program evaluation is what it sounds like – evaluating something at the program level – and it often calls for mixed methods.
No matter what you choose to evaluate, you will likely want different sources of data.

Example Data Sources
– Performance
– Attendance
– Behavior
– Fiscal
– Professional development participation
– Parent survey data

Conceptual Dimensions of Data (sources of question making)
1. Organization – How does the school function? Where do we put our money?
2. Demographics – Who are we? Who is here?
3. Academics – How do students perform?
4. Conation – What is school climate like?
5. Behavior – How do students misbehave?
6. Time – How does it change over time?

Conceptual Model of Data Dimensions
[Diagram: Organization, Conation, Behavior, Academics, Time, Demographics]

Example of a 4-Dimensional Question
Demographics – Who are the low-SES students?
Conation – Which students like school?
Academics – Who scored above average in math?
Organization – Which students are in AP math?
Question: In terms of SES, how do above-average math students who report liking school distribute in tracked math classes?
Program: School practice of tracking students

Example of a 4-Dimensional Question
In terms of SES, how do above-average math students who report liking school distribute in tracked math classes?
[Table shell: rows Low SES / High SES; columns Basic / Advanced / AP]
All of these students are above-average math achievers who like school.

Example of a 4-Dimensional Question
In terms of SES, how do above-average math students who report liking school distribute in tracked math classes?
[Table shell: rows Low SES / High SES; columns Basic / Advanced / AP]
Ultimately, we will want to track this data over time.
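A rough sketch of how a cross-tabulation like the one above could be produced in Python, assuming a student-level table is available; the column names and values below are hypothetical, not real district data.

```python
import pandas as pd

# Hypothetical student-level records spanning four dimensions:
# demographics (ses), conation (likes_school), academics (math_score),
# and organization (math_track).
students = pd.DataFrame({
    "ses":          ["low", "high", "low", "high", "low", "high"],
    "likes_school": [True, True, True, False, True, True],
    "math_score":   [88, 92, 75, 95, 90, 85],
    "math_track":   ["Advanced", "AP", "Basic", "AP", "Advanced", "Basic"],
})

# Keep only above-average math achievers who report liking school.
above_average = students["math_score"] > students["math_score"].mean()
subset = students[above_average & students["likes_school"]]

# Cross-tabulate SES against math track for that subset.
print(pd.crosstab(subset["ses"], subset["math_track"]))
```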

Example Question
[Data dimensions diagram: Organization, Demographics, Academics, Conation, Behavior, Time]
Based on who we have as students and how they are being taught, what are the differences in the reading scores?
Program: Current teaching practice

Common Example Question
[Data dimensions diagram: Organization, Demographics, Academics, Conation, Behavior, Time]
Do different groups of students perform the same on standardized tests?
Program: District ‘closing the achievement gap’ initiative

Better Example Question
[Data dimensions diagram: Organization, Demographics, Academics, Conation, Behavior, Time]
Over the last five years, have different groups of students performed the same on standardized tests?
Program: District ‘closing the achievement gap’ initiative
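For the "over the last five years" version of the question, a hedged sketch of the analysis, assuming test records already sit in a table with year, group, and score columns; all names and numbers here are illustrative only.

```python
import pandas as pd

# Hypothetical standardized-test records over several years, tagged with the
# demographic group of interest.
scores = pd.DataFrame({
    "year":  [2007, 2007, 2008, 2008, 2009, 2009],
    "group": ["A", "B", "A", "B", "A", "B"],
    "score": [71, 64, 73, 66, 74, 69],
})

# Mean score per group per year, reshaped so each group is a column, plus the
# year-by-year gap: is the gap holding steady, widening, or closing?
trend = scores.groupby(["year", "group"])["score"].mean().unstack("group")
trend["gap"] = trend["A"] - trend["B"]
print(trend)
```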

Review
Program evaluation is what it sounds like – evaluating something at the program level – and it often calls for mixed methods.
– 6 dimensions of data
– 6 common data collection methods

Collecting Data
Six common methods:
– Surveys (questionnaires)
– Interviews (1:1)
– Document review (examine an organization’s literature)
– Observations (watch something happening at an organization)
– Focus groups (group interviews)
– Data mining (using data that has already been collected)
Break into 6 groups and identify advantages and disadvantages of each method. I’ll present my ideas; for every one we agree on, you get a point – to a maximum of 6.

Collecting Data
Method: surveys, questionnaires, checklists
Purpose: when you need to quickly and/or easily get lots of information from people in a non-threatening way
Advantages
– anonymous
– inexpensive to administer
– easy to compare and analyze
– can be administered to many people / can get lots of data
– many sample questionnaires already exist
Disadvantages
– might not get careful feedback
– wording can bias the client’s responses
– impersonal
– in surveys, may need sampling expertise
– doesn’t get the full story – only the part you target

Collecting Data
Method: interviews
Purpose: when you want to fully understand someone’s impressions or experiences, or learn more about their answers to questionnaires
Advantages
– get full range and depth of information
– develops a relationship with the client
– can be flexible with the client
Disadvantages
– can take much time
– can be hard to analyze and compare
– can be costly
– interviewer can bias the client’s responses

Collecting Data
Method: document review
Purpose: when you want an impression of how the program operates without interrupting it; review of curriculum, texts, finances, memos, minutes, etc.
Advantages
– get comprehensive and historical information
– doesn’t interrupt the program or routines in the program
– information already exists
– few biases about the information
Disadvantages
– often takes much time
– info may be incomplete – you probably won’t find everything
– need to be quite clear about what you are looking for
– not a flexible means of getting data; data are restricted to what already exists

Collecting Data
Method: observation
Purpose: to gather accurate information about how a program actually operates
Advantages
– view the operations of a school as they are actually occurring
– real and accurate data!
– can adapt to events as they occur
Disadvantages
– observed behaviors can be difficult to interpret
– categorizing observations can be complex
– your presence can influence the behavior of program participants
– can be expensive / time consuming

Collecting Data
Method: focus groups
Purpose: explore a topic in depth through group discussion, e.g., reactions to an experience or suggestion, understanding common complaints, etc.
Advantages
– quickly and reliably get common impressions
– can be an efficient way to get much range and depth of information in a short time
– can convey key information about programs
Disadvantages
– can be hard to analyze responses
– need a good facilitator for safety and closure
– one or two people can monopolize or steer the discussion
– difficult to schedule lots of people together

Collecting Data
Method: data mining
Purpose: use extant data to form a picture
Advantages
– data does not need collecting
– usually easy to analyze
– tends to be reliable
Disadvantages
– human-rights / permission issues when you use data for another purpose
– gaining access to data is sometimes difficult
– aggregating data / merging data sets is sometimes difficult
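A small sketch of the merging step this slide flags as sometimes difficult, assuming two hypothetical extant data sets share a student ID; all values are made up.

```python
import pandas as pd

# Hypothetical extant data sets, each keyed by a shared student ID.
attendance = pd.DataFrame({
    "student_id": [1, 2, 3],
    "days_absent": [2, 11, 5],
})
reading = pd.DataFrame({
    "student_id": [1, 2, 4],
    "reading_score": [210, 188, 240],
})

# An outer merge keeps students who appear in only one source, so gaps in
# the combined record (the NaN cells) are easy to spot before analysis.
combined = attendance.merge(reading, on="student_id", how="outer")
print(combined)
```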

Review
Program evaluation is what it sounds like – evaluating something at the program level – and it often calls for mixed methods.
– 6 dimensions of data
– 6 common data collection methods
– 3 perspectives when answering questions

3 Perspectives (Referents)
In measurement, the question we often need to ask is: to what will we compare our data? There are three corresponding referents. This organization is not always perfect, but it reminds us to address a referent in questioning and analysis.
– Individual
– Norm
– Criterion

Norm Referenced

Criterion Referenced

Individually Referenced – 5th grade individual student report (ORF)
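For concreteness, a minimal sketch of the three referents applied to a single hypothetical oral-reading-fluency score; every number, including the benchmark, is invented for illustration.

```python
# Three referents for one reading-fluency score.
peer_scores = [92, 105, 110, 118, 120, 131, 140]  # norm group (grade-level peers)
benchmark = 115        # criterion: the cut score students are expected to reach
prior_score = 101      # individual: the same student's score from last term
current_score = 112

# Norm-referenced: where does the score fall relative to peers?
percentile_rank = 100 * sum(s <= current_score for s in peer_scores) / len(peer_scores)

# Criterion-referenced: does the score meet the benchmark?
meets_benchmark = current_score >= benchmark

# Individually referenced: how much has this student grown since last term?
growth = current_score - prior_score

print(f"percentile rank: {percentile_rank:.0f}, "
      f"meets benchmark: {meets_benchmark}, growth: {growth:+d}")
```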

Example Question (referents: Norm, Criterion, Individual)
Compared to last year, what are the differences in student reading scores because of attitudes related to whom students have as teachers?

Check for Understanding (referents: Norm, Criterion, Individual)
Question: How does Luke like the new scheduling system in the middle school?

Check for Understanding
Team up: describe a norm, criterion, and individual reference for the following…

Review
Program evaluation is what it sounds like – evaluating something at the program level – and it often calls for mixed methods.
– 6 dimensions of data
– 6 common data collection methods
– 3 perspectives when answering questions
– The purpose leads to questions, which become refined over time

Program Evaluation Questions
Program evaluation begins with a broad purpose
– What are you going to evaluate, and why?
Program evaluation then moves to more refined, measurable questions, which
– dictate the research design (the method)
– guide the gathering of data
– usually dictate the kind of analyses
– constrain the kinds of conclusions that can be drawn

The Program Evaluation has a purpose
Purpose: What is the state of our curriculum in NYC? How is the curriculum broken? What are the good parts?

Questions are raised
Purpose: What is the state of our curriculum? How is the curriculum broken? What are the good parts?
Questions: What is curriculum? What is our curriculum? What does a good curriculum look like?

Questions are refined
Purpose: What is the state of our curriculum? How is the curriculum broken? What are the good parts?
Questions: What is curriculum? What is our curriculum? What does a good curriculum look like?
Measurable Questions (examples):
– At the elementary level, what is the quality of the taught curriculum, as measured by the SEC administered to teachers?
– At the middle-school level, to what degree is the social studies curriculum aligned with provincial standards?

Local Example
Purpose: Is the Coquitlam Master’s Program (CMP) succeeding?
Questions / Measurable Questions: What questions are relevant? How can we make them measurable?

Local Example
Purpose: Is the Coquitlam Master’s Program (CMP) succeeding?
Questions / Measurable Questions: Alone or in teams, take 10 minutes, then share a little.

Team scenarios (optional)
– Read your scenario
– Identify the purpose and guiding questions for the program evaluation
– Identify measurable questions that cross data dimensions
– Identify where you will find the data, and to what you will compare it (referents)
– What data collection method will you use?
Alone or in teams, take 20 minutes, then share.