Anita M. Baker, Ed.D. and Jamie Bassell, Evaluation Services
Program Evaluation Essentials: Evaluation Support 2.0, Session 3
Bruner Foundation, Rochester, New York

Evaluation Support 2.0
Sponsored by the Bruner Foundation and Evaluation Services. Free evaluation training and technical assistance focused on development of evaluative capacity, including data analysis and reporting.
• Four (4) on-site, hands-on training sessions
• Introduction to and use of free/low-cost tools to facilitate data entry, management, and analysis
• Guided evaluation project required
• Virtual conference with the funder and other participating organizations

E-Surveys: Key Decisions
• Why use an e-survey rather than a hard-copy survey, intercept survey, or other data collection strategy?
• What question types do you need?
• How will they be displayed?
• Do you need an "other" field?
• Should they be "required"?

Multiple Choice (only one answer): Forced-Choice Item (MO, "mark one")
Directions read "Mark one" – unless the expectation is so obvious it goes without saying.

Multiple Choice (multiple answers): Multiple Response Item (MATA, "mark all that apply")
Multiple response items often create analysis challenges. Use sparingly.
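One common analysis challenge with MATA items is that each respondent can contribute several answers, so the responses must be split into one yes/no indicator per answer choice before tabulating. A minimal pandas sketch of that step (the column name, delimiter, and answer choices here are hypothetical, and exports vary by tool):

```python
import pandas as pd

# Hypothetical export: each MATA answer arrives as one delimited string per respondent.
df = pd.DataFrame({"services_used": ["Tutoring, Mentoring", "Tutoring", "Mentoring, Meals"]})

# Expand into one 0/1 indicator column per answer choice so each
# option can be counted separately.
indicators = df["services_used"].str.get_dummies(sep=", ")

# Proportion of respondents selecting each option (columns sum to more than 100%).
print(indicators.mean())
```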

Comment/Essay Box (open-ended)

Matrix of Choices (one answer per row vs. multiple answers per row)

Likert/Rating Scale
Example: rate Quality of ingredients, Flavor, and Texture, each on a five-point scale: Not at all important / Slightly important / Somewhat important / Moderately important / Extremely important.
A true Likert scale has five answer choices, and it is pronounced "Lick-ert," not "Like-ert." (The strategy was named for Rensis Likert, who invented or popularized it.)
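To analyze Likert items numerically, the text labels are typically mapped to their ordered positions (1-5) so items can be averaged or compared. A minimal sketch, assuming the labels above are exported as text (the item name and data are hypothetical):

```python
import pandas as pd

# The five ordered answer choices from the scale above.
scale = ["Not at all important", "Slightly important", "Somewhat important",
         "Moderately important", "Extremely important"]

# Hypothetical exported responses for one item.
df = pd.DataFrame({"Flavor": ["Extremely important", "Somewhat important",
                              "Moderately important"]})

# Map each label to its 1-5 position on the ordered scale.
df["Flavor_score"] = pd.Categorical(df["Flavor"], categories=scale,
                                    ordered=True).codes + 1
print(df["Flavor_score"].mean())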

Single vs. Multiple Textboxes
You must have an analysis plan for using these data.

Numerical Textboxes
You must have an analysis plan for using these data. Consider hard-coded answer choices instead (e.g., "1/month," "at least 6 times per year," etc.).

Analyzing Data Using SurveyMonkey: Key Skills
• Generating and using frequencies
• Generating and customizing graphics
• Using filters
• Comparing data
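The first two skills, frequencies and filtered comparisons, can also be reproduced outside the tool on exported response data. A minimal sketch, assuming a simple export (column names and data are hypothetical):

```python
import pandas as pd

# Hypothetical exported responses.
df = pd.DataFrame({"rating": ["Very Good", "Okay", "Very Good", "Excellent"],
                   "site": ["A", "B", "A", "A"]})

# Frequencies: counts and percents for one item.
print(df["rating"].value_counts())
print(df["rating"].value_counts(normalize=True).mul(100).round(1))

# A "filter": the same frequency restricted to one subgroup.
print(df.loc[df["site"] == "A", "rating"].value_counts())
```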

Analyzing Data Using SurveyMonkey: Exporting and Merging Data
• Export All - All Summary Data (PDF, which can be customized)
• Export All - All Response Data (Excel or SPSS, zipped file; sheet 1 has the data)
• Merging data: connecting data from a SurveyMonkey file with other data using a common key
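Merging on a common key is a standard join. A minimal sketch of the idea (file names, the `participant_id` key, and the roster file are all hypothetical stand-ins for your own data):

```python
import pandas as pd

# Hypothetical files: exported survey responses plus a program roster,
# both carrying the same participant ID as a common key.
survey = pd.read_excel("survey_export.xlsx")   # e.g., an "All Response Data" export
roster = pd.read_csv("program_roster.csv")

# Merge on the shared key; how="left" keeps every survey respondent
# even when the roster has no matching record.
merged = survey.merge(roster, on="participant_id", how="left")
print(merged.head())
```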

Survey Result Example: Default
BEC Session 2, January 2012. How would you rate Session 2 overall?

Answer Options   Response Percent   Response Count
Not So Good      0%                 0
Okay             15%                5
Very Good        71%                24
Excellent        15%                5
Answered question: 34. Skipped question: 1.

Survey Result Example [chart]

Survey Result: Crosstab Example (Disaggregated Data)

% of freshmen who...                                        Peer Study Group
                                                            Yes (n=232)   No (n=247)   Total (N=479)
Reported struggling to maintain grades                      36%           58%          47%
Are planning to enroll for sophomore year at this school    89%           72%          80%

Note: A total of 1,000 freshmen were enrolled; 502 were in peer study groups, 498 were not. About half of the students who were in study groups, and half of those who were not, responded to the survey.

Cross-tab Considerations
Decide whether to use row or column percents to present your data. The calculations answer different questions:
• Do you want to know, for each program group (lunchtime vs. afterschool), how many got high scores (vs. low or medium) on their final demonstration test? OR
• Do you want to know which type of program the highest scorers participated in?
TIP: Pick a strategy and use it consistently.
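Both views come from the same crosstab; only the normalization direction changes. A minimal pandas sketch of the two calculations (the data and column names are hypothetical):

```python
import pandas as pd

# Hypothetical scores by program group.
df = pd.DataFrame({"group": ["Lunch", "Afterschool", "Lunch", "Afterschool"],
                   "score": ["High", "High", "Low", "Medium"]})

# Column percents: within each group, what share got each score level?
print(pd.crosstab(df["score"], df["group"], normalize="columns").mul(100).round(0))

# Row percents: within each score level, which group did students come from?
print(pd.crosstab(df["score"], df["group"], normalize="index").mul(100).round(0))
```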

Column Percents

                 Lunch Group   Afterschool Group   TOTAL
                 (n=300)       (n=300)             (N=600)
Low Scores       50%           33%                 42%
Medium Scores    17%           17%                 17%
High Scores      33%           50%                 42%

Half of those in the afterschool group had high scores, 17% had medium scores, and the rest had low scores. Only about one-third of those in the lunch group had high scores, 17% had medium scores, and 50% had low scores.

Row Percents

                 Lunch Group (n=300)   Afterschool Group (n=300)
Low Scores       60%                   40%
Medium Scores    50%                   50%
High Scores      40%                   60%
TOTAL            50%                   50%

More than half (60%) of those with high scores were in the afterschool group.

Definitive Statements
Percent of Training Participants (N=93) who Think AAV Helped or Will Help Them:

                                                              Some   A Lot   TOTAL
Discuss issues of violence with clients                       45%    55%     100%
Access additional strategies for self-care/stress reduction   47%    51%     98%
Provide positive interventions for clients                    32%    65%     97%
Understand the importance of self-care/stress reduction       38%    58%     96%
Offer clients new ways to:
  De-escalate situations                                      31%    67%     98%
  Manage anger                                                54%    43%     97%
  Do safety planning                                          45%    52%     97%
  Conduct bystander interventions                             39%    58%     97%

Target: 50% or more say "a lot" for each item. AAV met its targets for all areas of interest except anger management. More than 95% of participants thought AAV helped or will help them with each of the focus areas of the training.
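Checking a target like this against exported data is a one-line comparison. A minimal sketch, assuming responses are exported as text labels (the column name and data are hypothetical):

```python
import pandas as pd

# Hypothetical export: one column per training item, answers as text labels.
df = pd.DataFrame({"manage_anger": ["A lot", "Some", "A lot", "Some"]})

# Share answering "A lot," compared against the 50% target.
share_a_lot = (df["manage_anger"] == "A lot").mean()
print(f"{share_a_lot:.0%} said 'A lot'; target met: {share_a_lot >= 0.50}")
```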

Definitive Statements

% of freshmen who...                                        Peer Study Group
                                                            Yes (n=212)   No (n=257)   Total (N=479)
Reported struggling to maintain grades                      36%           58%          47%
Are planning to enroll for sophomore year at this school    89%           73%          80%

Only about one-third of freshmen in peer study groups reported struggling to maintain their grades, compared to more than half (58%) of those not in study groups. Proportionately more study group participants are planning to enroll for sophomore year.

Pre/Post Surveys: A Closer Look
Pre Survey → INTERVENTION → Post Survey
Post Survey – Pre Survey = RESULT

Pre-Post Surveys: Effective Use
1. Unique identification system for matching, preferably user-generated (see the matching sketch after this list)
2. Brief, well-constructed survey (with items connected to the intervention)
3. Careful mix of items: knowledge, attitudes, and behaviors
4. Set targets (pre, post, change, match)
5. Successful pre-administration to all participants or a sample
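The unique ID in item 1 is what makes the "post minus pre" computation possible. A minimal sketch of matching and computing change (file names, the `resp_id` key, and the `knowledge_score` item are hypothetical):

```python
import pandas as pd

# Hypothetical pre and post files sharing a respondent-generated ID.
pre = pd.read_csv("pre_survey.csv")    # columns include: resp_id, knowledge_score
post = pd.read_csv("post_survey.csv")

# Inner merge keeps only matched pairs; unmatched pre-only or post-only
# records are the "data loss" challenge noted on the next slide.
paired = pre.merge(post, on="resp_id", suffixes=("_pre", "_post"), how="inner")

# Post minus pre = result, per respondent.
paired["change"] = paired["knowledge_score_post"] - paired["knowledge_score_pre"]
print(paired["change"].describe())
```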

Pre-Post Surveys: Challenges
1. Respondent unfamiliar with terminology (pre-test)
2. Respondent answers falsely (social desirability)
3. Pre-measures show existing knowledge, or already-desired attitudes or behaviors
4. Substantial data loss (pre without post, post without pre)
5. Pre-post change is small or varied
6. Change is large enough and in the desired direction, but alternative explanations exist

Pre-Post Surveys: Alternatives
1. Post only (compare results to targets)
2. Retrospective survey: two questions for each item
   Post first: ask about behavior after
   Then pre: ask about behavior before

[Map: Location of M.O.B. Participants, Falls Prevention Program, Year 1]