A School-Based Reading Program Evaluation


A School-Based Reading Program Evaluation Michael F. Lewis, Ph.D. Niagara Falls City School District

Changing Education Funding
Current changes in educational funding:
New federal grant opportunities
Private grant opportunities
This funding often requires increased accountability

Accountability
The landscape in the U.S. has shifted to mandated levels of accountability in many areas due to recent events:
Deceptive accounting in business (Enron)
Abuse of school district budgets (Roslyn School District, New York)

Accountability
Recently, when funding is awarded, accountability is achieved through program evaluation:
Program evaluation by independent evaluators is required
Mandated compensation (7%)
Strict accounting of expenditures
The budget is fixed

What is Program Evaluation?
Definition: a formalized approach to studying the goals, processes, and impacts of projects, policies, and programs
Can involve quantitative or qualitative methods of social research (or both)
People who do program evaluation come from many different backgrounds: sociology, psychology, economics, social work

Types of Program Evaluation
A needs assessment examines the nature of the problem that the program is meant to address
The program theory is the formal description of the program's concept and design
Process analysis evaluates how the program is being implemented
Impact evaluation determines the causal effects of the program
Cost-effectiveness analysis assesses the efficiency of a program

Impact Evaluation
Impact evaluation is the most common form of program evaluation
It determines (as well as possible) the effects of a particular program along some criteria:
DARE (on decreasing drug use)
PBIS (on reducing negative behaviors in school)
RTI (on reducing the number of students identified as LD)

Evaluation as Research
As conceptualized by Stephen Truscott, Psy.D. (Georgia State University):
Big 'R' research: university run, for publication, large scale
Little 'r' research: individually run, for information, small scale
Program evaluation can be both or either

The Role of the School Psychologist
We do EVERYTHING!!!
School psychology has evolved: we are now Jack (and mostly Jill) of all trades
We were special education evaluators
We are now increasingly responsible for many activities related to general education, including evaluation of programs

Program Evaluation in Action
Niagara Falls School District evaluation of Fast ForWord (FFWD):
Computer-based auditory processing and literacy skills program
Timed 'protocols' for going through the program: 50-, 90-, and 120-minute protocols
The Problem: FFWD has never appeared in peer-reviewed literature

The Problem (continued)
Fast ForWord is a proprietary program
No real scientific evaluation of the program
How do we know it is really effective?
We do a program evaluation on students who use FFWD…

FFWD Evaluation
The Subject: administration of FFWD to the entire 2nd grade, using the 50-minute protocol (run every day)
The Evaluation: pre-test/post-test design; evaluate every 2nd grader in reading; analyze findings for an increase in reading scores

FFWD Evaluation
To evaluate, you must have a measure
The Measure: GRADE (Group Reading Assessment and Diagnostic Evaluation)
Standardized, norm-referenced

FFWD Evaluation
To evaluate, you must have a measure
The other Measure: DRA (Diagnostic Reading Assessment)
This is a tool we use locally for reading level

FFWD Evaluation
The Procedure:
Every 2nd grader participated
Daily 50-minute FFWD protocol
Ran for 20 weeks
Took the GRADE before and after FFWD
Classroom instruction did not change

FFWD Evaluation
The Procedure (a summary):
1) GRADE pre-test
2) 20 weeks of FFWD (50-minute protocol)
3) GRADE post-test
4) Score and analyze GRADE results
5) Determine effects of FFWD on reading

FFWD Evaluation
Statistics:
Imported GRADE data into SPSS (Statistical Package for the Social Sciences)
Computed paired-samples t-tests on all students with pre/post GRADE data
Same computation for DRA levels
This determined statistical significance, but that is not the whole story… (a sketch of this step follows below)
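For readers who want to reproduce this step outside SPSS, the following is a minimal sketch of the same kind of paired-samples t-test in Python with SciPy; the scores and variable names are hypothetical illustrations, not the district's data.

import numpy as np
from scipy import stats

# Hypothetical pre/post GRADE scores, one entry per student.
grade_pre = np.array([88, 92, 79, 101, 95, 84, 90, 87])
grade_post = np.array([91, 95, 83, 104, 97, 85, 96, 90])

# Paired-samples t-test: is the mean pre-to-post change different from zero?
t_stat, p_value = stats.ttest_rel(grade_post, grade_pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")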

FFWD Evaluation
Statistics: statistical significance vs. effect size
Effect size quantifies the magnitude of a statistically significant difference
Effect size is reported in standard deviation units
An evaluation can show statistical significance but still have a small effect size

FFWD Findings
Paired-samples t-test results (results tables not reproduced in this transcript)

FFWD Findings
Statistical significance vs. effect size
With a large sample, it is highly likely that even a small change will reach statistical significance
≈380 students is a large sample size

FFWD Findings
Effect size takes a more realistic look at the actual size of the significant findings
Hedges' g: one way to calculate effect size
g = t·√(n1 + n2) / √(n1·n2), or g = 2t / √N when the two group sizes are equal
Effect size findings:
Vocabulary: g = .27
Comprehension: g = .35
Total Test: g = .33
These are considered small effect sizes
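As a quick illustration of the conversion formula on this slide, here is a minimal Python sketch; the t statistic and group sizes below are hypothetical, not the evaluation's actual numbers.

import math

def hedges_g_from_t(t, n1, n2):
    """g = t * sqrt(n1 + n2) / sqrt(n1 * n2); equals 2t / sqrt(N) when n1 == n2."""
    return t * math.sqrt(n1 + n2) / math.sqrt(n1 * n2)

t_stat = 3.2       # hypothetical t from a paired-samples test
n1 = n2 = 190      # hypothetical equal group sizes (N = 380)
print(round(hedges_g_from_t(t_stat, n1, n2), 2))   # 0.33
print(round(2 * t_stat / math.sqrt(n1 + n2), 2))   # same result: 0.33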

Errors
Inefficiency (evaluation design): there was no need to test all students; a smaller random sample would have maintained statistical meaning, but this was ignored by district administration
Inefficiency (administration): testing all students reduces control over standardized administration
Evaluation design: no control group; without a control group there is no way of determining whether FFWD caused the increased GRADE scores (see the sketch below)
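To illustrate what the missing control group would have made possible, here is a hedged sketch of an independent-samples t-test on pre-to-post gain scores for a treated group and a comparison group; all values are invented for illustration and do not come from the evaluation.

import numpy as np
from scipy import stats

# Invented gain scores (post minus pre), for illustration only.
ffwd_gains = np.array([3, 4, 2, 5, 1, 4, 3, 6])     # students who used FFWD
control_gains = np.array([2, 1, 3, 2, 0, 2, 1, 3])  # a comparison group the evaluation lacked

# Independent-samples t-test: did the FFWD group gain more than the comparison group?
t_stat, p_value = stats.ttest_ind(ffwd_gains, control_gains)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")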

???Questions???
Program Evaluation Reference: Posavac, E., & Carey, R. (2006). Program Evaluation: Methods and Case Studies (7th ed.). New York, NY: Prentice Hall.
Contact: mlewis@nfschools.net