Evidence-Based Education (EBE)
Grover J. (Russ) Whitehurst
Assistant Secretary, Educational Research and Improvement
United States Department of Education

Presentation transcript:


Three Stories
The university president – Evidence isn't relevant
The vendors – What constitutes evidence isn't clear
Teaching – Evidence isn't available

What is EBE?
The integration of professional wisdom with the best available empirical evidence in making decisions about how to deliver instruction.

What is professional wisdom?
The judgment that individuals acquire through experience
Consensus views
Increased professional wisdom is reflected in numerous ways, including the effective identification and incorporation of local circumstances into instruction.

What is empirical evidence?
Scientifically-based research from fields such as psychology, sociology, economics, and neuroscience, and especially from research in educational settings
Empirical data on performance used to compare, evaluate, and monitor progress

Evidence-based Education

Why are both needed?
Without professional wisdom, education cannot:
– adapt to local circumstances
– operate intelligently in the many areas in which research evidence is absent or incomplete
Without empirical evidence, education cannot:
– resolve competing approaches
– generate cumulative knowledge
– avoid fad, fancy, and personal bias

Medicine and Ag as Models
A little history
Evidence-based medicine
Examples:
– The Illinois Library
– The FTC and diet pills
– The Hormone Replacement Therapy Study

The HRT Study
Sample: 27,000+ women, aged
Research design: women randomly assigned to receive either hormone therapy or a placebo; data collected for 8-12 years
Hypothesis: HRT will reduce heart disease and fractures without increasing breast cancer

The HRT Study

Social Policy and ED examples
Nurse-home visitation
DARE
High-quality preschool
National Reading Panel report

Policy Requirements
What difference in the mix of professional judgment, scientific research, and objective measures justifies imposing a requirement, as contrasted with merely identifying a practice as good?
Example: reading research vs. math research

Scientifically Based Research
“…means research that involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge relevant to education activities and programs” (No Child Left Behind Act of 2001)

Scientifically Based Research
Quality – To what degree do the design, analysis, and logical inference support the claims and conclusions?
Relevance – To what degree are the variables and circumstances similar across the research and the settings in which the research is to be applied?

Quality: Levels of evidence
All evidence is NOT created equal:
1. Randomized trial (true experiment)
2. Comparison groups (quasi-experiment)
3. Pre-post comparison
4. Correlational studies
5. Case studies
6. Anecdotes

Randomized Trials: The gold standard
A claim about the effects of an educational intervention on outcomes
Two or more conditions that differ in levels of exposure to the educational intervention
Random assignment to conditions
Tests for differences in outcomes
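The elements above can be sketched as a minimal simulated randomized trial. All numbers here are hypothetical (a made-up population, a made-up +0.5 treatment effect); the point is only the structure: two conditions, random assignment, then a test for a difference in outcomes.

```python
import random
import statistics

random.seed(0)

# Simulated students; "ability" is a hidden pre-existing trait.
students = [{"ability": random.gauss(0, 1)} for _ in range(1000)]

# Random assignment to two conditions that differ in exposure
# to the (hypothetical) intervention.
random.shuffle(students)
treatment, control = students[:500], students[500:]

# Outcomes reflect ability plus noise; the treatment group also
# receives a hypothetical effect of +0.5.
def outcome(student, effect):
    return student["ability"] + effect + random.gauss(0, 1)

t_scores = [outcome(s, 0.5) for s in treatment]
c_scores = [outcome(s, 0.0) for s in control]

# Test for differences in outcomes: difference in group means.
diff = statistics.mean(t_scores) - statistics.mean(c_scores)
print(f"estimated effect: {diff:.2f}")
```

Because assignment was random, the difference in means is an unbiased estimate of the effect; a real analysis would add a significance test rather than just the point estimate.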

Why is randomization critical?
It assures that the participants being compared have, on average, the same characteristics across the conditions.
The rules of chance mean that the smart, motivated, experienced, etc. have the same probability of being in condition 1 as in condition 2.
Without randomization, differences between two conditions may result from pre-existing differences in the participants and subtle selection biases.

Why is randomization critical?
[Chart: average science scores by students' reports on use of the Internet at home]
Without randomization, simple associations, such as that between internet use and science grades, have many different interpretations.
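A toy simulation illustrates how such an association can arise with no causal effect at all. The numbers and the "family resources" confounder are hypothetical: internet access adds nothing to scores here, yet internet users score higher because a third variable drives both.

```python
import random
import statistics

random.seed(1)

# Hypothetical confounder: family resources influence BOTH
# home internet access and science scores.
population = []
for _ in range(2000):
    resources = random.gauss(0, 1)
    has_internet = resources + random.gauss(0, 1) > 0
    # Internet access itself contributes nothing to the score.
    score = 50 + 5 * resources + random.gauss(0, 3)
    population.append((has_internet, score))

with_net = [s for net, s in population if net]
without_net = [s for net, s in population if not net]

gap = statistics.mean(with_net) - statistics.mean(without_net)
print(f"observed score gap: {gap:.1f}")  # positive, despite zero causal effect
```

Randomly assigning internet access in this model would make the gap vanish, which is exactly the ambiguity that random assignment removes in real studies.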

Relevance
Does the study involve an intervention and outcome similar to those of interest?
Were the participants and settings representative of those of interest?

Evidence will not make the decision
Be skeptical
Consider other ways of achieving the goal
Consider consequences and local circumstances
Consult with experts who understand evidence before making costly decisions (this is different from consulting authorities who may know the subject area but not the rules of evidence)

EBE -- Where are we?

What ED will do
The What Works Clearinghouse (w-w-c.org):
– interventions linked to evidentiary support
– systematic reviews
– standards for providers of evaluations, and a list of evaluators who have agreed to follow those standards

What ED will do
The National Center for Education Evaluation:
– well-designed, timely, and nonpartisan evaluations of ED's own programs (funding streams; specific interventions)
– funding for development and evaluation of interventions in the field
– feedback into discretionary grant programs

What ED will do
Internal review of ED's own products
Build capacity in the field:
– professional training
– workshops for major decision makers
Systematic and long-term research programs to fill gaps

Goals
ED will provide the tools, information, research, and training to support the development of evidence-based education.
The practice of evidence-based education will become routine.
Education across the nation will be continuously improved.
Wide variation in performance across schools and classrooms will be eliminated.