
Strengthening Institutions Program
Webinar on Competitive Priority on Evidence
April 11, 2012

Preliminary Results – Not for Citation
Note: These slides are intended as guidance only. Please refer to the official documents published in the Federal Register.

Agenda
- Overview of the Competitive Priority
- Defining Evidence: "Strong" vs. "Moderate"
  - What is Evidence of Effectiveness?
  - Criteria for Strong Evidence
  - Criteria for Moderate Evidence
- Process for Reviewing Evidence
- Questions & Answers
  - Live: submission via the webinar chat function
  - Post-webinar: by e-mail (see the final slide)

Overview of the Competitive Priority
- Supports programs, practices, and strategies for which there is strong or moderate evidence of effectiveness, awarding up to 5 additional points (p. 13).
- Applicants addressing the priority may include up to 5 additional pages in their application narrative, under a separate heading (p. 27).
- Demonstration of supporting evidence for proposed activities should go in Appendix D (pp. ): all study citations, Web links, and copies. Evidence MUST BE PUBLICLY AVAILABLE.

Defining Evidence of Effectiveness: Strong and Moderate

Evidence of Effectiveness: What Is It? (1)
- Previous studies that isolate the "impact" of the program, practice, or strategy; i.e., studies that demonstrate the program caused the improvement ("internal validity").
  - Not all studies address effects (e.g., use of data to identify a problem, case studies on how to implement a strategy).
  - Studies vary in how rigorously they address internal validity; see the definitions section of the notice for (a) different study designs and (b) "well implemented."
  - At a minimum, studies of effectiveness need a well-defined outcome measure and both a treatment (participant) group and a control/comparison (non-participant) group (see the illustrative sketch below).
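The following is an illustrative sketch only, not part of the official guidance: with a well-defined outcome measure and a treatment and comparison group, the most basic impact estimate is the difference in mean outcomes between the two groups. The outcome, group sizes, and numbers below are hypothetical (simulated).

```python
# Minimal sketch (hypothetical, simulated data): estimating an "impact" as the
# difference in a well-defined outcome between treatment and comparison groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical outcome, e.g., credit completion rate for participants vs. non-participants.
treatment = rng.normal(loc=0.68, scale=0.12, size=200)   # program participants
comparison = rng.normal(loc=0.62, scale=0.12, size=200)  # non-participants

impact = treatment.mean() - comparison.mean()
t_stat, p_value = stats.ttest_ind(treatment, comparison)

print(f"Estimated impact: {impact:.3f} (p = {p_value:.3f})")
# A difference in means only supports a causal claim if the study design
# (e.g., random assignment) rules out competing explanations for the difference.
```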

Study Designs Ordered by Internal Validity (highest to lowest)
- Experimental studies / randomized controlled trials (RCTs)
- Quasi-experimental studies
  - Matched comparison group
  - Regression discontinuity design (sketched below)
- Interrupted time series
- Correlational analysis
- Descriptive studies
- Case studies
- Anecdotes and testimonials
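A purely illustrative sketch (simulated data; the cutoff, coefficients, and effect size are invented for the example, not taken from the notice): one of the quasi-experimental designs listed above, regression discontinuity, assigns treatment by a cutoff on a "running variable" and estimates the impact as the jump in outcomes at that cutoff.

```python
# Sketch (simulated, hypothetical data): a regression discontinuity design.
# Treatment is assigned by a cutoff on a running variable (e.g., a placement
# test score); the impact is the jump in outcomes at the cutoff.
import numpy as np

rng = np.random.default_rng(3)
n = 4000

score = rng.uniform(0, 100, size=n)       # running variable
treated = (score < 50).astype(float)      # placed into the program below the cutoff
true_effect = 4.0
outcome = 20 + 0.3 * score + true_effect * treated + rng.normal(scale=3, size=n)

# Local linear fits on each side of the cutoff, within a bandwidth.
bandwidth = 10
left = (score >= 50 - bandwidth) & (score < 50)
right = (score >= 50) & (score <= 50 + bandwidth)

fit_left = np.polyfit(score[left], outcome[left], 1)
fit_right = np.polyfit(score[right], outcome[right], 1)

# Difference between the two fits evaluated at the cutoff.
estimate = np.polyval(fit_left, 50) - np.polyval(fit_right, 50)
print(f"Estimated impact at the cutoff: {estimate:.2f} (true effect = {true_effect})")
```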

Caution: Not All Associations Support Causal Inferences
- (Mis-)interpretive statement: "Evidence supports the proposed grant activity to reduce developmental education class sizes in order to lower the dropout rate."
- Problem: there are competing explanations for why dropout increases as class size increases (illustrated in the sketch below).
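An illustration of the competing-explanations problem using simulated, hypothetical data (the confounder, coefficients, and sample size are invented for the example): an unobserved factor drives both class size and dropout, so the two are correlated even though class size has no effect in the data-generating process.

```python
# Sketch (simulated, hypothetical data): a confounder can produce an association
# between class size and dropout even when class size has no causal effect.
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Unobserved confounder, e.g., how under-resourced an institution is.
under_resourced = rng.normal(size=n)

# Class size rises with under-resourcing; dropout also rises with under-resourcing.
class_size = 25 + 5 * under_resourced + rng.normal(scale=2, size=n)
dropout_rate = 0.20 + 0.05 * under_resourced + rng.normal(scale=0.02, size=n)
# Note: class_size does NOT appear in the dropout equation above.

naive_corr = np.corrcoef(class_size, dropout_rate)[0, 1]
print(f"Naive correlation between class size and dropout: {naive_corr:.2f}")
# The correlation is strongly positive, yet reducing class size in this
# simulated world would not change dropout at all.
```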

Evidence of Effectiveness: What Is It? (2)
- Previous studies that pertain to the kinds of participants and settings that are the focus of your grant application ("external validity" or generalizability).
  - Studies will vary in how closely related they are to your population.
  - The number of studies of a program or strategy matters: the more "replications," the more confident we can be in the results.
  - The size of each study sample matters: there is more confidence in studies with a large number of participants than in studies with a small number (see the sketch below).
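A small numeric sketch (the outcome standard deviation and group sizes are assumed, hypothetical values) of why sample size matters: the standard error of a difference-in-means impact estimate shrinks with the square root of the group size, so larger studies yield more precise, and therefore more convincing, results.

```python
# Sketch (hypothetical numbers): precision of an impact estimate vs. sample size.
import numpy as np

outcome_sd = 0.12  # assumed standard deviation of the outcome measure

for n_per_group in (25, 100, 400, 1600):
    # Standard error of a difference in means with equal group sizes.
    se = outcome_sd * np.sqrt(2.0 / n_per_group)
    print(f"n per group = {n_per_group:5d} -> approx. standard error = {se:.3f}")
# Quadrupling the sample size roughly halves the standard error.
```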

Evidence of Effectiveness: Summary of Key Criteria
- "Rigor" of the study design
- Implementation of the study design / extent of "flaws"
- Number of studies related to your proposed program, practice, or strategy
- Number of students/institutions involved in the studies

"Strong Evidence" of Effectiveness
- High internal validity of the evidence
  - Studies designed and implemented in ways that support the conclusion that the program "caused" a change/difference in outcomes
- High external validity of the evidence
  - Studies based on a sufficient representation of the kinds of participants and settings that the findings are meant to support
- Minimum size of the evidence base
  - More than one well-designed and well-implemented experimental (RCT) or quasi-experimental study, OR
  - One large, well-designed and well-implemented multi-site experimental study (RCT)

"Moderate Evidence" of Effectiveness
- Internal/external validity of the evidence
  - High internal validity and moderate external validity, OR
  - High external validity and moderate internal validity
- Minimum size of the evidence base
  - At least one well-designed experimental (RCT) or quasi-experimental study
    - May have small sample sizes or other conditions that limit generalizability, or may fail to demonstrate equivalence between the intervention and comparison groups, but has no other major flaws
  - OR a correlational study with strong statistical controls for selection bias and for discerning the influence of other potential confounds (see the regression-adjustment sketch below)
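A minimal sketch (simulated data; the variable names, coefficients, and effect size are hypothetical) of what "strong statistical controls" can look like in a correlational study: regressing the outcome on a program indicator while adjusting for an observed covariate that drives selection into the program. Real studies need far richer controls; this only shows the mechanics of regression adjustment.

```python
# Sketch (simulated data): regression adjustment for an observed selection factor.
import numpy as np

rng = np.random.default_rng(2)
n = 2000

prior_gpa = rng.normal(loc=2.8, scale=0.5, size=n)            # observed covariate
# Students with higher prior GPA are more likely to enroll in the program.
enrolled = (prior_gpa + rng.normal(scale=0.5, size=n) > 3.0).astype(float)
# True program effect is 0.10; prior GPA also affects the outcome.
outcome = 0.5 + 0.10 * enrolled + 0.4 * prior_gpa + rng.normal(scale=0.3, size=n)

# Unadjusted difference in means (biased upward by selection on prior GPA).
unadjusted = outcome[enrolled == 1].mean() - outcome[enrolled == 0].mean()

# Ordinary least squares with prior GPA included as a control variable.
X = np.column_stack([np.ones(n), enrolled, prior_gpa])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(f"Unadjusted difference: {unadjusted:.3f}")
print(f"Adjusted program coefficient: {beta[1]:.3f}")  # much closer to the true 0.10
```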

Evidence Review Process

Responsibility for the Evidence Reviews
- The Institute of Education Sciences (IES) conducts the reviews and reports findings to the Office of Postsecondary Education.
- Reviews are limited to evidence in Appendix D that is relevant to the proposed activities, as outlined in the proposal abstract.
- IES uses What Works Clearinghouse (WWC) evidence standards and certified WWC reviewers to judge the causal (internal) validity of the evidence.
  - Reviewers have doctorates and are tested by the WWC.
  - Most are faculty, but some are IES evaluation contractors.

WWC Standards: What Do Reviewers Look For?
- Type of design: does the study design allow us to draw causal conclusions?
- Strength of data: does the study focus on relevant outcomes and measure them appropriately?
- Adequacy of statistical procedures: are the data analyzed properly?

Evidence Reviews: Strong Evidence
1. Does the evidence include a sufficient number and quality (rigor/implementation) of studies?
   (1) More than one well-designed and well-implemented experimental (RCT) or quasi-experimental study? OR
   (2) One large, well-designed and well-implemented multi-site experimental study (RCT)?
2. Does the evidence include a reasonable representation of the kinds of participants and settings proposed for SIP grant activities?

Evidence Reviews: Moderate Evidence
1. Does the evidence include a sufficient number and quality (rigor/implementation) of studies?
   (1) At least one well-designed experimental (RCT) or quasi-experimental study, which may have (a) a small sample size, (b) conditions of implementation/analysis that limit generalizability, or (c) a failure to demonstrate equivalence between the participant and comparison groups? OR
   (2) At least one correlational study with strong statistical controls for possible selection bias?
2. Is the evidence based on participants and settings that at least overlap with those proposed for SIP grant activities?

Questions & Answers
Please submit your questions on the evidence eligibility requirements via the webinar chat function now.

Other Important Resources
Note: These slides are intended as guidance only. Please refer to the official Notices in the Federal Register.

SIP Fund Web site:
- Notices of Final Revisions to Priorities, Requirements, and Selection Criteria
- Application Packages for each competition (includes the respective Notice Inviting Applications)
- Frequently Asked Questions

What Works Clearinghouse Web site:
- Reference Resources, Procedures and Standards Handbook
- Quick Review Protocol

All questions about the SIP CPP may be sent to: