Goal 2 / Goal 3: In 2016, no Goal 2s were accepted; what about 2017?


Goal 2: Development & Innovation (4 years, $1,500,000)
The development process must be iterative!
- Develop an innovative intervention (e.g., curriculum, instructional approach, program, or policy) OR improve an existing education intervention, AND
- Collect data on its usability, feasibility, and fidelity of implementation in actual education settings, AND
- Collect pilot data on student outcomes.

Goal 3: Efficacy & Replication
- Evaluate whether a fully developed intervention is efficacious under limited or ideal conditions, OR
- Gather follow-up data examining the longer-term effects of an intervention with demonstrated efficacy, OR
- Replicate an efficacious intervention, varying the original conditions, OR
- Conduct a retrospective analysis of secondary data collected in the past.
Awards: Efficacy & Replication study, 4 years, $3,500,000; Follow-up study, 3 years, $1,200,000.

Intervention Research Proposal Components: IES Goal 2 or 3
- Formatted Abstract (1 page)
- Major Sections of the Research Narrative (25 pages)
  - Project Significance
    - Research Aims
    - Intervention
    - Theory of Change
    - Rationale
  - Research Plan
    - Sample and setting
    - Research design
    - Timeline and procedures
    - Measures: proximal and distal outcomes; fidelity of intervention implementation & control group practices; key moderators or mediators
    - Data analytic plan
    - Detailed power analysis
    - Cost analysis
  - Resources (to conduct the research AND disseminate the intervention)
  - Personnel
- Appendices: Response to Reviews; Measures; Intervention Materials; Letters of Support; Data Management Plan

Goal 2 vs. Goal 3: Both Require
- A plausible rationale that the intervention is needed, and reason to believe it has advantages over what is currently proven and available
- A well-specified intervention model:
  - basis in theory and prior research
  - identified target population
  - specification of intended outcomes/effects
  - a "theory of change": explication of what it does and why it should have the intended effects for the intended population
- Clarity about the relevant counterfactual: what it is supposed to be better than

Goal 2 vs. Goal 3: Goal 3 also requires (and Goal 2 can help provide)
- Evidence that the intervention is ready to deliver:
  - a complete manual, with detailed content and instructions for implementation
  - finalized, ready-to-go materials and training procedures
- Evidence that the intervention can be delivered as intended and implemented well enough in practice to plausibly have effects:
  - completed implementation guides, procedures, and measures
- Some evidence that it can produce the intended effects, and evidence of the anticipated effect size

Common Aims for Goal 2
- Develop the intervention curriculum and associated materials.
- Develop standardized training procedures and implementation guidelines.
- Design and test a supervisory or professional development model to support high-fidelity implementation.
- Use iterative field tests to refine the intervention and estimate effects.
- Assess acceptability to different audiences.
- Collect input/feedback to adapt program materials for diverse samples.
- Assess measures for their sensitivity to change.
- Evaluate the feasibility of the sampling, recruitment, screening, and enrollment procedures, and of program delivery.

Common Aims for Goal 3
- Aim 1: Test the efficacy of the intervention using a rigorous randomized controlled design.
- Aim 2: Test hypothesized mechanisms of change; explore change processes. [mediation]
- Aim 3: Examine variations in intervention implementation, engagement, acceptability, or impact, and assess moderators that may account for these variations. [moderation]
(An illustrative analysis sketch for Aims 2 and 3 follows below.)
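To make Aims 2 and 3 concrete, here is a minimal, hypothetical sketch of how mediation and moderation might be probed with simple regressions on simulated trial data. The variable names (treat, mediator, moderator, outcome) and the two-step product-of-coefficients approach are illustrative assumptions, not a prescribed IES analysis; a real proposal would specify multilevel models and a formal inference procedure for indirect effects.

```python
# Minimal sketch: mediation (Aim 2) and moderation (Aim 3) on simulated data.
# Variable names and effect sizes are hypothetical; real analyses would use
# multilevel models and, e.g., bootstrapped confidence intervals for indirect effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400
treat = rng.integers(0, 2, n)                     # 1 = intervention, 0 = control
moderator = rng.normal(0, 1, n)                   # e.g., baseline skill level
mediator = 0.5 * treat + rng.normal(0, 1, n)      # hypothesized mechanism of change
outcome = 0.4 * mediator + 0.2 * treat + 0.3 * treat * moderator + rng.normal(0, 1, n)
df = pd.DataFrame(dict(treat=treat, moderator=moderator, mediator=mediator, outcome=outcome))

# Aim 2 (mediation): path a (treatment -> mediator), path b (mediator -> outcome, controlling for treatment)
path_a = smf.ols("mediator ~ treat", data=df).fit()
path_b = smf.ols("outcome ~ mediator + treat", data=df).fit()
indirect = path_a.params["treat"] * path_b.params["mediator"]   # product-of-coefficients estimate
print(f"Estimated indirect (mediated) effect: {indirect:.3f}")

# Aim 3 (moderation): does the treatment effect vary with the moderator?
moderation = smf.ols("outcome ~ treat * moderator", data=df).fit()
print(moderation.params[["treat", "treat:moderator"]])
```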

Significance Section: Content to Cover
- Describe the nature of the need addressed: what is this intervention, and for whom is it intended?
- Identify the proximal and distal targets of the intervention and why they are targeted.
- Articulate the theory and evidence linking the intervention approach to change in these targeted outcomes.
- Be sure to convey the key factors or components most distinctive in this intervention and how they differ from the usual practice/control condition.
TIP: It is always a good idea to provide a graphic summarizing the logic model of the intervention.

Research Plan: Start with an Overview and Timeline
- Provide a paragraph summarizing the targeted population, the sampling frame, and the general design framework.
- Provide a table showing the timeline.

Research Plan: Sample, Setting, Design
- Sampling must be strategic rather than convenience-based, with evidence of:
  - cooperative schools, teachers, parents, and administrators willing to participate
  - a student sample appropriate in representativeness and size for showing educationally meaningful effects
  - access to students (e.g., for testing), records, and classrooms (e.g., for observations)
- Settings are evaluated in terms of:
  - feasibility, cost-effectiveness, and sustainability of the intervention
  - generalizability of findings from the targeted settings
- Randomization and design descriptions should specify:
  - the level of randomization (see the illustrative sketch below)
  - use of hierarchical designs (accounting for classrooms/schools)
  - protections against the threat of intervention contamination
  - the control group comparison (what is "usual practice")
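As a concrete illustration of the "level of randomization" point, below is a minimal, hypothetical sketch of blocked random assignment at the school level (schools randomized within district blocks, with classrooms and students nested beneath them). The district/school structure and the 50/50 allocation are assumptions for illustration only; a real plan would document the actual blocking variables and procedure.

```python
# Hypothetical sketch: blocked random assignment of schools (the unit of randomization)
# to intervention vs. control, with districts serving as blocks.
# School counts and block sizes are invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2017)

schools = pd.DataFrame({
    "school_id": range(1, 21),
    "district": ["A"] * 6 + ["B"] * 6 + ["C"] * 8,   # districts act as blocks
})

def assign_within_block(block: pd.DataFrame) -> pd.DataFrame:
    # Shuffle the schools in this block and assign half to the intervention.
    ids = block["school_id"].to_numpy().copy()
    rng.shuffle(ids)
    half = len(ids) // 2
    block = block.copy()
    block["condition"] = np.where(block["school_id"].isin(ids[:half]), "intervention", "control")
    return block

assigned = schools.groupby("district", group_keys=False).apply(assign_within_block)
print(assigned.groupby(["district", "condition"]).size())
```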

Intervention Description
- Highlight the key components that make the intervention distinctive.
- Describe the mechanisms of action of the key components.
- Illustrate how the components will be operationalized.
- Use graphics and tables to make this information easily accessible.

Domain of Social-Emotional Competencies
- Prosocial Skills (help, share, cooperate)
  Curriculum materials (manuals): PATHS Friendship lessons & modeling stories
  Teaching strategies (BKC): positive class management: rules, routines, praise
- Emotion Knowledge (label feelings, empathy)
  Curriculum materials (manuals): PATHS Feeling lessons, feeling faces
  Teaching strategies (BKC): emotion coaching: modeling & reflective listening
- Self-control (follow rules, inhibit aggressive impulses)
  Curriculum materials (manuals): PATHS Turtle technique (stop, calm down, say the problem & feelings)
  Teaching strategies (BKC): prompting use of the turtle technique, using induction
- Social Problem-Solving (identify problems, generate solutions)
  Curriculum materials (manuals): PATHS problem-solving sequence & modeling stories
  Teaching strategies (BKC): using problem-solving dialogue; facilitating communication

Domain of Language and Emergent Literacy Skills
- Oral Language (vocabulary)
  Curriculum materials: Dialogic Reading with targeted vocabulary
  Teaching strategies: intentional repeated exposure to target vocabulary
- Narrative (temporal sequence, cause and effect)
  Curriculum materials: Dialogic Reading story review
  Teaching strategies: rich language use, sensitive responding, questions, expansions
- Phonological Sensitivity (sequencing sounds)
  Curriculum materials: Sound Games (listening, sequencing words & syllables)
  Teaching strategies: sequence developmentally; pace for mastery
- Print Awareness (letter knowledge and letter sounds)
  Curriculum materials: Alphabet Center (multi-media exposure to letters and letter sounds)
  Teaching strategies: monitoring knowledge acquisition; pacing to support mastery

Levels of Measurement
- fidelity of intervention implementation & control group practices
- proximal outcomes
- distal outcomes
- key moderators/mediators
- control variables/confounders

Measuring Intervention Fidelity (Dane & Schneider, 1998)
- Adherence/compliance: evidence that the intervention content was delivered as intended
- Quality of delivery: evidence that the key processes (theory-based characteristics) of intervention delivery occurred as intended
- Dose/exposure: degree to which the desired amount of intervention was delivered to and received by participants
- Participant responsiveness: degree to which participants were engaged and responsive; acceptability of the intervention to participants
(A small illustrative scoring sketch follows below.)
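As a hypothetical illustration of how these four dimensions might be summarized from classroom observation and attendance records, here is a small sketch. The column names, scales (proportions for adherence and dose, 1-5 observer ratings for quality and responsiveness), and the simple averaging are assumptions for illustration, not a required scoring scheme.

```python
# Hypothetical sketch: summarizing fidelity dimensions per classroom.
# Column names, rating scales, and values are illustrative assumptions only.
import pandas as pd

obs = pd.DataFrame({
    "classroom": ["C1", "C2", "C3"],
    "lessons_delivered": [28, 22, 30],           # adherence numerator
    "lessons_planned": [30, 30, 30],             # adherence denominator
    "quality_rating": [4.2, 3.1, 4.8],           # observer rating of delivery quality, 1-5
    "sessions_attended_pct": [0.92, 0.75, 0.88], # dose/exposure received by participants
    "engagement_rating": [4.0, 2.9, 4.5],        # participant responsiveness, 1-5
})

obs["adherence"] = obs["lessons_delivered"] / obs["lessons_planned"]

summary = obs[["classroom", "adherence", "quality_rating",
               "sessions_attended_pct", "engagement_rating"]]
print(summary.round(2))
print("Mean adherence across classrooms:", round(obs["adherence"].mean(), 2))
```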

Organize the Measures to Allow for Easy Categorization

Issues in Measure Selection
Reviewers often express concern about:
- Feasibility: time and resources required
- Respondent burden
- Developmental appropriateness
- Sensitivity to change produced by the intervention
- Appropriateness for special populations (disabilities, English language learners)
- Too many measures: how will you reduce them?
- What is the PRIMARY outcome? In other words, how will you know if the intervention worked?

Issues Regarding the Plan of Analyses
Sophisticated and detailed analytic plans are expected:
- General approaches to data cleaning, reduction, scoring, and transformation
- Detailed plans for how each aim will be analyzed
- Pay special attention to hierarchical models
- Provide a very detailed and well-justified power analysis (an illustrative calculation follows below)
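To illustrate what a well-justified power analysis might involve when schools are the unit of randomization, below is a minimal sketch of the commonly used minimum detectable effect size (MDES) approximation for a two-level cluster randomized trial. The intraclass correlation, covariate R-squared values, and sample sizes are placeholder assumptions that a real proposal would justify from prior data for the target population and outcomes.

```python
# Minimal sketch: minimum detectable effect size (MDES) for a two-level
# cluster randomized trial with treatment assigned at the cluster (school) level.
# All parameter values below are placeholder assumptions for illustration.
from scipy import stats

def mdes_two_level(J, n, rho, P=0.5, R2_1=0.0, R2_2=0.0, alpha=0.05, power=0.80, g=0):
    """MDES approximation for a 2-level cluster randomized trial.
    J: number of clusters (schools); n: students per cluster;
    rho: intraclass correlation; P: proportion of clusters treated;
    R2_1 / R2_2: variance explained by covariates at levels 1 / 2;
    g: number of cluster-level covariates."""
    df = J - g - 2
    multiplier = stats.t.ppf(1 - alpha / 2, df) + stats.t.ppf(power, df)
    var_term = (rho * (1 - R2_2)) / (P * (1 - P) * J) \
             + ((1 - rho) * (1 - R2_1)) / (P * (1 - P) * J * n)
    return multiplier * var_term ** 0.5

# Example: 40 schools, 20 students each, ICC = 0.15, pretest covariate at both levels
print(round(mdes_two_level(J=40, n=20, rho=0.15, R2_1=0.5, R2_2=0.5, g=1), 3))
```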

Newer Elements
- Cost assessment (see the illustrative sketch below)
- Intervention fidelity check after Year 1 (and remediation plan)
- Dissemination plan/resources
- Data management plan
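Cost assessments in education research commonly take an ingredients-style approach: cost out the personnel time, materials, and other resources the intervention requires, then express the total per student. The sketch below is a hypothetical illustration with invented categories and prices, not the required IES procedure.

```python
# Hypothetical sketch: ingredients-style cost summary for an intervention.
# Categories, quantities, and prices are invented for illustration only.
ingredients = [
    # (ingredient, quantity, unit cost in dollars)
    ("Teacher training (hours)",        12 * 20, 45.0),   # 12 hours for each of 20 teachers
    ("Coaching visits",                 6 * 20,  150.0),  # 6 visits per teacher
    ("Curriculum kits (per classroom)", 20,      400.0),
    ("Student workbooks",               500,     8.0),
]

total_cost = sum(qty * unit for _, qty, unit in ingredients)
n_students = 500

print(f"Total program cost: ${total_cost:,.0f}")
print(f"Cost per student:   ${total_cost / n_students:,.2f}")
```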

Other Things to Think About
Most review committees include a mix of experts:
- Content-area specialists (some with no intervention experience/knowledge)
- Methodological specialists
- Intervention specialists (who may have a strong theoretical affiliation)
Try to have your proposal read by a mix of colleagues who can represent these different perspectives.

Other thoughts, questions, suggestions?