1
ies.ed.gov Connecting Research, Policy and Practice June 4, 2013 Elizabeth Albro, Ph.D., Associate Commissioner, National Center for Education Research Joan McLaughlin, Ph.D., Deputy Commissioner, National Center for Special Education Research IES Grant Writing Workshop
2
ies.ed.gov Purpose of the Workshop This workshop will provide instruction and advice on writing a successful application to the Institute of Education Sciences’ research grant programs, specifically to the: Education Research Grants Program (84.305A) Special Education Research Grants Program (84.324A)
3
ies.ed.gov Agenda Introduction to IES Grant Research Topics Grant Research Goals Four Sections of the Project Narrative – Significance – Research Plan – Personnel – Resources 3
4
ies.ed.gov What is IES? Research arm of the U.S. Department of Education, non-partisan by law. Charged with providing rigorous and relevant evidence on which to ground education practice and policy and share this information broadly. By identifying what works, what doesn't, and why, we aim to improve educational outcomes for all students, particularly those at risk of failure.
5
ies.ed.gov IES Organizational Structure Office of the Director National Board for Education Sciences National Center for Education Evaluation National Center for Education Statistics National Center for Education Research National Center for Special Education Research Standards & Review Office 5
6
ies.ed.gov Missions of the Research Centers NCER – Supports rigorous research that addresses the nation’s most pressing education needs, from early childhood to adult education. NCSER – Sponsors a rigorous and comprehensive program of special education research designed to expand the knowledge and understanding of infants, toddlers, and students with or at risk for disabilities from birth through high school. 6
7
ies.ed.gov Overall Research Objectives Develop or identify education interventions (i.e., practices, programs, policies, and approaches) that enhance academic achievement and that can be widely deployed Identify what does not work and thereby encourage innovation and further research Understand the processes that underlie the effectiveness of education interventions and the variation in their effectiveness 7
8
ies.ed.gov Primary Research Grant Programs Education Research Grants (84.305A) Special Education Research Grants (84.324A)* These grant programs are organized by research topic and research goal. *84.324A is not being competed for FY 2014 8
9
ies.ed.gov Special Education Research NCSER will not hold research or research training competitions for FY 2014 If funds for research are available in FY 2014, NCSER will use these funds to make additional awards from the FY 2013 grant slates NCSER anticipates being able to hold a grant competition for FY 2015 9
10
ies.ed.gov Opportunities for the Study of Individuals with Disabilities in NCER Partnerships & Collaborations Focused on Problems of Practice or Policy grants program (84.305H) Postsecondary & Adult Education topic of the Education Research Grants competition (84.305A) 10
11
ies.ed.gov NCER Ultimate Outcomes of Interest: Student Outcomes
– Prekindergarten: School readiness (e.g., pre-reading, language, vocabulary, early math and science knowledge, social and behavioral competencies)
– Kindergarten – Grade 12: Learning, achievement, and higher-order thinking in reading, writing, mathematics, and science; progress through the education system (e.g., course and grade completion or retention, high school graduation, and dropout); social skills, attitudes, and behaviors that support learning in school
12
ies.ed.gov NCER Ultimate Outcomes of Interest: Student Outcomes
– Postsecondary (grades 13 – 16): Access to, persistence in, progress through, and completion of postsecondary education; for students in developmental programs, additional outcomes include achievement in reading, writing, English language proficiency, and mathematics
– Adult Education (Adult Basic Education, Adult Secondary Education, Adult ESL, and GED preparation): Student achievement in reading, writing, and mathematics; access to, persistence in, progress through, and completion of adult education programs
13
ies.ed.gov Agenda Introduction to IES Grant Research Topics Grant Research Goals Four Sections of the Project Narrative – Significance – Research Plan – Personnel – Resources 13
14
ies.ed.gov Grant Topics All applications to the primary research grant programs must be directed to a specific topic – Note on SF 424 Form, Item 4b (Agency Identifier Number) – Note at top of Abstract and Project Narrative
15
ies.ed.gov Education Research Topics (84.305A) Cognition & Student Learning Early Learning Programs & Policies Education Technology Effective Teachers & Effective Teaching English Learners Improving Education Systems: Policies, Organization, Management, & Leadership Mathematics & Science Education Postsecondary & Adult Education Reading & Writing Social & Behavioral Context for Academic Learning 15
16
ies.ed.gov Issues about Topics All require student outcomes Grade range may vary by topic Topics can overlap
17
ies.ed.gov Choosing among Overlapping Topics What literature are you citing? To which topic is your area of expertise best aligned? If your focus is on a specific population of students/teachers, go to that program/topic: – Is your focus on a specific type of student/teacher (e.g., English Learners), or are you studying them as a subgroup of your sample?
18
ies.ed.gov Issues about Topics Pre-service programs – Only exploratory research can be done on teacher pre-service programs – no development of pre-service programs, evaluation of them, or measure development for them – Can develop or evaluate pre-service components with in-service teachers – Support for leadership pre-service programs if the programs last 24 months or less
19
ies.ed.gov Grants Primarily Focused on Professional Development for K-12 Teachers Many topics that study K-12 education now require PD grants to be submitted under the Effective Teachers & Effective Teaching topic – Early Learning Programs & Policies – Effective Teachers & Effective Teaching Cognition & Student Learning Education Technology English Learners Improving Education Systems: Policies, Organization, Management, and Leadership Mathematics & Science Education Reading & Writing – Postsecondary & Adult Education – Social & Behavioral Context for Academic Learning 19
20
ies.ed.gov Agenda Introduction to IES Grant Research Topics Grant Research Goals Four Sections of the Project Narrative – Significance – Research Plan – Personnel – Resources 20
21
ies.ed.gov Grant Research Goals All applications to 84.305A must be directed to a specific goal – Note on SF 424 Form, Item 4b – Note at top of Abstract and Research Narrative The goal describes the type of research to be done Every application is directed to a specific topic/goal combination
22
ies.ed.gov What Topic X Goal Fits Your Project? 22
23
ies.ed.gov FY 2014 Research Goals Exploration Development & Innovation Efficacy & Replication Effectiveness Measurement 23
24
ies.ed.gov Purpose of Exploration Projects To identify malleable factors associated with student outcomes AND/OR To identify factors and conditions that may mediate or moderate relations between malleable factors and student outcomes
25
ies.ed.gov Malleable Factors Malleable factors must be under the control of the education system – Something that can be changed by the system Examples – Student characteristics: behavior, skills – Teacher characteristics: practices, credentials – School characteristics: size, climate, organization – Education interventions: practices, curricula, instructional approaches, programs, and policies
26
ies.ed.gov Possible Methodological Approaches for Exploration Analyze secondary data Collect primary data Complete a meta-analysis 26
27
ies.ed.gov Awards for Exploration Secondary data analysis or meta-analysis: – Maximum of $700,000 total cost (direct + indirect) – Maximum of 2 years Primary data collection and analysis (with or without secondary analysis): – Maximum of $1,600,000 total cost (direct + indirect) – Maximum of 4 years Applications proposing more than a maximum will not be accepted for review
28
ies.ed.gov Purpose of Development & Innovation Projects Develop an innovative intervention (e.g., curriculum, instructional approach, program, or policy) OR improve existing education interventions AND collect data on feasibility and usability in actual education settings AND collect pilot data on student outcomes. Development process must be iterative! 28
29
ies.ed.gov Range of Options for Pilot Study Efficacy study (randomized controlled trial) Underpowered efficacy study (randomized controlled trial with small sample size that provides unbiased effect size estimates) Single-case study that meets the design standards of WWC Quasi-experimental study based on the use of comparison groups with adjustments to address potential differences between groups (i.e., use of pretests, control variables, matching procedures) 29
30
ies.ed.gov Awards for Development Maximum of $1,500,000 total cost (direct + indirect) Maximum of 4 years In budget narrative, note budgeted cost of pilot study to ensure it does not exceed 35% of total funds Applications proposing more than a maximum will not be accepted for review 30
31
ies.ed.gov Efficacy & Replication 3 types of Efficacy and Replication projects Design must meet What Works Clearinghouse evidence standards (with or without reservations)! Randomized controlled trial (RCT) favored Strong quasi-experiment 31
32
ies.ed.gov Purpose #1 Evaluate whether or not a fully developed intervention is efficacious under limited or ideal conditions – Widely-used intervention – Intervention not widely used – Possible to do so through a retrospective analysis of secondary data collected in the past OR 32
33
ies.ed.gov Purpose #2 Replicate an efficacious intervention varying the original conditions – Different populations of students (e.g., English language learners) – Education personnel (e.g., reading specialists versus classroom teachers) – Setting (e.g., urban versus rural) OR 33
34
ies.ed.gov Purpose #3 Gather follow-up data examining the longer term effects of an intervention with demonstrated efficacy – Students – Education personnel carrying out intervention 34
35
ies.ed.gov Key Features of Efficacy & Replication Goal Ask what might be needed to implement intervention under routine practice Consider role of developer to avoid conflict of interest for developer-evaluators Do not require confirmatory mediator analyses but recommend exploratory ones 35
36
ies.ed.gov Awards for Efficacy & Replication Efficacy Maximum of $3,500,000 total cost (direct + indirect) Maximum of 4 years Efficacy Follow-Up Maximum of $1,200,000 total cost (direct + indirect) Maximum of 3 years Applications proposing more than a maximum will not be accepted for review 36
37
ies.ed.gov Purpose of Effectiveness Projects Evaluate whether a fully developed intervention that has evidence of efficacy is effective when implemented under routine conditions through an independent evaluation OR Gather follow-up data examining the longer term impacts of an intervention implemented under routine conditions on students 37
38
ies.ed.gov Effectiveness Goal IES expects researchers to – Implement intervention under routine practice – Include evaluators independent of development/distribution – Describe strong efficacy evidence for intervention (from at least 2 previous studies) Does not expect wide generalizability from a single study – Expects multiple Effectiveness projects to this end – Sample size is not a key distinction from Efficacy Does not require confirmatory mediator analyses but encourages exploratory ones Cost of implementation is limited to 25% of budget 38
39
ies.ed.gov Awards for Effectiveness Effectiveness Maximum of $5,000,000 total cost (direct + indirect) Maximum of 5 years Effectiveness Follow-Up Maximum of $1,500,000 total cost (direct + indirect) Maximum of 3 years Applications proposing more than a maximum will not be accepted for review 39
40
ies.ed.gov Purpose of Measurement Grants Develop new assessments Refine existing assessments (or their delivery) – To increase efficiency – To improve measurement – To improve accessibility – To provide accommodations Validate existing assessments – For specific purposes, contexts, and populations 40
41
ies.ed.gov Focus of Measurement Grants Assessments may also be developed in other goals, but not as the primary focus Primary product of measurement grant is the design, refinement, and/or validation of an assessment 41
42
ies.ed.gov Awards Maximum of $1,600,000 total cost (direct + indirect) Maximum of 4 years Applications proposing more than a maximum will not be accepted for review 42
43
ies.ed.gov Attend to Changes from Previous 84.305A See Page 11 for highlights of changes in the FY 2014 RFA. Carefully read the full RFA. Applicants to all goals must describe plans for dissemination as appropriate to the proposed work. 43
44
ies.ed.gov Expected Products Expected Products for each goal can help you identify the right goal for your project At the end of a funded project, IES expects you to provide…
45
ies.ed.gov Expected Products for Exploration Clear description of malleable factors (and mediators/moderators) and empirical evidence of link between malleable factors (and mediators/moderators) and student outcomes Clear conceptual framework - theory Determination about next steps (do results suggest future Goal 2, Goal 3, or Goal 5 project?)
46
ies.ed.gov Expected Products for Development & Innovation Fully developed version of the intervention – including supporting materials – a theory of change – evidence that intended end users understand and can use the intervention Data that demonstrate end users can feasibly implement the intervention Pilot data regarding promise for generating the intended beneficial student outcomes – including fidelity measures – evidence of implementation fidelity
47
ies.ed.gov Expected Products for Efficacy & Replication Evidence of intervention impact on relevant student outcomes relative to a comparison condition using a research design that meets (with or without reservation) WWC standards Conclusions on and revisions to relevant conceptual framework (i.e., the theory of change) If beneficial impact - identification of organizational supports, tools, and procedures needed for sufficient implementation in future Replication or Effectiveness study If no beneficial impact - determination of whether a future Goal 2 is needed to revise intervention/implementation
48
ies.ed.gov Expected Products for Effectiveness Evidence of intervention impact under routine implementation conditions on relevant student outcomes relative to a comparison condition using a research design that meets (with or without reservation) WWC standards Conclusions on and revisions to relevant conceptual framework (i.e., the theory of change) If beneficial impact - identification of organizational supports, tools, and procedures needed for sufficient implementation under routine conditions If no beneficial impact – examination of why findings differed from previous efficacy studies and a determination of whether a future Goal 2 is needed to revise intervention/implementation
49
ies.ed.gov Expected Products for Measurement Projects to develop/refine and validate an assessment: – Description of the assessment and its intended use – Description of the iterative development processes used to develop/refine the assessment, including field testing procedures and processes for item revision – Conceptual framework for the assessment and its validation activities – Description of the validation activities – Reliability and validity evidence for specific purpose(s), populations, and contexts Projects to validate an existing assessment: – Conceptual framework for the assessment and its validation activities – Description of the validation activities – Reliability and validity evidence for specific purpose(s), populations, and contexts
50
ies.ed.gov Maximum Award Amounts (84.305A) – Maximum duration & award (direct + indirect) by goal:
– Exploration with secondary data: 2 years, $700,000
– Exploration with primary data: 4 years, $1,600,000
– Development & Innovation: 4 years, $1,500,000
– Efficacy & Replication: 4 years, $3,500,000 (follow-up study: 3 years, $1,200,000)
– Effectiveness: 5 years, $5,000,000 (follow-up study: 3 years, $1,500,000)
– Measurement: 4 years, $1,600,000
50
51
ies.ed.gov NCER Grants by Goal (2004-2012)
52
ies.ed.gov Agenda Introduction to IES Grant Research Topics Grant Research Goals Four Sections of the Project Narrative – Significance – Research Plan – Personnel – Resources 52
53
ies.ed.gov The Application’s Project Narrative Key part of your application 4 Sections – Significance – Research Plan – Personnel – Resources Each section is scored and an overall score is given Requirements vary by program & goal 25 pages, single-spaced
54
ies.ed.gov Significance Describes the overall project – Your research question to be answered; intervention to be developed or evaluated, or measure to be developed and/or validated Provides a compelling rationale for the project – Theoretical justification Logic Models, Change Models – Empirical justification – Practical justification
55
ies.ed.gov Significance Do not assume reviewers know significance of your work Do not quote back RFA on general importance of a topic – e.g., RFA paragraph on lack of reading proficiency of 8th and 12th graders based on NAEP data Do quote back RFA if a specific topic is highlighted and your work will address that topic – e.g., disproportionality in discipline (Social/Behavioral); need for developmentally appropriate measures of kindergarten readiness (Early Learning)
56
ies.ed.gov Significance: Exploration Describe the malleable factors, moderators, and mediators to be examined & how you will measure them Justify their importance – Theoretical rationale – Empirical rationale – Practical importance How work will lead to useful next step – Development or modification of interventions to address the identified malleable factors or underlying process to improve student outcomes – Identification of interventions for more rigorous evaluation – Conceptual framework for developing or refining an assessment
57
ies.ed.gov Significance: Development & Innovation Context for proposed intervention – Why needed: what problem exists – What exists now (may be many alternatives already) Detailed description of intervention to be developed – Clearly identify components already developed, partially developed, and to be developed (no jargon) – Don’t overextend (# grades, full vs. part year) Theory of change (theoretical support) Empirical support
58
ies.ed.gov Practical importance: – Meaningful impact, feasibility, affordability Answer the question: Why will this intervention produce better student outcomes than current practice? 58 Significance: Development & Innovation
59
ies.ed.gov Significance: Efficacy & Replication Detailed description of intervention – Show fully developed, implementation process, and ready to be evaluated Justification for evaluating the intervention – Importance of practical problem it is to address – If in wide use, show it has not been rigorously evaluated – If not in wide use, show evidence of feasibility and promise to address the practical problem Theory of change – Theoretical and empirical rationale – Direct impact on student outcomes or through mediators Justify that it could lead to better outcomes than current practice
60
ies.ed.gov Significance: Effectiveness Detailed description of intervention Justification for evaluating the intervention – Evidence of meaningful impacts (from 2 Efficacy studies) Theory of change Justify that it could lead to better outcomes than current practice Implementation under normal conditions Independent evaluation Evidence that implementation can reach high enough fidelity to have meaningful impacts
61
ies.ed.gov Significance: Measurement Specific need and how assessment will be important to the field of education research, practice, and stakeholders Current assessments, why they are not satisfactory, and why the new assessment will be an improvement Conceptual framework and empirical evidence for the proposed assessment, including key components How grant activities will provide convincing evidence of reliability and validity for intended purposes and populations If grant is to further develop/refine an assessment from previous measurement award, describe status of previous award and justify need for further development
62
ies.ed.gov Significance – 2 Key Problem Areas 1. Description of Malleable Factor/Intervention – Unclear what the intervention is: confuses reviewers Many components that may be applied at different times – how do they fit together? – Graphic may help – Unclear how it is to be implemented to ensure fidelity – Intervention not shown to be strong enough to expect an impact Especially true for information interventions – e.g., provide data on students, short teacher workshops – Overly focused on actions not content Ex.: 20 hours of PD held over 10 weeks but no detail on what is to be covered in the sessions
63
ies.ed.gov Significance – 2 Key Problem Areas 2. Theory of change – Why a malleable factor should be related to a student outcome – Why an intervention should improve outcomes versus current practice – Why an assessment/instrument should measure a specific construct – A well laid out theory of change makes clear what is expected to happen and in what order – Easy for reviewers to understand research plan – why measure certain outcomes – Graphic can be helpful
64
ies.ed.gov Theory of Change Example A pre-K intervention – Early Learning Programs and Policies: e.g., a specific practice, a curriculum, expanded access, full day pre-K The Problem – what and for whom (e.g., children who are not ready for kindergarten) – why (e.g., weak pre-reading skills, cannot focus) – where does it fit in the developmental progression (e.g., prerequisites to decoding, concepts of print) – rationale/evidence supporting these specific intervention targets at this particular time
65
ies.ed.gov Theory of Change Example How the intervention addresses the need and why it should work – content: what the student should know or be able to do; why this meets the need – pedagogy: instructional techniques and methods to be used; why appropriate – delivery system: how the intervention will arrange to deliver the instruction Describe what aspects of the above are different from the counterfactual condition Describe the key factors or core ingredients most essential and distinctive to the intervention
66
ies.ed.gov Logic model graphic: Target Population (4-year-old pre-K children) → Intervention (exposed to intervention) → Proximal Outcomes (positive attitudes to school; improved pre-literacy skills; learn appropriate school behavior) → Distal Outcomes (increased school readiness; greater cognitive gains in K)
67
ies.ed.gov Mapping Sample Characteristics onto Theory [Logic model repeated: 4-year-old pre-K children → intervention → proximal outcomes → distal outcomes] Sample descriptors: basic demographics; diagnostic, need/eligibility identification. Potential moderators: setting, context; personal and family characteristics; prior experience
68
ies.ed.gov Mapping Intervention Characteristics onto Theory [Logic model repeated] Independent variable: T vs. C experimental condition. Generic fidelity: T and C exposure to the generic aspects of the intervention (type, amount, quality). Specific fidelity: T and C(?) exposure to distinctive aspects of the intervention (type, amount, quality). Potential moderators: characteristics of personnel; intervention setting, context (e.g., class size)
69
ies.ed.gov Mapping Outcomes onto Theory [Logic model repeated] Focal dependent variables: pretests (pre-intervention); posttests (at end of intervention); follow-ups (lagged after end of intervention). Other dependent variables: construct controls – related DVs not expected to be affected; side effects – unplanned positive or negative outcomes; mediators – DVs on causal pathways from intervention to other DVs
70
ies.ed.gov Logic Model Graphics Don’t Do This! Overwhelm the reader Use color as a key (applications are reviewed in black and white)
71
ies.ed.gov [Example of an overly complex logic-model graphic: the development model for a “WL” (Learning Walk) intervention developed by “ABC” with the assistance of “DEF,” showing roughly 18 numbered processes involving a PI, coaches, a Primary Leadership Team (PLT), and Secondary Leadership Teams (SLTs), dozens of repeated “PLT WL Debrief” boxes, multiple feedback loops, and a color-coded key; an illustration of a graphic that overwhelms the reader.]
72
ies.ed.gov Research Plan Describe the work you intend to do – How you will answer your research question; develop your intervention; evaluate the intervention, or develop and/or validate your assessment Make certain Research Plan is aligned to Significance section – All research questions should have justification in Significance Step-by-step process – A timeline is strongly recommended!
73
ies.ed.gov Logic model graphic: Setting/Population/Sample [Logic model repeated: Target Population (4-year-old pre-K children) → Intervention → Proximal Outcomes → Distal Outcomes]
74
ies.ed.gov Setting, Population, and Sample Identify the places you will be doing research Identify the population you are addressing Identify the sample – Inclusion and exclusion criteria – Sample size (issues of power for analysis) – The importance of attrition and how to address it – External validity: can you generalize to your population or only to a subset of it If using secondary data, discuss these for the datasets you will be using
75
ies.ed.gov Logic model graphic: Measures [Logic model repeated: Target Population (4-year-old pre-K children) → Intervention → Proximal Outcomes → Distal Outcomes]
76
ies.ed.gov Outcome Measures For both proximal and distal outcomes Sensitive (often narrow) measures Measures of broad interest to educators Measures not expected to be linked can be used as additional evidence Describe reliability, validity, and relevance Do not include measures not linked to research questions Multiple comparison issue gaining importance
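When describing a measure's reliability, one common internal-consistency statistic is Cronbach's alpha. The following sketch is purely illustrative (the slides do not prescribe a statistic, and the data below are invented for demonstration):

```python
# Illustrative sketch: Cronbach's alpha as one common way to report
# internal-consistency reliability for a multi-item outcome measure.
# The sample data are hypothetical, chosen only for demonstration.

def sample_variance(xs):
    """Unbiased sample variance (ddof = 1)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    """scores: one row per respondent, each row a list of k item scores."""
    k = len(scores[0])
    item_vars = [sample_variance([row[i] for row in scores]) for i in range(k)]
    total_var = sample_variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four hypothetical respondents answering a three-item scale.
scores = [[1, 2, 3],
          [2, 3, 4],
          [3, 4, 5],
          [4, 5, 6]]
alpha = cronbach_alpha(scores)  # perfectly correlated items -> alpha = 1.0
```

In an application, such a statistic would typically be reported from prior psychometric work on the measure rather than computed ad hoc.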
77
ies.ed.gov Other Measures Measures that feed back into iterative development process Fidelity of Implementation – Operating as intended – Able to address comparison groups Feasibility Measurement Projects: – Alternate forms – horizontal equating – Vertical equating if measure growth – Test fairness – Non-student instruments must be validated against student outcomes
78
ies.ed.gov Measures Derived From Qualitative Research Can be derived from qualitative research (surveys, observations, focus groups, interviews) – Actual items to be used – How items link to constructs – the validity of these measures – Procedures for collection and coding (address inter-rater reliability) – How consent is obtained for an adequate percent of the sample – How qualitatively collected measures are used in analysis of quantitative outcomes (e.g., test scores)
79
ies.ed.gov Research Design Start off with your research questions The research design should answer your questions – Do not have the design section written independently by a methodologist – If sections are written by different people have everyone read through the whole application Issues common to designs across goals – Attrition and missing data – Obtain access and permission to collect/use data
80
ies.ed.gov Research Design (cont.) Experiments allowed under Exploration and Development – small-scale experiments may provide reliable information to meet the purpose/requirements of the goal – not intended to test for efficacy 80
81
ies.ed.gov Research Design Varies by Goal Exploration – Primary data Sampling strategy Data collection and coding processes Small-scale, tightly controlled experiments allowed – Secondary data Descriptive analysis Statistical correlational analysis Analyses attempting to address selection issues: propensity score matching (PSM) Mediation analyses
82
ies.ed.gov Research Design Varies by Goal Development & Innovation – Focus should be on iterative development process – Feasibility study: use in authentic education setting – Type of pilot study depends upon Complexity of intervention Level of intervention implementation 30% limit on grant funds that can be used for pilot study Continuum of Rigor (RCT, under-powered RCT, Single-Case Study, QED with Comparison group)
83
ies.ed.gov Research Design Varies by Goal Efficacy & Replication – RCT favored Unit of randomization and justification Procedures for assignment – Strong quasi-experiment - justify RCT not possible How it reduces or models selection bias Discuss threats to internal validity – conclusions to be drawn
84
ies.ed.gov Research Design Varies by Goal Efficacy & Replication (cont.) – Describe the control/comparison group – Power analysis/MDES – show calculation and assumptions – Fidelity of implementation study in both T and C – Mediator and moderator analyses – Contamination issues: schools vs. classrooms 84
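The "show calculation and assumptions" advice for the power analysis/MDES can be illustrated with the standard closed-form minimum detectable effect size for a two-level cluster-randomized trial. This is a sketch using a normal approximation; the design parameters below (40 schools, 25 students each, ICC = 0.15) are hypothetical, not values from the RFA:

```python
# Illustrative sketch: minimum detectable effect size (MDES) for a
# two-level cluster-randomized trial, using the common closed-form
# approximation with normal critical values. All parameter values
# here are hypothetical assumptions for demonstration.
from math import sqrt

def mdes_cluster_rct(n_clusters, n_per_cluster, icc,
                     p_treat=0.5, z_alpha=1.96, z_power=0.84):
    """MDES (in student-level SD units) for a 2-level cluster RCT.

    n_clusters    : total number of randomized clusters (e.g., schools)
    n_per_cluster : students sampled per cluster
    icc           : intraclass correlation (share of variance between clusters)
    p_treat       : proportion of clusters assigned to treatment
    z_alpha       : two-tailed critical value (1.96 for alpha = .05)
    z_power       : critical value for target power (0.84 for 80% power)
    """
    pq = p_treat * (1 - p_treat)
    var = (icc / (pq * n_clusters)
           + (1 - icc) / (pq * n_clusters * n_per_cluster))
    return (z_alpha + z_power) * sqrt(var)

# Hypothetical design: 40 schools, 25 students each, ICC = 0.15.
mdes = mdes_cluster_rct(n_clusters=40, n_per_cluster=25, icc=0.15)
# roughly 0.38 SD under these assumptions
```

Stating the assumed ICC, attrition, covariate R-squared, and significance/power levels alongside such a calculation is exactly the kind of transparency the slide is asking for; a t-distribution (df = J - 2) rather than normal critical values would be more precise with few clusters.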
85
ies.ed.gov Research Design Varies by Goal
Effectiveness
– Same as Efficacy & Replication, except it requires
Implementation under routine conditions
An independent evaluator
A cost-feasibility analysis
86
ies.ed.gov Research Design Varies by Goal
Measurement
– The plan to develop or refine the assessment
Evidence for the constructs measured
Interpretation of assessment results
Item development and selection
Procedures for administering and scoring
– Reliability and validity studies
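For the reliability piece, one common internal-consistency statistic is Cronbach's alpha. A minimal pure-Python sketch follows (illustrative only; an actual measurement project would use established psychometric software, and the sample scores are made up):

```python
def cronbach_alpha(items):
    """Cronbach's alpha, a standard internal-consistency reliability
    estimate: alpha = k/(k-1) * (1 - sum(item variances)/variance(totals)).
    `items` is a list of k equal-length score lists, one per test item,
    each holding scores across the same examinees.
    """
    k = len(items)
    n_examinees = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n_examinees)]
    return k / (k - 1) * (1 - sum(pvar(it) for it in items) / pvar(totals))

# Three items scored for four examinees
print(round(cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 2, 4]]), 3))  # prints 0.951
```

A measurement application would report such reliability evidence alongside validity evidence for the intended interpretations, per the slide above.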
87
ies.ed.gov Analysis Depends on Design Describe how your analysis answers your research questions Describe analyses of qualitative data
88
ies.ed.gov Analysis (cont.)
Show your model
– Identify the coefficients of interest and their meaning
– Show different models for different analyses
– Include equations
Address clustering
Describe your plan for missing data, including checks for baseline equivalence and attrition bias
Use sensitivity tests of your assumptions
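As a generic illustration of what "show your model" and "address clustering" mean for a cluster-randomized study, a two-level model with treatment assigned at the school level might be written as follows (the symbols and specific form are illustrative, not an IES requirement):

```latex
\begin{aligned}
\text{Level 1 (students): } & Y_{ij} = \beta_{0j} + \beta_{1} X_{ij} + e_{ij}, & e_{ij} \sim N(0,\sigma^2)\\
\text{Level 2 (schools): } & \beta_{0j} = \gamma_{00} + \gamma_{01} T_{j} + u_{j}, & u_{j} \sim N(0,\tau^2)
\end{aligned}
```

Here \(\gamma_{01}\) is the coefficient of interest (the average treatment effect), \(X_{ij}\) is a student-level covariate, \(T_j\) is the treatment indicator, and the intraclass correlation \(\rho = \tau^2/(\tau^2+\sigma^2)\) captures how much clustering inflates standard errors.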
89
ies.ed.gov Personnel Section
Describe key personnel
– Show that every aspect of the project has a person with the expertise to do it
Appropriate methodological expertise
A substantive expert for each issue addressed
Do not propose to hire a yet-to-be-named key person with X expertise
Project management skills
– Show that every aspect of the project gets enough time from an expert
Orient CVs so they are specific to the project
– 4 pages, plus 1 page for other sources of support
90
ies.ed.gov Personnel Strategies for the PI
Senior researcher as PI
– Show adequate time to serve as PI
– Make credentials clear (not all reviewers may know you)
Junior researcher as PI
– Show you have adequate expertise not only to do the work but to manage the project
A continuation of your graduate research
Management skills gained as a graduate student
– Reviewers are more comfortable if you have senior person(s) on the project to turn to for advice
Co-PIs, co-investigators, contractors, an advisory board
Have them on for enough time to be taken seriously
91
ies.ed.gov Resources
Show that the institutions involved have the capacity to support the work
– Do not use university boilerplate
Show that all organizations involved understand and agree to their roles
– What will each institution, including schools, contribute to the project?
– Show strong commitment from schools and districts, and alternatives in case of attrition
If you have received a prior grant award for similar work, describe the success of that work
92
ies.ed.gov Resources (cont.)
Appendix C should back up the Resources section
Detailed letters of agreement from research institutions, states, districts, and schools
– Do the letters show that partners understand their role in the project (e.g., random assignment to condition, time commitments)?
– Do the letters show that you have access to all the data needed for the proposed work?
93
ies.ed.gov Critical Issues for the Project Narrative
Opening Paragraph
Clarity of Writing
94
ies.ed.gov Opening Paragraph Is Critical
The opening paragraph sets the scene for the reviewers
– It identifies the significance of the work and what actually will be done
– Reviewers use it to create a categorization system for organizing the information in the rest of the application
– You can lose your reviewers right away with an unclear opening
95
ies.ed.gov Importance of Clarity of Writing
Reviewers complain about lack of clarity
– Significance section too general
– Lack of detail regarding the intervention, development cycle, or data analysis
– Use of jargon and assumptions about reviewers' knowledge
– Sections that don't support one another
96
ies.ed.gov Budget and Budget Narrative
Provide a clear budget and budget narrative for the overall project and each sub-award
Provide detail on the assumptions used in the budget (e.g., assumptions for travel)
The IES Grants.gov Application Submission Guide describes the budget categories
Check the RFA for budget requirements specific to each research goal
Ensure alignment among the Project Narrative, Budget, and Budget Narrative
97
ies.ed.gov Other IES Funding Opportunities
Research Programs
Statistical and Research Methodology in Education (84.305D)
– Statistical and Research Methodology Grants
– Early Career Statistical and Research Methodology Grants
Partnerships and Collaborations Focused on Problems of Practice or Policy Competition (84.305H)
– Researcher-Practitioner Partnerships in Education Research
– Continuous Improvement in Education Research
– Evaluation of State and Local Education Programs and Policies
Education Research and Development Centers (84.305C)
– Developmental Education Assessment and Instruction
– Knowledge Utilization
98
ies.ed.gov Other IES Funding Opportunities (cont'd)
Research Training Programs
Research Training Programs in the Education Sciences (84.305B)
– Predoctoral Interdisciplinary Research Training
– Methods Training for Education Researchers
– Training in Education Research Use and Practice
99
ies.ed.gov Important Dates and Deadlines
Letter of Intent Due Date / Application Package Posted: June 6, 2013
Application Deadline: September 4, 2013, 4:30:00 PM Washington, DC time
Project Start Dates: July 1, 2014 to September 1, 2014
100
ies.ed.gov Finding Application Packages
FY 2014 application packages will be available on www.grants.gov
102
ies.ed.gov Review Application Requirements
Request for Applications: currently available at http://ies.ed.gov/funding
Grants.gov Application Submission Guide: will be available 6/6/2013 at http://ies.ed.gov/funding
Application Package: will be available on Grants.gov on 6/6/2013
103
ies.ed.gov Peer Review Process
Applications are reviewed for compliance and responsiveness to the RFA
Applications that are compliant and responsive are assigned to a review panel
Two or three panel members conduct a primary review of each application
The most competitive applications are reviewed and discussed by the full panel
105
ies.ed.gov Help Us Help You
Read the Request for Applications carefully
Call or e-mail IES Program Officers early in the process
As time permits, IES program staff can review draft proposals and provide feedback
Don't be afraid to contact us!
106
ies.ed.gov Wrap-up and Final Q&A
Elizabeth Albro, Associate Commissioner, Teaching and Learning, NCER
Elizabeth.Albro@ed.gov, (202) 219-2148
Joan McLaughlin, Deputy Commissioner, NCSER
Joan.McLaughlin@ed.gov, (202) 219-1309