WRITING PROPOSALS WITH STRONG METHODOLOGY AND IMPLEMENTATION
Kusum Singh, Virginia Tech
Gavin W. Fulmer, National Science Foundation



Goals
- Encourage you to seek funding from NSF for your research.
- Help you develop rigorous methodology and data collection and analysis plans that will make your proposal competitive.
- Help you consider the level of detail appropriate for implementation projects.

Describing Your Project's Methodology

Expectations for Methods in DRL
- The DRL programs welcome research using a variety of evidence.
- The programs are open to qualitative, quantitative, and mixed methods.
- Methods must be rigorous and appropriate to the proposed research questions or hypotheses.
- Design, methods, and analytic techniques should be coherently and logically linked.
- Research methods should be described in adequate detail.

Details of Methods to Include – 1
- Provide a rationale for your research design.
- Make it clear how the research design and analyses answer the research questions (RQs).
- Include a description of the study population and sampling method, sample size, and expected effect size.
- A power analysis should inform the sample-size decision.
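To make the power-analysis point concrete, here is a minimal sketch of the standard normal-approximation sample-size calculation for a two-group comparison of means. The effect size, alpha, and power values are illustrative assumptions, not recommendations from the presenters; a real proposal would justify its expected effect size from prior studies or pilot data.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per group for a two-sided, two-sample
    comparison of means, via the normal approximation:
        n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2
    where d is the standardized effect size (Cohen's d).
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    return ceil(2 * (z_alpha + z_power) ** 2 / effect_size ** 2)

# Illustrative: detecting a standardized effect of d = 0.4
# at alpha = .05 with 80% power.
print(n_per_group(0.4))  # 99 per group
```

Note how sensitive the answer is to the assumed effect size: halving d roughly quadruples the required sample, which is why reviewers look for a justified effect size rather than a convenient one.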

Details of Methods to Include – 2
- Instruments or protocols to be used.
- Validity, reliability, and triangulation of measures.
  - Reviewers are cautious about the development of new measures.
- Data analysis plans.
  - Statistical models; procedures for analysis of text, video, or observation data.
- All of these need a rationale that connects to your RQs.
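As one concrete example of the reliability evidence reviewers look for, the sketch below computes Cronbach's alpha, a common internal-consistency statistic for multi-item scales. The pilot data are made up for illustration; an actual proposal would report alpha (or a comparable statistic) from pilot testing or prior use of the instrument.

```python
from statistics import variance

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix with rows = respondents
    and columns = items:
        alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))
    """
    k = len(scores[0])
    items = list(zip(*scores))  # one tuple of responses per item
    item_vars = sum(variance(item) for item in items)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical pilot data: 5 respondents x 3 Likert items.
pilot = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2], [4, 4, 4]]
print(round(cronbach_alpha(pilot), 3))  # 0.968
```

Values above roughly 0.7–0.8 are conventionally read as adequate internal consistency, though the threshold depends on the stakes of the measurement.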

Quantitative Research
- Research design (e.g., experimental, quasi-experimental, and non-experimental designs; issues of internal and external validity)
- Measurement (e.g., data to be collected, constructs, measures, validity and reliability of measures)
- Data analysis (e.g., statistical decisions, models, and procedures)

Qualitative Research
- Identify the methodology as a systematic research design (e.g., case study, discourse analysis).
- Describe how and what data will be collected.
- Consider issues of validity and triangulation.
- Include plans for analysis of textual data (coding scheme, themes, etc.).
- Find a good balance between a planned approach to analysis and the flexibility to respond to findings.
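To illustrate what a "plan for analysis of textual data" can specify in advance, here is a toy sketch that applies a coding scheme to interview segments and tallies theme frequencies. The codebook and transcripts are entirely hypothetical, and keyword matching stands in for what would really be done by trained human coders working from an agreed codebook with inter-rater checks.

```python
from collections import Counter

# Hypothetical codebook: keyword -> theme code. A real study would use
# human coders and a negotiated codebook, not string matching.
CODEBOOK = {
    "confused": "STRUGGLE",
    "gave up": "STRUGGLE",
    "aha": "INSIGHT",
    "finally saw": "INSIGHT",
    "worked together": "COLLABORATION",
}

def code_segments(segments):
    """Assign every code whose keyword appears in a segment; tally codes."""
    tally = Counter()
    for seg in segments:
        low = seg.lower()
        for keyword, code in CODEBOOK.items():
            if keyword in low:
                tally[code] += 1
    return tally

interviews = [
    "I was confused at first, but then, aha, it clicked.",
    "We worked together on the model and finally saw the pattern.",
]
print(code_segments(interviews))
```

Even a toy version like this shows reviewers that the coding scheme, the unit of analysis, and the intended summaries have been thought through before data collection begins.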

Find the Expertise You Need
- Content experts are not necessarily methods experts, so partner with research methodologists.
- Sooner is better than later (at the proposal-writing stage).
- This is especially necessary if the design is complex or you use innovative methods.
- Find a colleague, as co-PI or as consultant.

Common Missteps in Methods – 1
- Overly generic language and description:
  - "We will use constant comparative methods."
  - "We will use HLM."
- Lack of a consistent link between the theory, the RQs, the data collected, and the analyses.
  - Reviewers will notice.
- Methods and planned analyses inadequate to answer the RQs.
- Try developing a matrix of RQs, data/measures, and analyses, even if only for your own use during planning.
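The RQ/data/analysis matrix suggested above can be as simple as a table you audit for gaps. The sketch below is one way to do that audit mechanically; every entry is a hypothetical example, not a template for any particular study.

```python
# Hypothetical alignment matrix: each row links one RQ to its data
# source and its planned analysis.
alignment = [
    {"rq": "RQ1: Does the PD change teacher practice?",
     "data": "classroom observation protocol",
     "analysis": "pre/post comparison of observation scores"},
    {"rq": "RQ2: How do teachers adapt the curriculum?",
     "data": "interview transcripts",
     "analysis": "thematic coding"},
    {"rq": "RQ3: Does student achievement improve?",
     "data": "state assessment scores",
     "analysis": ""},  # gap: no analysis planned yet
]

def unmatched(matrix):
    """Flag RQs that are missing a data source or an analysis plan."""
    return [row["rq"] for row in matrix
            if not row["data"].strip() or not row["analysis"].strip()]

for rq in unmatched(alignment):
    print("No complete plan for:", rq)
```

A gap in this matrix is exactly the kind of inconsistency reviewers notice, so closing every row before submission is cheap insurance.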

Common Missteps in Methods – 2
- Too little or too much data without a clear analysis plan.
  - Reviewers will wonder if you understand the task.
- A method that is novel and not well understood in the field.
  - This needs more detail, examples, and citations to justify that it is appropriate.

Summary of Main Points
- Articulate clearly your research questions or research hypotheses.
- Think about the most appropriate and rigorous methods to answer your research questions.
- Give a clear and concise description of the research methods.
- Include your rationale for research design decisions.
- Include a research methods expert on your team.
- Articulate clearly why your research is important and how it would contribute to theory and practice.

Describing an Implementation

Details of Implementation
- Important implementation issues need to be addressed if your project includes:
  - Curriculum development
  - Professional development
  - Interventions

For All Implementation Projects
- Consider the method(s) used to gauge the quality of the implementation.
  - Whether framed as "Fidelity of Implementation" (FOI), intended/enacted curriculum, or another approach.
- Be specific about the STEM content, ages/grades, and settings.
- Be clear about the roles of the team:
  - Who will lead the PD or curriculum work, and who will oversee implementation?
  - Who will collect evaluative data on implementation?
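One common way to gauge implementation quality is an adherence score from a component checklist: what fraction of the intended lesson components were actually enacted in each observed session. The sketch below is a minimal, hypothetical version; the component names and observation records are invented for illustration, and real FOI instruments typically also rate quality and dosage, not just presence.

```python
# Hypothetical fidelity-of-implementation checklist: the components the
# intervention intends every lesson to include.
COMPONENTS = {"launch task", "small-group work",
              "whole-class discussion", "exit ticket"}

def adherence(observed: set) -> float:
    """Fraction of intended components observed in one lesson."""
    return len(observed & COMPONENTS) / len(COMPONENTS)

# Two invented classroom observations.
lessons = [
    {"launch task", "small-group work", "exit ticket"},
    {"launch task", "whole-class discussion"},
]
scores = [adherence(lesson) for lesson in lessons]
print("per-lesson:", scores)
print("mean adherence:", sum(scores) / len(scores))
```

Reporting who collects these observations, how often, and how the scores feed back into the project answers several of the bullets above at once.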

Issues for Curriculum Development
- Specify the STEM content of interest and the age range(s) for which you are developing curriculum.
- Specify the role(s) of the PI team, outside experts, participating teachers, or others.
- Identify the process for development, revision, and field-testing.
- Provide justification for the design process you will use.
- Make sure the measures match the materials/curriculum under development.

Issues for Professional Development
- Be specific about the professional development (PD):
  - STEM content, grades, and school settings
  - Role(s) of the PI team, outside experts, participating teachers, or others
  - Format of the PD (e.g., online, workshops)
  - Duration and location of the PD
  - Evaluation
- Identify the model for PD you will use:
  - Train-the-trainer
  - Master teacher
  - Professional learning community
- Provide justification for the model, the format, and your team's expertise.

Issues for Interventions
- Describe the intervention's development history and its prior use.
- Provide evidence, if any, for the intervention's potential effects.
- Describe in detail:
  - Population and sample
  - Setting, duration, and content
  - The design process, if the intervention will be revised iteratively

Consider Generalizability
- If you are developing a new curriculum/PD model:
  - How will the intervention, curriculum, or professional development developed in your setting apply to new settings that may differ from the study?
- If you are applying an intervention, PD model, or curriculum adopted from another setting:
  - How well does that intervention apply to your setting?
  - Will promising prior results be replicable in this project?

Evaluation Plan
- Evaluation should be useful for improving the research project.
- The design and content of the plan should be appropriate to what would enhance or benefit the project.
- Formative or summative, internal or external evaluation may be appropriate, depending on the project.
  - For example, advisory committees are appropriate for the evaluation of some projects.
- See the session on Project and Program Evaluation later in the conference for more details.

Don't be shy. Any questions?

THANK YOU! Feel free to contact Kusum Singh for follow-up and for tips on finding a good methodologist.