A cross-agency framework that describes:
• Broad types of research and development
• The expected purposes, justifications, and contributions of various types of research to knowledge generation about interventions and strategies for improving learning

• The American education system needs research to produce stronger evidence at a faster pace
• More constrained federal resources demand that NSF and ED purposefully build on each other's research and development portfolios
• A cross-agency vocabulary and set of research expectations is critical for effective communication

Education research and development:
• Is not strictly linear; three categories of educational research (core knowledge building, design and development, and studies of impact) overlap
• Requires efforts of researchers and practitioners representing a range of disciplines and methodological expertise
• May require more studies for basic exploration and design than for testing the effectiveness of a fully developed intervention or strategy
• Requires assessment of implementation, not just estimation of impacts
• Includes attention to learning in multiple settings (formal and informal)

Six Basic Types of Educational Research and Development

Foundational Research
Fundamental knowledge that may contribute to improved learning and other education outcomes. Studies of this type:
◦ Test, develop, or refine theories of teaching or learning
◦ May develop innovations in methodologies and/or technologies that influence and inform research and development in different contexts

Early-Stage or Exploratory Research
• Examines relationships among important constructs in education and learning
• Goal is to establish logical connections that may form the basis for future interventions or strategies intended to improve education outcomes
• Connections are usually correlational rather than causal

Design and Development Research
• Draws on existing theory and evidence to design and iteratively develop interventions or strategies
◦ Includes testing individual components to provide feedback in the development process
• Could lead to additional work to better understand the foundational theory behind the results
• Could indicate that the intervention or strategy is sufficiently promising to warrant more advanced testing

Studies of Impact
Generate reliable estimates of the ability of a fully developed intervention or strategy to achieve its intended outcomes:
• Efficacy Research tests impact under "ideal" conditions
• Effectiveness Research tests impact under circumstances that would typically prevail in the target context
• Scale-Up Research examines effectiveness in a wide range of populations, contexts, and circumstances

Organization of the Guidelines

Purpose: How does this type of research contribute to the evidence base?
Justification: How should policy and practical significance be demonstrated? What types of theoretical and/or empirical arguments should be made for conducting this study?

Outcomes: Generally speaking, what types of outcomes (theory and empirical evidence) should the project produce?
Research Plan: What are the key features of a research design for this type of study?


External Feedback Plan
• Series of external, critical reviews of project design and activities
• Review activities may entail peer review of the proposed project, external review panels or advisory boards, a third-party evaluator, or peer review of publications
• External review should be sufficiently independent and rigorous to influence and improve quality

Justification: Exploratory/Early-Stage Research
• A clear description of the practical education problem and a compelling case that the proposed research will inform the development, improvement, or evaluation of education programs, policies, or practices
• A strong theoretical and empirical rationale for the project, ideally with citations to evidence

Justification: Design and Development Research
• A clear description of the practical problem and the initial concept for the planned investigation, including a well-explicated logic model
• In the logic model, identification of key components of the approach, a description of the relationships among components, and theoretical and/or empirical support
• Explanation of how the approach differs from current practice and why it has the potential to improve learning

Justification: Efficacy Research
• Clear description of the intervention or strategy and the practical problem it addresses, how the intervention differs from others, and its connection to learning
• Empirical evidence of promise from a Design and Development pilot study, or support for each link in the logic model from Exploratory/Early-Stage research, or evidence of wide use
• Justification for examining impact under ideal circumstances rather than under routine practice conditions

Outcomes: Exploratory/Early-Stage Research
• Empirical evidence regarding associations between malleable factors and education or learning outcomes
• A conceptual framework supporting a theoretical explanation for the malleable factors' link with the education or learning outcomes
• A determination, based on the empirical evidence and conceptual framework, of whether Design and Development research or an Efficacy study is warranted, or whether further Foundational or Exploratory/Early-Stage research is needed

Outcomes: Design and Development Research
• A fully developed version of the intervention or strategy
• A well-specified logic model
• Descriptions of the major design iterations, the resulting evidence, and adjustments to the logic model
• Measures and data demonstrating the project's implementation success
• Pilot data on the intervention's promise for generating the intended outcomes

Outcomes: Efficacy Research
• Detailed descriptions of the study goals, design and implementation, data collection and quality, and analysis and findings
• Implementation documented in sufficient detail to judge the applicability of the study findings; when possible, these factors related descriptively to the impact findings
• Discussion of the implications of the findings for the logic model and, where warranted, suggestions for adjusting the logic model to reflect the study findings

• Guidelines will inform decision-making for agencies (individually and jointly) across different topic areas:
◦ Analyze the developmental status of awards and progress within various portfolios
◦ Identify areas of education research and development needing additional resources or emphasis
◦ Encourage more and better research on the development, implementation, and scaling of new strategies and interventions

• Guidelines describe what high-quality research design looks like:
◦ Give reviewers a tool to assess the quality of the research design (for individual proposals and across a group of proposals)
◦ Support reviewers in their role as "critical friends" who offer actionable feedback to PIs
◦ Help ensure that agencies fund robust research and development efforts

• Guidelines can help PIs conceptualize and communicate how the proposed research and development fits into a broader evidence-building agenda:
◦ Suggest components to include, within a single proposal and a given type of research
◦ Identify important considerations in planning a project, including building the research team
◦ Establish expectations about needed improvements in how we, as a field, develop, conduct, and apply research and scale effective practices

• Guidelines can help practitioners develop a better understanding of what different stages of education research should address and might be expected to produce:
◦ Help practitioners understand what to expect from different types of research findings
◦ Support more informed decisions based on the level of evidence
◦ Provide a shared sense of what is needed as practitioners engage with researchers to improve education practices

The Joint Committee began meeting in January 2011 with representatives from both agencies.
Co-Chairs: Janice Earle, NSF (EHR); Rebecca Maynard, ED (Institute of Education Sciences); Ruth Curran Neild, ED (Institute of Education Sciences)
Ex Officio: Joan Ferrini-Mundy, Assistant Director, NSF (EHR), and John Easton, Director, Institute of Education Sciences
Members:
• ED: Elizabeth Albro, Joy Lesnick, Ruth Curran Neild, Lynn Okagaki, Anne Ricciuti, Tracy Rimdzius, Allen Ruby, Deborah Speece (IES); Karen Cator, Office of Educational Technology; Michael Lach, Office of the Secretary; Jefferson Pestronk, Office of Innovation and Improvement
• NSF: Jinfa Cai, Gavin Fulmer, Edith Gummer (EHR-DRL); Jim Hamos (EHR-DUE); Janet Kolodner (CISE and EHR-DRL); Susan Winter (SBE)

• FAQs for the Common Guidelines (NSF PDF)
• Point of contact for NSF: Janice Earle

• These Guidelines are most relevant for NSF programs and projects that undertake education research and development activities. Some solicitations will explicitly reference the Guidelines.
• They are not intended for outreach or scholarship activities.

• The Guidelines are not intended to supplant the Merit Review criteria; they are consistent with them.
◦ One element of the intellectual merit criterion for proposals is whether the project can advance knowledge and understanding.
◦ In addition, the intellectual merit criterion calls for a well-reasoned, well-organized plan based on a sound rationale, along with a mechanism to assure success.

• No. The Guidelines do not preclude or favor any research methods, but they do underscore the importance of ensuring that the methods are well described, justified, and appropriate to the research questions posed.
• Qualitative and quantitative approaches may be used in all six research genres described in the Guidelines.

• Reviewers will be informed of the Guidelines through multiple approaches. The Guidelines will be:
◦ posted on the NSF website,
◦ referred to in program solicitations,
◦ discussed in reviewer webinars and orientations, and
◦ presented at PI and other professional meetings.

• The Guidelines include recommendations, for all types of studies, that call for external feedback on the work being proposed.
◦ External feedback can include a number of approaches, including third-party evaluation, program officer evaluation, and/or regular feedback from advisory groups.
◦ It is up to the proposer to explain (a) why the planned external feedback is appropriate, and (b) how it is aligned with program requirements.

• No. The Guidelines are intended to help PIs in proposal preparation. The key point of the Guidelines is to ensure that projects are explicit about their research questions, methods, and analytic approaches in their proposals. These criteria should be relevant for all types of education R&D efforts.

• The Guidelines apply to proposals, but they foreshadow what will come from the research and development effort.
• Each section of the Guidelines is connected to evidence of some aspect of the proposal and the proposed work.
• Throughout the Guidelines are explicit and implicit messages about what counts as evidence and what needs to be considered.

[Diagram: Claim, Data/Evidence, Warrant]
• Justification: What types of theoretical and/or empirical arguments should be made for conducting this study?
• Outcomes: measures with evidence of technical quality
• Purpose: new or improved interventions or strategies to achieve well-specified learning goals or objectives

Common Guidelines for Education Research and Development: 6/nsf13126.pdf?WT.mc_id=USNSF_124
FAQs for the Common Guidelines: 27/nsf13127.pdf
Edith Gummer