John Sutton Principal Investigators Meeting – MSP FY 13 Washington, DC December 16, 2013.

Any opinions, suggestions, conclusions, or recommendations expressed in this presentation are those of the presenter and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content. This project is funded through the NSF Research and Technical Assistance (RETA) program (DRL ).

Professional Learning Network for Mathematics and Science Partnership Projects
 Learn and Share: Challenges and Successes
 Improve Skills
 Engage in Reflective Evaluation

The goal of the TEAMS project is to strengthen the quality of MSP project evaluations and build the capacity of evaluators by strengthening their skills in evaluation design, methodology, analysis, and reporting.

Promoting MSP Effectiveness Through Evaluation
MSP projects represent a major federal effort to support advancement in science, technology, engineering, and mathematics (STEM) disciplines and careers. Recognizing the vital role of evaluation in this national effort, NSF MSP projects have an obligation to ensure their project evaluation is designed and conducted in a rigorous manner.

Promoting MSP Effectiveness Through Evaluation
Regardless of funding source, project evaluation plays a vital role in every Mathematics and Science Partnership (MSP) project by:
 Assessing the degree to which projects attain their goals and objectives;
 Advancing the field by sharing lessons learned and evaluation findings; and
 Improving the overall effectiveness of the project through formative evaluation.

Promoting MSP Effectiveness Through Evaluation
Technical Evaluation Assistance in Mathematics and Science (TEAMS):
 Fosters increased understanding of evaluation design and implementation, in particular new and innovative methodologies.
 Promotes the use of longitudinal data systems in MSP evaluations.
 Strengthens the role of evaluation as a means of improving project effectiveness and contributing to the knowledge of the field.

Meeting the Needs of MSP Evaluation
Technical Evaluation Assistance in Mathematics and Science (TEAMS):
 Works closely with NSF staff to develop and implement strategies that encourage innovation and increased rigor in MSP evaluations.
 Conducts ongoing needs assessments to identify issues that pose challenges for evaluators of MSP projects.
 Offers no-cost technical assistance to address these issues and challenges.
 Provides venues for MSP evaluators and project staff to share strategies and findings from MSP evaluations.

Meeting the Needs of MSP Evaluation
Evaluation Approaches
Often, external evaluations provide:
 – Formative feedback to improve projects and suggest mid-course corrections
 – Summative reporting of project outcomes and impacts
 – Project monitoring for accountability

Resources to Inform Evaluation
Institute of Education Sciences, U.S. Department of Education, and National Science Foundation. (2013). Common Guidelines for Education Research and Development. Washington, DC: IES and NSF.
Frechtling, J. (2010). User-Friendly Handbook for Project Evaluation (REC). Arlington, VA: National Science Foundation.

Resources to Inform Evaluation
Heck, D. J., & Minner, D. D. (2010). Technical report: Standards of evidence for empirical research, math and science partnership knowledge management and dissemination. Chapel Hill, NC: Horizon Research, Inc.
Guthrie, S., Wamae, W., Diepeveen, S., Wooding, S., & Grant, J. (2013). Measuring research: A guide to research evaluation frameworks and tools. RAND Europe.

Meeting the Needs of MSP Evaluation
Research Types
 Foundational Research
 Early-Stage or Exploratory Research
 Design and Development Research
 Efficacy Research
 Effectiveness Research
 Scale-Up Research
Each of these types of research has a different evaluation purpose and requires a different evaluation approach.

Meeting the Needs of MSP Evaluation
Measuring Research: Key Rationales
 Advocacy: Demonstrate the benefits of supporting research, enhance understanding of research and its processes among policymakers and the public, and make the case for policy and practice change.
 Accountability: Show that money and other resources have been used efficiently and effectively, and hold researchers accountable.
 Analysis: Understand how and why research is effective and how it can be better supported, feeding into research strategy and decision-making by providing a stronger evidence base.
 Allocation: Determine where best to allocate funds in the future, making the best possible use of limited funding.

Meeting the Needs of MSP Evaluation
Standards of Evidence
The standards specify indicators for empirical evidence in six domains:
1. Adequate documentation
2. Internal validity
3. Analytic precision
4. Generalizability/external validity
5. Overall fit
6. Warrants for claims

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013)

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013): Challenge Posed by Each Aspect of Evaluation
 Instrumentation (38%)
 Theory of Action and Logic Model (27%)
 Establishing Comparison Groups (24%)
 Evaluation Design (24%)
 Sampling (19%)
 Measurable Outcomes and Evaluation Questions (19%)
 Data Analysis Methodology (16%)
 Data Collection (16%)
 Reporting (14%)

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013): Other Evaluation Challenges
 Instruments
  o Instruments for Science and Engineering
  o Instruments Aligned to State Standards
  o Instruments Aligned to Content of MSP
 Valid and Reliable Performance Tasks
 Classroom Observation Protocols

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013): Where Additional Assistance Is Needed
 Comparison Groups in Rural Settings
 Random Groups/Comparison Groups
 Large Enough Sample Sizes/Strategies for Random Selection
 Evaluation Design and Measurable Outcomes for New Projects
 Data Collection/Statewide Task
 Excessive Evaluation of Students and Teachers

Meeting the Needs of MSP Evaluation
Strategic Plan Tasks
 Task 1: Intranet – project internal storage and retrieval structure
 Task 2: Website – teams.mspnet.org
 Task 3: Outreach – ongoing communications
 Task 4: National Advisory Board – guidance and review
 Task 5: Help Desk – quick response to queries
 Task 6: Document Review – identify commonalities and develop resources
 Task 7: Webinars – topics to inform

Meeting the Needs of MSP Evaluation
Strategic Plan Tasks
 Task 8: Communities of Practice – guided discussions around evaluation topics
 Task 9: Direct Technical Assistance – strategies and activities at the project level
 Task 10: National Conferences – presentations to inform others' work
 Task 11: Annual Meeting – focus on evaluation
 Task 12: Data Sources – information about data sets and their utility
 Task 13: Instrument Review – share information about what is being used, by whom, and for what

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 3: Outreach
 Principal Investigators receive TEAMS communications so they know what resources and technical assistance are available.
 Identify additional resources, templates, processes, and measures being used by the project for sharing with other MSP project PIs and evaluators.
 Communicate with TEAMS regarding specific project needs for information and technical assistance.
Task 5: Help Desk
 Encourage project staff and evaluators to pose queries for TEAMS to respond to.
Task 6: Document Review
 Based on PI review of reports, especially challenges identified by the evaluator, contact TEAMS staff for follow-up resources or technical assistance.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 7: Webinars
 Invitations are sent to PIs and evaluators to participate in webinars.
 Identify topics for which webinars can be prepared and provided, and communicate them to TEAMS.
 Encourage your evaluator and project staff to present in and participate in offered webinars.
Task 8: Communities of Practice
 Based on PI review of reports, especially challenges and needs identified by the individual project, recommend possible topics to TEAMS staff.
 Consider participating, and encourage project staff and the evaluator to participate in discussions.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 9: Direct Technical Assistance
 Based on insights into and familiarity with the individual project, including review of reports, contact TEAMS staff for follow-up with specific technical assistance and resources.
 Identify evaluation topics for which technical assistance could be provided to project staff and evaluators.
Task 10: National Conferences
 Share information with TEAMS about upcoming presentations from your project, especially if related to evaluation.
 TEAMS staff could help post presentations to share interesting findings from the project.

Tier Definitions
Tier 1
 Group: Evaluators and researchers of projects other than NSF- and ED-funded MSP projects
 Services: Access to a website that provides links to available evaluation research and resources, research briefs, and other TEAMS publications
Tier 2
 Group: Evaluators of NSF- and ED-funded MSP projects and external evaluators of other projects
 Services: Help Desk services (Task 5); webinars (Task 7); communities of practice (Task 8)
Tier 3
 Group: Evaluators of NSF-funded MSP projects
 Services: Annual Conference (Task 11)
Tier 4
 Group: Evaluators of NSF-funded MSP projects that are confronting specific challenges
 Services: Communities of practice specifically for Tier 4 projects with common needs (Tasks 8 & 9); direct technical assistance (Task 9)

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 11: TEAMS Annual Meeting
 Help identify changes in project staff.
 Help identify specific projects to highlight and participate.
 Help promote participation in meetings (allow resources to be used for this purpose).
Task 12: Data Sources
 Identify projects that are using public databases in their reporting.
 Share information about projects asking about use of public databases.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 13: Instrument Review
 Contact TEAMS with queries regarding specific instruments for specific uses.
 Share information with TEAMS about challenges encountered with instruments.
 Identify and share unique instruments being used in the project.
 Consider using instruments from other projects as appropriate.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
In summary, Principal Investigators can:
 Identify needs;
 Share information between projects and TEAMS;
 Encourage involvement;
 Facilitate communication; and
 Promote high-quality evaluation approaches.

Meeting the Needs of MSP Evaluation
Website (teams.mspnet.org) and Help Desk

Meeting the Needs of MSP Evaluation
Ongoing Needs Assessment
At your tables, please write down one or two anticipated evaluation challenges or needs for which your project may want assistance with project/program evaluation.

Meeting the Needs of MSP Evaluation
What Questions Do You Have Regarding TEAMS?
TEAMS contact information: teams.mspnet.org

TEAMS Contacts

John T. Sutton, PI
RMC Research Corporation
th Street, Suite 2100
Denver, CO
Phone:
Toll Free:
Fax:

Dave Weaver, Co-PI
RMC Research Corporation
111 SW Columbia Street, Suite 1030
Portland, OR
Phone:
Toll Free:
Fax: