John Sutton
Principal Investigators Meeting – MSP FY 12
Washington, DC
December 16, 2013

Any opinions, suggestions, and conclusions or recommendations expressed in this presentation are those of the presenter and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content. This project is funded through the NSF Research and Technical Assistance (RETA) program (DRL ).

Professional Learning Network for Mathematics and Science Partnership Projects
- Learn and Share: Challenges and Successes
- Improve Skills
- Engage in Reflective Evaluation

The goal of the TEAMS project is to strengthen the quality of MSP project evaluations and build evaluators' capacity by developing their skills in evaluation design, methodology, analysis, and reporting.

Promoting MSP Effectiveness Through Evaluation
MSP projects represent a major federal effort to support advancement in science, technology, engineering, and mathematics (STEM) disciplines and careers. Given the vital role evaluation plays in this national effort, NSF MSP projects have an obligation to ensure that their project evaluations are designed and conducted rigorously.

Promoting MSP Effectiveness Through Evaluation
Regardless of funding sources, project evaluation plays a vital role in every Mathematics and Science Partnership (MSP) project by:
- Assessing the degree to which projects attain their goals and objectives;
- Advancing the field by sharing lessons learned and evaluation findings; and
- Improving the overall effectiveness of the project through formative evaluation.

Promoting MSP Effectiveness Through Evaluation
Technical Evaluation Assistance in Mathematics and Science (TEAMS):
- Fosters increased understanding of evaluation design and implementation, in particular new and innovative methodologies.
- Promotes the use of longitudinal data systems in MSP evaluations.
- Strengthens the role of evaluation as a means of improving project effectiveness and contributing to the knowledge of the field.

Meeting the Needs of MSP Evaluation
Technical Evaluation Assistance in Mathematics and Science (TEAMS):
- Works closely with NSF staff to develop and implement strategies to encourage innovation and increased rigor in MSP evaluations.
- Conducts ongoing needs assessment to identify issues that pose challenges for MSP project evaluators.
- Offers no-cost technical assistance to address these issues and challenges.
- Provides venues for MSP evaluators and project staff to share strategies and findings from MSP evaluations.

Meeting the Needs of MSP Evaluation
Evaluation Approaches
Often, external evaluations provide:
- Formative feedback to improve projects and suggest mid-course corrections
- Summative reporting of project outcomes and impacts
- Project monitoring for accountability

Resources to Inform Evaluation
- Institute of Education Sciences, U.S. Department of Education, & National Science Foundation. (2013). Common Guidelines for Education Research and Development. Washington, DC: IES and NSF.
- Frechtling, J. (2010). User-Friendly Handbook for Project Evaluation (REC ). Arlington, VA: National Science Foundation.

Resources to Inform Evaluation
- Heck, D. J., & Minner, D. D. (2010). Technical report: Standards of evidence for empirical research, math and science partnership knowledge management and dissemination. Chapel Hill, NC: Horizon Research, Inc.
- Guthrie, Wamae, Diepeveen, Wooding, & Grant. (2013). Measuring research: A guide to research evaluation frameworks and tools. RAND Europe.

Meeting the Needs of MSP Evaluation
Research Types
- Foundational Research
- Early-Stage or Exploratory Research
- Design and Development Research
- Efficacy Research
- Effectiveness Research
- Scale-Up Research
Each of these research types has a different evaluation purpose and requires a different evaluation approach.

Meeting the Needs of MSP Evaluation
Measuring Research: Key Rationales
- Advocacy: Demonstrate the benefits of supporting research, enhance understanding of research and its processes among policymakers and the public, and make the case for policy and practice change.
- Accountability: Show that money and other resources have been used efficiently and effectively, and to hold researchers accountable.
- Analysis: Understand how and why research is effective and how it can be better supported, feeding into research strategy and decision-making by providing a stronger evidence base.
- Allocation: Determine where best to allocate funds in the future, making the best possible use of limited funding.

Meeting the Needs of MSP Evaluation
Standards of Evidence
Specify indicators for empirical evidence in six domains:
1. Adequate documentation
2. Internal validity
3. Analytic precision
4. Generalizability/external validity
5. Overall fit
6. Warrants for claims

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013)

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013)
Challenge Posed for Each Aspect of Evaluation:
- Instrumentation (38%)
- Theory of Action and Logic Model (27%)
- Establishing Comparison Groups (24%)
- Evaluation Design (24%)
- Sampling (19%)
- Measurable Outcomes and Evaluation Questions (19%)
- Data Analysis Methodology (16%)
- Data Collection (16%)
- Reporting (14%)

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013)
Other Evaluation Challenges:
- Instruments
  - Instruments for Science and Engineering
  - Instruments Aligned to State Standards
  - Instruments Aligned to Content of MSP
- Valid and Reliable Performance Tasks
- Classroom Observation Protocols

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey (11/2013)
Where Additional Assistance Is Needed:
- Comparison Groups in Rural Settings
- Random Groups/Comparison Groups
- Large Enough Sample Size/Strategies for Random Selection
- Evaluation Design and Measurable Outcomes for New Projects
- Data Collection/Statewide Task
- Excessive Evaluation of Students and Teachers

Meeting the Needs of MSP Evaluation
Strategic Plan Tasks
- Task 1: Intranet – project internal storage and retrieval structure
- Task 2: Website – teams.mspnet.org
- Task 3: Outreach – ongoing communications
- Task 4: National Advisory Board – guidance and review
- Task 5: Help Desk – quick response to queries
- Task 6: Document Review – identify commonalities and develop resources
- Task 7: Webinars – topics to inform

Meeting the Needs of MSP Evaluation
Strategic Plan Tasks
- Task 8: Communities of Practice – guided discussions around evaluation topics
- Task 9: Direct Technical Assistance – strategies and activities at the project level
- Task 10: National Conferences – presentations to inform others' work
- Task 11: Annual Meeting – focus on evaluation
- Task 12: Data Sources – information about data sets and their utility
- Task 13: Instrument Review – share information about what is being used, by whom, and for what

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 3: Outreach
- Principal Investigators receive TEAMS communications describing the resources and technical assistance that are available.
- Identify additional resources, templates, processes, and measures being used by the project for sharing with other MSP project PIs and evaluators.
- Communicate with TEAMS regarding specific project needs for information and technical assistance.
Task 5: Help Desk
- Encourage project staff and evaluators to pose queries for TEAMS to respond to.
Task 6: Document Review
- Based on PI review of reports, especially challenges identified by the evaluator, contact TEAMS staff for follow-up resources or technical assistance.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 7: Webinars
- Invitations are sent to PIs and evaluators to participate in webinars.
- Identify topics for which webinars can be prepared and provided, and communicate them to TEAMS.
- Encourage your evaluator and project staff to present in or participate in offered webinars.
Task 8: Communities of Practice
- Based on PI review of reports, especially challenges and needs identified by the individual project, recommend possible topics to TEAMS staff.
- Consider participating, and encourage project staff and the evaluator to participate in discussions.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 9: Direct Technical Assistance
- Based on insights and familiarity with the individual project, including review of reports, contact TEAMS staff for follow-up with specific technical assistance and resources.
- Identify evaluation topics for which technical assistance could be provided to project staff and evaluators.
Task 10: National Conferences
- Share information with TEAMS about upcoming presentations from your project, especially those related to evaluation.
- TEAMS staff could help post presentations to share interesting findings from projects.

Tier Definitions
- Tier 1: Evaluators and researchers of projects other than NSF- and ED-funded MSP projects. Services: access to the website, which provides links to available evaluation research and resources, research briefs, and other TEAMS publications.
- Tier 2: Evaluators of NSF- and ED-funded MSP projects and external evaluators of other projects. Services: Help Desk (Task 5), webinars (Task 7), and communities of practice (Task 8).
- Tier 3: Evaluators of NSF-funded MSP projects. Services: Annual Conference (Task 11).
- Tier 4: Evaluators of NSF-funded MSP projects that are confronting specific challenges. Services: communities of practice specifically for Tier 4 projects with common needs (Tasks 8 & 9) and direct technical assistance (Task 9).

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 11: TEAMS Annual Meeting
- Help identify changes in project staff.
- Help identify specific projects to highlight and participate.
- Help promote participation in meetings (allow resources to be used for this purpose).
Task 12: Data Sources
- Identify projects that are using public databases in their reporting.
- Share information about projects asking about the use of public databases.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 13: Instrument Review
- Contact TEAMS with queries regarding specific instruments for specific uses.
- Share information with TEAMS regarding challenges encountered with instruments.
- Identify and share unique instruments being used in the project.
- Consider using instruments from other projects as appropriate.

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
In summary, Principal Investigators can:
- Identify needs;
- Share information between projects and TEAMS;
- Encourage involvement;
- Facilitate communication; and
- Promote high-quality evaluation approaches.

Meeting the Needs of MSP Evaluation
Website (teams.mspnet.org) and Help Desk

Meeting the Needs of MSP Evaluation
Instruments: Considerations
Using measures of established quality vs. alignment to the specific goals/approaches of the project:
- Internally developed & piloted instruments
- Externally developed & validated instruments
- Collection & analysis of teacher work from the PD

Meeting the Needs of MSP Evaluation
Instruments: Benefits
- Internally developed instruments can help demonstrate that results were what was intended and promised.
- Externally validated instruments can help demonstrate that findings are credible and more broadly important.
- Using multiple instruments provides triangulation of data for findings.
- Internally developed instruments and teacher work samples can help in refining the program and informing providers about participants' learning.

Meeting the Needs of MSP Evaluation
Instruments: Lessons Learned
- As evaluation informs the project and the project evolves, instruments sometimes need to change.
- Modify instruments (adding and/or removing items over time) and align data sets after modifications to keep up with evolving project needs.
- Add new instruments or remove instruments when the initial instrumentation isn't providing appropriate data (e.g., on teacher knowledge).
- Verify instrument validity and reliability after modifications and include that information in reports (see the reliability sketch below).
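
The last bullet above is straightforward to automate. Below is a minimal sketch, assuming item-level responses live in a pandas DataFrame with one column per item; it computes Cronbach's alpha from the standard formula. The data, column names, and the five-item instrument are hypothetical illustrations, not a TEAMS-prescribed procedure.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = items.dropna()                     # listwise deletion of incomplete responses
    k = items.shape[1]                         # number of items on the (possibly modified) instrument
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical post-modification data: 5 Likert items for 30 respondents,
# simulated so the items share a common latent trait.
rng = np.random.default_rng(0)
true_score = rng.normal(3, 1, size=30)
responses = pd.DataFrame({
    f"item{i}": np.clip(np.rint(true_score + rng.normal(0, 0.7, 30)), 1, 5)
    for i in range(1, 6)
})
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

Rerunning a check like this each time items are added or removed makes it easy to report reliability for every version of an instrument.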

Meeting the Needs of MSP Evaluation
Develop a Conceptual Model of the Project and Identify Key Evaluation Points
Theory of Action:
- Why this/hypothesis
  - Based on interpretation of current research
- Describes the experience of the intended audience
  - Cognitively or behaviorally
- Expected outcome
  - If this/then this

Meeting the Needs of MSP Evaluation
Develop a Conceptual Model of the Project and Identify Key Evaluation Points
Model Components (sketched in code below):
- Inputs
- Activities
- Outputs
- Short-term Outcomes
- Long-term Outcomes
- Contextual Factors
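
As a concrete (and entirely hypothetical) illustration of these components, the sketch below records a logic model in a small Python data structure that a project team could review alongside its evaluation plan; the example entries are invented for a generic teacher professional-development partnership, not taken from any actual MSP project.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Conceptual model of a project: inputs flow through activities to outputs and outcomes."""
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    short_term_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)
    contextual_factors: list = field(default_factory=list)

# Hypothetical example entries.
model = LogicModel(
    inputs=["Grant funds", "University STEM faculty", "Partner district teachers"],
    activities=["Two-week summer content institute", "Monthly school-year coaching"],
    outputs=["80 teachers complete 60+ hours of PD"],
    short_term_outcomes=["Gains in teacher content knowledge"],
    long_term_outcomes=["Improved student achievement in science and mathematics"],
    contextual_factors=["District staff turnover", "State standards adoption"],
)
print(model.short_term_outcomes)
```

Writing the components down explicitly, in whatever format, makes it easier to identify which links in the chain the evaluation should measure.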

Meeting the Needs of MSP Evaluation
Example of Logic Model

Meeting the Needs of MSP Evaluation
Develop an Evaluation Plan
Steps:
- Determine what type of design is required to answer the questions posed
- Select a methodological approach and data collection instruments
- Select a comparison group
- Decide on the timing, sequencing, and frequency of data collection

Meeting the Needs of MSP Evaluation
Develop Evaluation Questions and Define Measurable Outcomes
Steps:
- Identify key stakeholders and audiences
- Formulate potential evaluation questions of interest to the stakeholders and audiences
- Define outcomes in measurable terms
- Prioritize and eliminate questions

Meeting the Needs of MSP Evaluation
Conducting the Data Collection
Considerations:
- Obtain necessary clearances and permissions.
- Consider the needs and sensitivities of the respondents.
- Make sure your data collectors are adequately trained and will operate in an objective, unbiased manner.
- Obtain data from as many members of your sample as possible.
- Cause as little disruption as possible to the ongoing effort.

Meeting the Needs of MSP Evaluation
Analyzing the Data
Considerations:
- Check the raw data and prepare them for analysis.
- Conduct initial analysis based on the evaluation plan (see the sketch below).
- Conduct additional analyses based on the initial results.
- Integrate and synthesize findings.
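
To make the first two considerations concrete, here is a minimal sketch in Python; the data, the group labels, and the choice of Welch's t-test as the planned initial analysis are all assumptions for illustration, not the method any particular MSP evaluation must use.

```python
import pandas as pd
from scipy import stats

# Hypothetical post-test scores; in practice these would be loaded from the
# project's data files (e.g., via pd.read_csv).
df = pd.DataFrame({
    "group": ["treatment"] * 5 + ["comparison"] * 5,
    "post_score": [78, 81, 74, 79, 83, 72, 70, 75, 69, 74],
})

# Check the raw data and prepare them for analysis.
df = df.dropna(subset=["group", "post_score"])          # drop incomplete records
assert set(df["group"]) <= {"treatment", "comparison"}  # catch unexpected group labels

# Initial analysis from the evaluation plan: compare group means with
# Welch's two-sample t-test (does not assume equal variances).
treat = df.loc[df["group"] == "treatment", "post_score"]
comp = df.loc[df["group"] == "comparison", "post_score"]
t_stat, p_value = stats.ttest_ind(treat, comp, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Additional analyses (subgroups, covariates) and synthesis would follow
# from these initial results.
```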

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Analytic Precision
Indicators:
- Measurement Validity/Logic of Research Process
- Reliable Measures/Trustworthy Techniques
- Appropriate and Systematic Analysis
Description: The extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Analytic Precision (continued)
Indicators:
- Unit of Analysis Issues
- Power
- Effect Size (see the sketch below)
- Multiple Instruments
- Multiple Respondents
- All Results
Description: The extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.
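
Power and effect size, in particular, can be checked with standard tools. The sketch below uses statsmodels to compute a pooled-standard-deviation effect size (Cohen's d) from two group summaries and the per-group sample size needed to detect that effect with 80% power; all numbers are invented for illustration.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical summary statistics from treatment and comparison groups.
d = cohens_d(mean1=78.0, mean2=72.0, sd1=12.0, sd2=13.0, n1=40, n2=38)

# Per-group n required to detect this effect with alpha = .05 and power = .80.
n_needed = TTestIndPower().solve_power(effect_size=d, alpha=0.05, power=0.8)
print(f"d = {d:.2f}; need about {np.ceil(n_needed):.0f} participants per group")
```

Running a calculation like this before data collection helps flag designs that are too small to detect the expected effect, one of the rural-sample challenges named in the needs assessment.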

Meeting the Needs of MSP Evaluation
Reporting the Findings
Considerations:
- Background (context, sites, intervention, etc.)
- Evaluation study questions
- Evaluation procedures (description of measures used and their purposes)
- Study sites and sample demographics
- Data collection (administration, participant counts, timelines for acquiring data, etc.)
- Data analyses (what methods for what measures, limitations, missing data, etc.)
- Findings
- Conclusions (and recommendations)

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Generalizability/External Validity
Indicators:
- Findings for Whom
- Generalizable to Population or Theory
- Generalizable to Different Contexts
Description: The extent to which you can come to conclusions about one thing (e.g., a population) based on information about another (e.g., a sample).

Meeting the Needs of MSP Evaluation
Disseminate the Information
Considerations:
- The funding source(s)
- Potential funding sources
- Others involved with similar projects or areas of research
- Community members, especially those who are directly involved with the project or might be involved
- Members of the business or political community, etc.

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Warrants for Claims
Indicators:
- Limitations
- Decay and Delay of the Effect
- Efficacy
- Conclusions/Implications Logically Drawn from Findings
Description: The extent to which the data interpretation, conclusions, and recommendations are justifiable based on the evidence presented.

Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider
Development of a conceptual model (logic model) of the program:
- Develop logic model
- Identify contextual conditions
Development of evaluation questions and measurable outcomes:
- Articulate goals clearly
- Define multiple achievement outcomes

Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider
Development of the evaluation design:
- Address shifting project and evaluation priorities
Collection of data:
- Format measures (hard copy, electronic, etc.) and schedule administration
- Display data effectively
- Manage data

Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider
Analysis of data:
- Conduct appropriate data analyses to respond to evaluation questions
Provision of information to interested audiences:
- Report intended impact on various populations
- Report findings to different audiences

Meeting the Needs of MSP Evaluation
Ongoing Needs Assessment
At your tables, please write down one or two anticipated evaluation challenges or needs for which your project may want assistance with project/program evaluation.

Meeting the Needs of MSP Evaluation
What Questions Do You Have Regarding TEAMS?
TEAMS contact information: teams.mspnet.org

TEAMS Contacts

John T. Sutton, PI
RMC Research Corporation
th Street, Suite 2100
Denver, CO
Phone: / Toll Free: / Fax:

Dave Weaver, Co-PI
RMC Research Corporation
111 SW Columbia Street, Suite 1030
Portland, OR
Phone: / Toll Free: / Fax: