Findings from the AISL Program's Online Project Monitoring System for Projects Funded Between FY 2006 and FY 2014
Gary Silverstein and Ashley Simpkins, Westat
March 2, 2016

Overview of the AISL OPMS
 What is it? A web-based monitoring system
 Who developed it? NSF and Westat
 When did it start? Implemented in 2006; more comprehensive data are available for projects funded since FY 2009
 What does it collect? Baseline (anticipated activities and accomplishments), Annual (project activities and reach for the previous calendar year), and Closeout (project accomplishments over the entire grant) surveys

The OPMS comprises three surveys

Baseline data (anticipated):
 Lead organization, key personnel, and partners
 Information about each project deliverable
 Characteristics of anticipated audiences
 Anticipated reach and impact
 Study designs and data collection methods
 Research questions and related study designs (new item)

Annual/Closeout data (actual):
 Update baseline data (e.g., add new key personnel)
 Actual number reached
 Extent to which anticipated impacts were attained
 Findings related to research questions (new item)
 Challenges encountered and lessons learned
 Upload products (e.g., surveys, logic models)
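As a purely illustrative sketch of how the three-survey structure could be modeled (the field names below are hypothetical assumptions, not the actual OPMS data dictionary):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical record types mirroring the three OPMS surveys described above.
# All field names are illustrative assumptions, not the real OPMS schema.

@dataclass
class BaselineSurvey:
    lead_organization: str
    key_personnel: List[str]
    partners: List[str]
    deliverables: List[str]           # one entry per project deliverable
    anticipated_audiences: List[str]
    anticipated_reach: Optional[int]  # anticipated number of people reached
    research_questions: List[str]     # new item added to the survey

@dataclass
class AnnualOrCloseoutSurvey:
    actual_number_reached: Optional[int]
    impacts_attained: str             # extent anticipated impacts were attained
    findings: List[str]               # findings related to research questions
    lessons_learned: List[str]
    uploaded_products: List[str]      # e.g., surveys, logic models

@dataclass
class ProjectRecord:
    award_id: str
    baseline: BaselineSurvey
    annual_reports: List[AnnualOrCloseoutSurvey] = field(default_factory=list)
    closeout: Optional[AnnualOrCloseoutSurvey] = None
```

The point of the sketch is the shape of the data: one baseline record per project, one annual record per reporting year, and at most one closeout record.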

How does NSF use these data?
 Assess the implementation and reach of the AISL program
 Examine project and program trends over time
 Examine outcomes for specific deliverable types
 Contribute to the evaluation of NSF's ISE program
 Monitor progress of individual AISL grants
 Answer stakeholder questions in a timely manner

Caveats about the data we are presenting
 Wherever possible, we report findings for all cohorts funded since FY 2006
 More detailed information is available for projects funded since FY 2009
 Findings we present today are for the following award types: Full-scale Development; Broad Implementation; Connecting Researchers and Public Audiences; Research in Service to Practice; Innovations in Development
 We only collect limited data for other award types (Pathways and Planning grants, Conferences, EAGER and RAPID)

Characteristics of Organizations that Participate in the AISL Program

Characteristics of lead organizations (FY ; 298 projects)

Type of partner organizations used by AISL projects (FY ; 298 projects)

Partner organizations for projects led by a 4-year college/university (FY ; 82 projects)

Venues projects used to reach audiences in 2014 (n=1,039 venues across 81 projects)

Venue type                            Number  Percent
Science-technology center or museum   —       —
Public pre-K‒12 district/school       —       —
4-year college or university          —       —
Children's museum                     85      8.2
Zoo or aquarium                       56      5.4
Natural history museum                42      4.0
Library                               37      3.6
Community organization                35      3.4
Nature or interpretive center         35      3.4
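The percentages in the venue table are each venue count divided by the 1,039 total. A quick check, using the counts from the rows above:

```python
# Recompute the venue table's percentages from the raw counts.
TOTAL_VENUES = 1_039  # n reported on the slide

venue_counts = {
    "Children's museum": 85,
    "Zoo or aquarium": 56,
    "Natural history museum": 42,
    "Library": 37,
}

# Each percentage is count / total, expressed to one decimal place.
percents = {name: round(100 * count / TOTAL_VENUES, 1)
            for name, count in venue_counts.items()}
print(percents)  # matches the 8.2, 5.4, 4.0, and 3.6 shown in the table
```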

Distribution of all venues that projects used to reach audiences in 2014 (across 81 projects)

Distribution of museums that projects used to reach audiences in 2014

Distribution of K-12 schools that projects used to reach audiences in 2014

Distribution of colleges and universities that projects used to reach audiences in 2014

Distribution of theaters that projects used to reach audiences in 2014

Distribution of restaurants that projects used to reach audiences in 2014

Public Audience Deliverables

Types of deliverables projects used to reach public audiences in 2014 (77 projects)

Deliverables used to reach public audiences in 2014 (207 deliverables across 77 projects)

Deliverable type                                     Number  Percent
Museum exhibit (permanent, temporary, or traveling)  —       —
After school or summer program for youth             —       —
Group-oriented program                               13      6.3
Demonstration/activity kit/guide                     13      6.3
Game                                                 9       4.3
Festival or other one-time/annual event              8       3.9
Video segment/clip/program/series                    7       3.4
Television segment/episode/program/series            5       2.4

Total and median number of public audience members reached in 2014 (unduplicated count, 48 projects)

Deliverable type                                   Total       Median
Total number of users/listeners/viewers            28,168,…    —
Audio or video                                     19,493,…    —
Games and information communication technologies   2,407,476   5,331
Exhibits                                           1,248,427   94,951
Program, events, and activities                    281,…       —
Resource materials and information sharing         155,…       —
Infrastructure development                         128,…       —
Project website                                    2,454,012   57,614
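The table reports a median alongside each total because audience reach is typically skewed: a handful of high-reach projects can dominate the sum while the median still describes a typical project. A small sketch, with invented per-project counts (not real OPMS data):

```python
from statistics import median

# Invented per-project audience counts for one deliverable type;
# a single blockbuster project drives most of the total reach.
reach_by_project = [1_200, 3_400, 5_331, 8_000, 2_390_545]

total = sum(reach_by_project)   # dominated by the one outlier
med = median(reach_by_project)  # describes the typical project
print(total, med)
```

Here the total is over 2.4 million, while the median project reached only a few thousand people, which is why the slide reports both.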

Age groups projects anticipated targeting with their public audience deliverables (FY ; 264 projects)

Audiences underrepresented in STEM that projects anticipated targeting with their public audience deliverables (FY ; 264 projects)

Strategies projects anticipated using to target a specific audience (FY ; 163 projects)

Strategy                                                              Number  Percent
Outreach or marketing to specific target audiences                    —       —
Content developed for a specific audience                             —       —
Activity specifically designed to be accessible to a target audience
  (e.g., lack access to necessary equipment)                          —       —
Outreach to school groups/programs                                    —       —
Location allows for easy targeting to specific audiences              —       —
Involve people from target groups in project design                   29      17.8

Impacts that projects anticipated for their public audiences (FY ; 227 projects)

Professional Audience Deliverables

Types of deliverables that projects used to reach professional audiences in FY 2014 (66 projects)

Professional audiences that projects reached in 2014 (66 projects)

Total and median number of professional participants reached in 2014 (unduplicated count, 40 projects)

Deliverable type                                     Total   Median
Total number of professional audience participants   88,463  50
Informal educators                                   61,687  30
Pre-K‒12 teachers                                    9,534   23
Post-secondary instructors                           2,772   25
Exhibit designers                                    1,086   7
Scientists and/or engineers                          696     18
Staff at after-school and youth programs             590     12
Staff at community programs                          198     9

Impacts that projects anticipated for their professional audiences (FY ; 202 projects)

Research Questions

Distribution of research questions across AISL award types (33 projects funded in FY 2013 and FY 2014)

Sample research questions
 What student-level, teacher-level, and school-level factors contribute to or inhibit students' gains in SEP mastery and/or their interest in science?
 How are instructional practices in STEM summer programs related to perceptions of challenge, relevance, learning, and affect for participating youth?
 Are situational (momentary) interest and engagement in STEM activities across several weeks associated with changes in: (a) individual (sustained) interest in STEM; (b) a STEM self-concept; and (c) future goals and aspirations related to STEM?
 How are evaluations used in relation to science festivals, and how does evaluation use change within the context of a community of practice that creates its own multisite evaluation?
 What contextual factors influence the nature of staff-facilitated family mathematical discourse?
 How can family mathematical discourse and socio-mathematical norms at exhibits be operationalized and measured?

Study designs to be used to examine research questions (141 research questions across 33 projects)

Exploring the OPMS Database

We can parse the database to explore…
 The number of people who visited AISL-supported museum exhibits in a given year
 – The outcomes associated with museum exhibits (and the evidence that those outcomes were attained)
 – The number of AISL-funded museum projects that target youth, and the strategies being used to engage this population
 The number of AISL projects focusing on biological sciences in after school programs
 The extent to which post-secondary institutions are partnering with museums or exhibit designers
 The outcomes associated with AISL projects employing gaming strategies with persons living in rural communities
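A query like "AISL-funded museum projects that target youth" amounts to a simple filter over project records. The sketch below uses invented records and hypothetical field names, not the real OPMS schema:

```python
# Hypothetical OPMS-style project records; field names and values are
# illustrative assumptions for the sake of the example.
projects = [
    {"id": "A", "deliverables": ["museum exhibit"], "audiences": ["youth", "families"]},
    {"id": "B", "deliverables": ["game"], "audiences": ["adults"]},
    {"id": "C", "deliverables": ["museum exhibit", "game"], "audiences": ["youth"]},
]

def museum_projects_targeting_youth(records):
    """Return IDs of projects with a museum-exhibit deliverable aimed at youth."""
    return [r["id"] for r in records
            if "museum exhibit" in r["deliverables"] and "youth" in r["audiences"]]

print(museum_projects_targeting_youth(projects))  # ['A', 'C']
```

Each of the exploratory questions on this slide follows the same pattern: filter on deliverable type, audience, topic, or partner, then count or summarize the matching records.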

The database can also be used to examine narrative information for a given project type

Project implementation:
 Strategies used to reach international audiences and engage specific target groups
 Factors affecting implementation of a given deliverable
 Factors hindering attainment of a specific outcome
 Lessons learned
 Future plans

Project outcomes:
 Progress made addressing a given research question
 Most significant accomplishment
 Evidence that a public/professional audience outcome was met
 How the project advanced knowledge about ISE or a field
 Significant innovations that occurred as a result of the project

Gary Silverstein (301)
Ashley Simpkins (240)