Evaluation of Efforts to Broaden STEM Participation: Results from A Two-Day Workshop. Planning Committee: Bernice Anderson, Elmima Johnson, Beatriz Chu Clewell.

Presentation transcript:

Evaluation of Efforts to Broaden STEM Participation: Results from A Two-Day Workshop. Planning Committee: Bernice Anderson, Elmima Johnson, Beatriz Chu Clewell, Norman Fortenberry. Presenters: Patricia B. Campbell & Veronica Thomas

Evaluation of Efforts to Broaden STEM Participation: Workshop Goals
To develop and validate a strategy by which to demonstrate the value of NSF's investment in broadening participation (BP). To negotiate answers to two questions:
1. What metrics should be used for project monitoring?
2. What designs and indicators should be used for program evaluation?

Evaluation of Efforts to Broaden STEM Participation: The Workshop Report
- The Policy Context for NSF Programs for Broadening Participation (Fortenberry)
- Measuring Success and Effectiveness in NSF's Broadening Participation Programs (Clewell)
- Outcomes and Indicators Related to Broadening Participation (Campbell, Thomas, & Stoll)
- Evaluating Efforts to Broaden Participation (Campbell, Stoll, & Thomas)
- Implications of the NSF Broader Impacts Statement (Nelson & Bramwell)

The Policy Context: Historically NSF’s goal of broadening participation has been shaped through a variety of policy actions by the legislative and executive branches of government. Within the agency itself, policies articulated by the National Science Board (NSB) and the Committee on Equal Opportunity in Science and Engineering (CEOSE) have informed the NSF approach and strategy to address this goal, as referenced in major policy documents issued by NSF.

The Policy Context: Currently
Broadening Participation at the National Science Foundation: A Framework for Action (May 2008) outlines the NSF-wide broadening participation plan. It provides guidelines for broadening participation both externally and internally, through:
- Expanding the reviewer pool
- Training NSF staff and reviewers
- Enforcing accountability for NSF staff and principal investigators
- Communicating promising practices
- Maintaining and monitoring a portfolio of relevant programs

The Policy Context: A Core Value and A Strategic Goal
Broadly Inclusive: seeking and accommodating contributions from all sources while reaching out especially to groups that have been underrepresented; serving scientists, engineers, educators, students, and the public across the nation; and exploring every opportunity for partnerships, both nationally and internationally.
Investing in America's Future: NSF Strategic Plan FY 2006-2011, NSF 06-48, National Science Foundation, Arlington, VA, 2006.

Measuring Success: NSF BP Programs
- Broadening Participation Focused Programs (28 programs; 17 require evaluations)
- Programs with Emphasis on Broadening Participation (17 programs; 8 require evaluation)
- Programs with Broadening Participation Potential (16 programs; 9 require evaluation)
- Other Broadening Participation Efforts (5 programs)

Measuring Success: Suggested Monitoring Metrics (Institution-Focused Targeted Programs)
- Goal: Increase research capability and teaching effectiveness
- Baseline data: collaborative relationships, funding distribution, % URM students, total enrollment
- Follow-up: collaborative relationships established, funding support obtained, teaching reforms effected
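To make the baseline/follow-up comparison concrete, here is a minimal sketch, not from the workshop report, of how the slide's institution-level monitoring metrics could be recorded and compared. All field names and figures are invented for illustration.

```python
# Hypothetical sketch: pairing baseline and follow-up monitoring metrics for an
# institution-focused BP program. All field names and numbers are invented.
from dataclasses import dataclass

@dataclass
class MonitoringSnapshot:
    year: int
    collaborative_relationships: int   # partnerships in place
    funding_support: float             # dollars of external support
    urm_students_pct: float            # % of enrolled students from URM groups
    total_enrollment: int

baseline = MonitoringSnapshot(2005, collaborative_relationships=3,
                              funding_support=250_000.0,
                              urm_students_pct=12.0, total_enrollment=4_800)
follow_up = MonitoringSnapshot(2008, collaborative_relationships=7,
                               funding_support=900_000.0,
                               urm_students_pct=15.5, total_enrollment=5_100)

def change(metric: str) -> float:
    """Follow-up minus baseline for one monitoring metric."""
    return getattr(follow_up, metric) - getattr(baseline, metric)

for metric in ("collaborative_relationships", "funding_support", "urm_students_pct"):
    print(f"{metric}: {change(metric):+}")
```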

Measuring Success: Recommendations
It is recommended that the NSF:
- Conduct periodic evaluations, including external reviews ranging from the program level to larger cross-sections of the portfolio
- Develop a common framework requiring that BP projects collect uniform data
- Review all funded programs to determine:
  - whether program funds serve a representative proportion of members of underrepresented groups or institutions;
  - whether positive outcomes of programs are distributed equitably among all groups of participants or institutions.

Broadening Participation (BP): Critical Issues Related to Indicators and Outcomes
- Developing shared understanding and clarifying meaning
- Addressing "success" at multiple levels

Important Distinctions: Inputs, Outputs, Process, Outcomes

Inputs
Resources, contributions, and investments that go into the project. Input indicators measure resources, contributions, and investments such as:
- Staff
- Volunteers
- Funding
- Materials
- Facilities
- Investments made to support BP

Outputs
Units of services and goods provided by the project. Output indicators measure things such as the scope/size of activities, services, events, and products reaching underrepresented groups:
- Numbers of students served
- Numbers of workshops

Process
Ways in which project services and goods are provided. Process indicators measure the extent to which BP projects, programs, and strategies are delivered as intended (alignment).

Outcomes
What the project hopes to achieve: actual benefits, impacts, or changes. Outcome indicators are expressed in terms of changes for individuals, groups, communities, institutions, and systems:
- Knowledge, attitude, and skill changes
- Behavior changes
- Value changes
- Policy, procedural, and practice changes
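As one way to keep these four categories distinct in practice, here is a minimal sketch, not part of the workshop materials, that tags each indicator with the logic-model category it measures; all indicator names are illustrative.

```python
# Hypothetical sketch of the four logic-model categories as a plain dictionary,
# so each indicator is tagged with the category it measures.
logic_model = {
    "inputs": [            # resources invested in the project
        "FTE project staff",
        "volunteer mentors recruited",
        "NSF and matching funds",
        "lab and classroom facilities",
    ],
    "outputs": [           # units of goods/services delivered
        "number of students served",
        "number of workshops held",
    ],
    "process": [           # fidelity: delivered as intended?
        "share of planned sessions delivered on schedule",
        "alignment of activities with the project plan",
    ],
    "outcomes": [          # changes the project hopes to produce
        "gain in STEM content knowledge",
        "change in intent to persist in a STEM major",
        "departmental policy or practice changes",
    ],
}

for category, indicators in logic_model.items():
    print(f"{category}: {len(indicators)} indicators")
```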

Considering BP Success at Multiple Levels
- Level 1: Having access to the benefits of STEM knowledge
- Level 2: Having access to STEM knowledge
- Level 3: Studying STEM
- Level 4: Working in STEM areas
- Level 5: Generating STEM knowledge

Problems in Determining "Success"
- Defining success in terms of an increase in absolute number or percentage
- Defining success in terms of an increase in both number and percentage
- Defining success in terms of the end point being "parity" (absolute number)

Other Considerations in Defining Success
- Defining "parity" as a range
- Achieving parity as more participate overall
- Considering the discipline/field size to which the definition of success applies
- Integrating qualitative indicators (e.g., broadening and transforming perspectives)
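A short worked example, with hypothetical numbers, of why the definition matters: counts can rise while the percentage falls, and "parity" can be checked against a band rather than a single point. The population share and band width below are assumptions for illustration.

```python
# Hypothetical numbers: absolute counts rise while the share falls, so the same
# program "succeeds" or "fails" depending on the chosen definition of success.
def summarize(label, urm, total, population_share, band=0.02):
    share = urm / total
    at_parity = abs(share - population_share) <= band   # parity treated as a band
    print(f"{label}: {urm} URM of {total} total "
          f"({share:.1%} vs. population {population_share:.1%}; "
          f"parity within +/-{band:.0%}: {at_parity})")

population_share = 0.30          # assumed URM share of the relevant population
summarize("Baseline ", urm=150, total=1_000, population_share=population_share)
summarize("Follow-up", urm=180, total=1_400, population_share=population_share)
# The URM count grew (150 -> 180) but the share fell (15.0% -> 12.9%), so an
# absolute-number definition calls this success while a percentage or parity
# definition does not.
```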

Other Indicators of Success in Broadening Participation
- Individual-level indicators
- Institutional-level indicators
- Foundation-level indicators

Individual (Student) Level Indicators
- Participation
- Retention, persistence, and success
- Experiences
- Attitudes

Institutional Level Indicators
- Staffing
- Policies, programs, and institutional commitment
- Accountability and rewards
- Monitoring, tracking, and using data for improvement
- Collaborations

Foundation Level Indicators
- Inclusion of information about the importance of BP
- Review and monitoring of Foundation policies/practices in terms of their potential to broaden participation
- Diversity of professionals involved with the Foundation
- Foundation resources devoted to BP
- Improvements to the knowledge base about broadening participation
- Implementation of effective strategies at the Foundation level to broaden participation

Evaluating BP: Research vs. Evaluation
- Goal. Research: to move the knowledge base forward. Evaluation: to assess quality/effectiveness.
- Outcome. Research: why something does or doesn't work. Evaluation: whether something does or doesn't work.
- Focus. Research: the research. Evaluation: the program/intervention.
- Designs, measures, and analysis: no difference.

Evaluating BP: Longitudinal Tracking
"Being able to follow students longitudinally is the key to any sophisticated understanding of how colleges are doing and what's happening to students." (Thomas R. Bailey, 2008)
Without longitudinal data, the generation and testing of causal models tied to successful participation in STEM for diverse populations will be difficult if not impossible.

Evaluating BP: Using Comparison Groups

Evaluating BP: Selecting Designs
Considerations include:
- The appropriateness of the fit between the design of the program or "intervention" and the requirements of more rigorous evaluation methodologies
- The timing of the evaluation
- The balance between the level of investment in the evaluation and the level of investment in, and intensity of, the intervention
- The level of evidence expected given the nature of the intervention
- The strength of rival hypotheses

Evaluating BP: Selecting Designs
Study Type | Design | Representation | Typical question answered by the design
Quantitative Case Study | One-shot Post-test Only Design | X O | After attending a preview weekend, are at least 50% of the students planning to apply to the institution?
Quasi-experimental Study | One-shot Pre-test/Post-test Design | O_a X O_b | Does working with a role model increase girls' interest in science careers?
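For illustration only, the sketch below works through the two designs in the table with invented data: a one-shot post-test-only check against the 50% threshold, and a one-group pre-test/post-test change score.

```python
# Hypothetical data illustrating the two designs in the table above.

# One-shot post-test only (X O): did at least 50% of attendees plan to apply?
post_only = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]          # 1 = plans to apply
share_applying = sum(post_only) / len(post_only)
print(f"Post-test only: {share_applying:.0%} plan to apply "
      f"(meets 50% target: {share_applying >= 0.5})")

# One-group pre-test/post-test (O_a X O_b): interest in science careers, 1-5 scale
pre  = [2, 3, 2, 4, 3, 2]
post = [3, 4, 3, 4, 4, 3]
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"Pre/post: mean change in interest = {mean_change:+.2f} points")
```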