Evaluating NSF Programs

Presentation transcript:

Evaluating NSF Programs
Dr. Jennifer Giancola Carney, Abt Associates
AGEP Capacity Building Meeting, September 18, 2008

Agenda
- Two NSF program evaluations (IGERT & CAREER)
  - Design & findings
  - Rationale for methods used
  - Limitations of methods used
- Lessons learned
- Q&A discussion

Integrative Graduate Education and Research Traineeships (IGERT) Program
- PhD training program since 1998 (Division of Graduate Education)
- Grants to universities that develop new IGERT programs (most funding goes to student traineeships)
- Gives PhD students interdisciplinary research experiences and enhanced professional skills and perspectives
- Three phases of evaluation:
  - Implementation study (1999-2002)
  - Impact study (2003-2005)
  - Follow-up study of graduates (2006-present)

IGERT Evaluation: Began with a Logic Model
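
The logic model figure itself does not survive in the transcript, but the underlying idea is to trace inputs through activities and outputs to outcomes before deciding what to measure. A minimal sketch in Python; the entries are hypothetical, IGERT-flavored placeholders, not the actual model:

```python
# Minimal sketch of a program logic model as structured data.
# Entries are hypothetical placeholders, not the actual IGERT model.
logic_model = {
    "inputs":     ["NSF traineeship funding", "faculty time"],
    "activities": ["interdisciplinary coursework", "lab rotations"],
    "outputs":    ["trainees enrolled", "new courses offered"],
    "outcomes":   ["interdisciplinary research skills", "career placement"],
}

# Each stage should be observable: outcomes become candidate indicators.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```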

IGERT Implementation Study
- Annual monitoring: "Who? What? When?" research questions
  - Who participates and why? What activities are conducted?
  - Annual web survey of program participants (PIs & trainees)
  - Describes program recruitment strategies, training activities, faculty involvement
- Site visits: "How? Why?" research questions
  - What challenges have projects encountered? How have they overcome them?
  - Interviews with faculty, students, chairs, administrators
  - Identify common challenges and solutions: project management, faculty engagement, implementing interdisciplinary education within universities

IGERT Implementation Study (cont.)
- Examined implementation across projects and over time
- Mixed methods (quantitative & qualitative data)
- Data used for GPRA reporting, program management, revisions to solicitations, and sharing common solutions with IGERT PIs
- Limitations
  - Little information on longer-term effects of IGERT or broader program impacts on faculty and the university
  - No comparison to non-IGERT experiences that would account for overall trends in graduate education

2003 IGERT Impact Study
- Impact study: "So what?" research questions
  - What have been the outcomes for participating IGERT faculty and students as compared to non-participating faculty and students?
  - Has there been any institutional impact of IGERT funding?
- Respondents: IGERT participants (PIs, department chairs, faculty, students, administrators) and non-IGERT participants (comparison group)

IGERT Comparison Group
- Provides a counterfactual for what would have happened had IGERT not existed
- Needs to control for "academic quality" and variations among STEM disciplines
- Matched each IGERT department to a non-IGERT department with which it competes for graduate students
- Vulnerable to selection bias: outcomes may be due to pre-existing characteristics of IGERT students, not to the IGERT program
- Examples of reported findings:
  - Can say: "IGERT trainees engage in more interdisciplinary activities as graduate students than non-IGERT students."
  - Cannot say: "IGERT causes students to engage in more interdisciplinary activities." (Maybe these students would have sought out interdisciplinary activities regardless.)
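
To make the matching concrete: restrict candidates to the same discipline, then pick the non-IGERT department closest on an observable quality proxy. A hedged sketch; the `rank` field and all data are invented stand-ins for whatever "academic quality" measures the study actually used:

```python
# Minimal sketch: pair each IGERT department with the nearest non-IGERT
# department in the same field on a quality proxy. Data are invented.
igert = [{"dept": "A", "field": "bio", "rank": 12},
         {"dept": "B", "field": "eng", "rank": 30}]
pool  = [{"dept": "X", "field": "bio", "rank": 15},
         {"dept": "Y", "field": "bio", "rank": 40},
         {"dept": "Z", "field": "eng", "rank": 28}]

def closest_match(treated, candidates):
    """Nearest neighbor on rank, restricted to the same field."""
    same_field = [c for c in candidates if c["field"] == treated["field"]]
    return min(same_field, key=lambda c: abs(c["rank"] - treated["rank"]))

pairs = [(d["dept"], closest_match(d, pool)["dept"]) for d in igert]
print(pairs)  # [('A', 'X'), ('B', 'Z')]
```

Matching on observables like this narrows, but cannot remove, the selection bias noted above: unobserved differences in the students who choose IGERT departments remain.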

IGERT Impact Study (cont.)
- Benefits
  - Examined the value of IGERT for departmental recruitment, student preparation, faculty interdisciplinary involvement, and institutional offerings and support for interdisciplinary education
  - Assessed against a counterfactual of "traditional" graduate education
- Limitations
  - Focused on current participants
  - Tested many outcomes: hypothesis generating (not confirming)
  - No data on longer-term outcomes for graduates

2006 IGERT Follow-up Study
- Graduate study: "What?" and "So what?" questions
  - Where do IGERT graduates go and what do they do?
  - Are they any different from non-IGERT graduates?
  - Has IGERT helped prepare them for their chosen careers?
- Samples: IGERT graduates and a comparison group of non-IGERT graduates
- Presenting detailed descriptive data on IGERT graduates
- Limiting outcomes tested with the comparison group to key outcomes (hypothesis confirming, though selection bias remains)
- Challenge: locating graduates
  - Monitoring system had point-of-contact information
  - Easier to find graduates in academic positions than in non-academic positions
  - This introduces sample bias into results; a non-response bias analysis will be conducted during analysis
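
A non-response bias analysis typically compares respondents with non-respondents on characteristics known for everyone in the sample, such as the academic/non-academic split above. A minimal sketch with invented counts:

```python
# Minimal sketch of a non-response bias check: is response status
# independent of a characteristic known for the full sample (job sector)?
# Counts are invented for illustration.
from scipy.stats import chi2_contingency

#        responded  did not respond
table = [[120,       30],    # academic positions
         [ 60,       45]]    # non-academic positions

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
# A small p suggests responders differ systematically from non-responders,
# so unweighted estimates for graduates would lean toward academics.
```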

Faculty Early Career Development (CAREER) Program
- NSF's primary support mechanism for junior faculty members since 1995
- Grants to individual faculty members
- Supports the research and early career advancement of junior researchers
- Promotes the integration of research and education:
  - For individual awardees
  - By changing university culture
- Most recent evaluation: 2005-2008

CAREER: Research Questions
- Descriptive questions re: perceptions of CAREER
  - How do stakeholders at NSF perceive the CAREER program and its relationship to the mission of NSF?
  - How do faculty members in departments that host CAREER awardee(s) view the CAREER program and its relationship to their research and educational missions?
- Impact questions
  - What is the impact of CAREER on the research activities and career advancement of awardees?
  - What is the impact of CAREER on the integration of research and education by faculty members?

CAREER: Methodology
- Descriptive study
  - Interviews with NSF Program Officers
  - Survey of 700 department chairs
  - Site visits to 22 departments
  - Samples representative of the population in question (but no comparison group)

CAREER: Methodology (cont.)
- Quasi-experimental evaluation of impact on awardees
  - CAREER awardees
  - Comparison group of non-awardees with the same research potential and interest in the integration of research and education: unsuccessful CAREER applicants who won another NSF grant as PI within 5 years of their CAREER application
  - Matched using propensity scores (reduces selection bias)
  - Limited outcomes tested (confirming hypotheses)
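
As the slide notes, matching is done on propensity scores: estimate each applicant's probability of being an awardee from observed covariates, then pair each awardee with the non-awardee whose score is closest. A minimal sketch with scikit-learn; covariates and data are invented, not the evaluation's actual variables:

```python
# Minimal sketch of propensity-score matching: logistic regression for the
# score, then 1:1 nearest-neighbor matching. Data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # covariates (e.g., prior grants)
awardee = rng.integers(0, 2, size=200)  # 1 = CAREER awardee, 0 = comparison

# Propensity score: P(awardee | covariates)
ps = LogisticRegression().fit(X, awardee).predict_proba(X)[:, 1]

# Pair each awardee with the comparison case closest in propensity score.
t_idx = np.where(awardee == 1)[0]
c_idx = np.where(awardee == 0)[0]
matches = {int(i): int(c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))])
           for i in t_idx}
print(list(matches.items())[:5])
```

Outcomes are then compared across matched pairs, which removes bias from the observed covariates but, as with any quasi-experiment, not from unobserved ones.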

CAREER: Findings
- Description of how program goals are interpreted within and outside of NSF → informs program management
- Description of characteristics of awarded PIs → NSF program reporting (GPRA, etc.)
- Assessment of the grant's impact on awardees ("Receipt of a CAREER award increases the likelihood of receiving tenure") → informs decisions about program continuation or modification

Lessons learned
- Know thy program: until you understand the intervention, you cannot assess outcomes
  - Build a logic model; articulate goals
  - Develop indicators/measures of program success
- Clearly define your research questions
  - Prioritize: you cannot evaluate everything
  - Identify data needs for reporting and decision-making
  - Be realistic: ask questions that can be answered, about indicators that can be measured (example: CAREER's "impact on institutional culture")

Lessons learned (cont.)
- Identify appropriate comparison groups
  - What's the right counterfactual? Each comparison option allows you to answer different questions; choose the option that best addresses the research questions.
  - IGERT: other interdisciplinary programs? Same or different institutions? All STEM students nationwide?
- Choose the right level of rigor (developing or testing hypotheses?)
  - Consider the risk of selection bias
  - Consider change over time (longitudinal studies; pre/post); see the sketch after this list
- Take advantage of available data
  - National datasets
  - CAREER: data were available to do propensity-score matching
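
On the pre/post point: combined with a comparison group, change over time supports a difference-in-differences estimate, which credits the program only with change beyond the comparison group's trend. A minimal sketch with invented numbers:

```python
# Minimal sketch of a pre/post comparison with a comparison group
# (difference-in-differences). All numbers are invented.
pre_prog, post_prog = 2.1, 3.4   # mean outcome, program group
pre_comp, post_comp = 2.0, 2.6   # mean outcome, comparison group

did = (post_prog - pre_prog) - (post_comp - pre_comp)
print(f"Estimated program effect: {did:.2f}")  # 0.70
```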

Lessons learned (cont.)
- Ground each subsequent phase in findings from previous work
  - Work from exploratory/descriptive evaluation toward more summative/confirmatory evaluation
  - Each phase can answer questions raised (or not answered) in previous phases
  - IGERT: Implementation → Impact → Graduate Follow-up
- Take advantage of different levels of data collection
  - Qualitative versus quantitative; single-site versus cross-site
  - IGERT: the richness of single-site visits enabled future studies

Lessons learned (cont.)
- Think long term
  - Begin evaluation when the program begins
  - Plan now for the information you will need in the future
  - IGERT: tracking graduates
  - New study (GK-12): building a comparison group today for work in the future

Questions?