AGEP Evaluation Capacity Meeting 2008
Yolanda George, Deputy Director, Education & Human Resources Programs

2 Objectives
Identifying methods and questions for evaluation studies related to STEM graduate student progression to the PhD and the professoriate, including admissions/selection, retention/attrition, PhD completion, and postdoctoral experiences, as well as the collection of quantitative and qualitative data.
Identifying methods and questions for Alliance evaluations, particularly in terms of progression to the PhD and the professoriate.
What can AGEPs learn from cross-institutional studies?

3 As you listen to presentations…
What research informed the design of the study?
What type of data was collected? What was the rationale for deciding to collect this data?
What methods were used? What was the rationale for selecting those methods?
How were comparison groups constructed? What are the reporting limitations with regard to the construction of the comparison groups?

4 Another Objective for this AGEP Meeting
Developing and writing impact statements or highlights (nuggets) that include data, for use in:
AGEP NSF Annual Reports (Findings section)
AGEP Supplemental Report Questions
NSF Highlights
Brochures and Web sites

5 The poster should include quantitative and qualitative data that provide evidence of:
Graduate student changes, for selected STEM fields or all STEM fields
Infrastructure changes, including changes in institutional or departmental policies or practices
Alliance impact, including changes in institutional or departmental policies or practices related to graduate school affairs, postdoctoral arrangements, or faculty hiring
Stories and pictures are welcome, but the major emphasis must be on quantitative and, as appropriate, qualitative data. Program descriptions should be kept to a minimum and put in the context of the data behind decisions to keep or eliminate strategies. A focus can be on what works and what doesn't, as long as the emphasis is on the data showing whether different strategies worked or not.

6 Impact Evaluations and Statements
An impact evaluation measures a program's effects and the extent to which its goals were attained. Although many evaluation designs can produce useful information about a program's effectiveness, some produce more useful information than others:
Designs that track effects over extended time periods (time-series designs) are generally superior to designs that simply compare the periods before and after the intervention (pre-post designs).
Comparison-group designs are superior to designs that lack any basis for comparison.
Designs that use true control groups (experimental designs) have the greatest potential for producing authoritative results.
(A numeric contrast of the pre-post and comparison-group designs is sketched below.)
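For illustration only, here is a minimal sketch, in Python with entirely hypothetical retention figures, of why a comparison-group (difference-in-differences) estimate can be more credible than a simple pre-post comparison; none of these numbers come from actual AGEP data.

```python
# Hypothetical retention rates before and after a program, for a participating
# department and a non-participating comparison department. All figures invented.
program_pre, program_post = 0.62, 0.74
comparison_pre, comparison_post = 0.60, 0.66

# Pre-post design: attributes the entire change to the program, even if
# retention was rising everywhere for other reasons.
pre_post_effect = program_post - program_pre

# Comparison-group design (difference-in-differences): subtracts the change
# seen in the comparison group, isolating the change unique to the program.
did_effect = (program_post - program_pre) - (comparison_post - comparison_pre)

print(f"Pre-post estimate:         {pre_post_effect:+.2f}")  # +0.12
print(f"Difference-in-differences: {did_effect:+.2f}")       # +0.06
```

In this invented example the pre-post design would overstate the program's effect by a factor of two, which is the point of the slide: a basis for comparison matters.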

7 and 8 (no text captured for these slides)

9 Strategies that Matter for Graduate Student Retention & Progression to the PhD
Student admissions/selection criteria
Financial aid packages that reduce debt burden
Mentoring (faculty and staff)
Supplementary academic support in writing, statistics, and other subjects
Social integration into the department
Early intellectual integration into research projects
Research productivity (posters, papers, etc.)
Attention to PhD milestones
Attention to family/work balance
Institutional and departmental programs and practices

10 Given limited evaluation budgets:
Use evaluators to conceptualize and design evaluation instruments.
Don't evaluate every component of the program each year.
Look for natural opportunities to conduct an evaluation.
Make evaluation a part of the implementation project.
Use electronic student systems (see the sketch after this list).
Involve all faculty and staff in data collection and analysis.
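As one illustration of drawing on electronic student systems, here is a minimal sketch, assuming a hypothetical registrar export named graduate_records.csv with entry_year and status columns, of computing cohort retention rates from data the institution already collects; the file name and column names are invented.

```python
import csv
from collections import defaultdict

enrolled = defaultdict(int)  # students entering each cohort year
retained = defaultdict(int)  # of those, still enrolled or graduated

with open("graduate_records.csv", newline="") as f:  # hypothetical export
    for row in csv.DictReader(f):
        cohort = row["entry_year"]                   # hypothetical column names
        enrolled[cohort] += 1
        if row["status"] in ("enrolled", "graduated"):
            retained[cohort] += 1

for cohort in sorted(enrolled):
    rate = retained[cohort] / enrolled[cohort]
    print(f"{cohort}: {retained[cohort]}/{enrolled[cohort]} retained ({rate:.0%})")
```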

11 Work Groups
Two Groups
1. What types of studies and evaluations are you already doing to measure retention/attrition or progression to the PhD? What types of comparison groups are you using in these studies?
Two Groups
2. What types of studies and evaluations are you already doing to measure institutional impact? What types of comparison groups are you using in these studies?
Lead Alliance Leaders
3. What types of studies and evaluations are you already doing to measure Alliance impact? What types of comparison groups are you using in these studies?

12 Work Groups Continued (All Groups)
One Group
4. What types of studies and evaluations are you already doing to measure progression and retention in the professoriate? What types of comparison groups are you using in these studies?
All Groups
5. What are other natural opportunities for collecting evaluation data? What types of comparison groups would you use in these studies?
6. What are some solutions to IRB challenges?

13 Homework
Write an impact statement about graduate student changes as a result of AGEP.
Write an impact statement about your institutional changes as a result of AGEP.
Write an impact statement about your Alliance as a result of AGEP.
Write an impact statement about progression and retention in the STEM professoriate as a result of AGEP.

14 In summary, evaluation is:
An examination of something in order to judge its value, quality, importance, extent, or condition
Part of ongoing program implementation
A meaningful activity involving the entire project team, including faculty and administrators