AGEP Evaluation Capacity Meeting 2008 Yolanda George, Deputy Director, Education & Human Resources Programs.

Presentation transcript:

AGEP Evaluation Capacity Meeting 2008 Yolanda George, Deputy Director, Education & Human Resources Programs

2 Objectives
- Identifying methods and questions for evaluation studies related to STEM graduate student progression to the PhD and the professoriate, including admissions/selection, retention/attrition, PhD completion, and postdoctoral experiences, and covering the collection of both quantitative and qualitative data (see the sketch below for the kind of progression data involved).
- Identifying methods and questions for Alliance evaluations, particularly in terms of progression to the PhD and the professoriate.
- What can AGEPs learn from cross-institutional studies?
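
As a concrete illustration of this kind of progression data, here is a minimal sketch of computing cohort completion and attrition rates. All field names and numbers are invented for illustration; they are not drawn from actual AGEP data.

```python
# Minimal sketch: PhD completion and attrition rates from hypothetical
# cohort-tracking data. All numbers are invented for illustration.
cohorts = {
    2001: {"enrolled": 40, "completed": 18, "left": 12},
    2002: {"enrolled": 45, "completed": 21, "left": 13},
}

for year, c in sorted(cohorts.items()):
    # Students neither completed nor departed are still in the pipeline.
    still_enrolled = c["enrolled"] - c["completed"] - c["left"]
    print(f"{year} cohort: completion {c['completed'] / c['enrolled']:.0%}, "
          f"attrition {c['left'] / c['enrolled']:.0%}, "
          f"still enrolled {still_enrolled}")
```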

3 As you listen to presentations…
- What research informed the design of the study?
- What types of data were collected, and what was the rationale for deciding to collect them?
- What methods were used, and what was the rationale for selecting them?
- How were comparison groups constructed? What are the reporting limitations with regard to the construction of the comparison groups?

4 Another Objective for this AGEP Meeting
Developing and writing impact statements or highlights (nuggets) that include data, for use in:
- the Findings section of AGEP NSF Annual Reports
- AGEP Supplemental Report Questions
- NSF Highlights
- brochures and Web sites

5 The poster should include quantitative and qualitative data that provide evidence of:
- Graduate student changes, for selected STEM fields or all STEM fields
- Infrastructure changes, which can include changes in institutional or departmental policies or practices
- Alliance impact, which can include changes in institutional or departmental policies or practices related to graduate school affairs, postdoctoral arrangements, or faculty hiring
Stories and pictures are welcome, but the major emphasis must be on quantitative and, as appropriate, qualitative data. Program descriptions should be kept to a minimum and put in the context of the data behind decisions to keep or eliminate strategies. The focus can be on what works and what doesn't, as long as the emphasis is on the data showing whether different strategies worked or not.

6 Impact Evaluations and Statements
An impact evaluation measures a program's effects and the extent to which its goals were attained. Although many evaluation designs can produce useful information about a program's effectiveness, some produce more useful information than others:
- Designs that track effects over extended time periods (time-series designs) are generally superior to those that simply compare periods before and after the intervention (pre-post designs).
- Comparison-group designs are superior to those that lack any basis for comparison.
- Designs that use true control groups (experimental designs) have the greatest potential for producing authoritative results.
The sketch below contrasts two of these designs on made-up numbers.
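
To make the ranking concrete, here is a minimal sketch contrasting a simple pre-post estimate with a comparison-group (difference-in-differences) estimate. All numbers are hypothetical.

```python
# Hypothetical mean outcome (e.g., annual PhD completions) before and
# after an intervention; numbers invented for illustration.
program_pre, program_post = 10.0, 16.0        # AGEP institutions
comparison_pre, comparison_post = 9.0, 12.0   # non-AGEP comparison group

# Pre-post design: attributes the entire change to the program,
# ignoring any background trend.
pre_post_effect = program_post - program_pre

# Comparison-group design: subtracts the change in the comparison group,
# removing a shared background trend (difference-in-differences).
did_effect = (program_post - program_pre) - (comparison_post - comparison_pre)

print(f"pre-post estimate:         +{pre_post_effect:.1f}")
print(f"difference-in-differences: +{did_effect:.1f}")
```

Here the pre-post design credits the program with the full +6.0 change, while the comparison-group design attributes half of that change to a background trend shared with the non-AGEP institutions.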

7 (image-only slide: no text captured in the transcript)

8 (image-only slide: no text captured in the transcript)

9 Multiple Evidence Collection for AGEP
- AAAS & Campbell-Kibler Associates, Inc. (collecting trend data): Are the numbers changing? (a minimal trend-check sketch follows below)
- Coming soon: Carlos Rodriguez, Impact Evaluation with Comparisons
- Portfolio Assessment (announced by Bernice Anderson yesterday)
- Alliance Evaluation: Alliance-level evaluation (Annual Reports that include highlights with comparisons, if appropriate)
You might want to re-evaluate your Alliance design in light of the need to show attribution. Read the ACC report.
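
As a minimal illustration of the trend-data question ("Are the numbers changing?"), the sketch below fits a least-squares slope to yearly counts. The data values are invented for illustration.

```python
# Hypothetical yearly PhD counts; fit a least-squares slope to check
# whether the numbers are trending up or down.
years = [2003, 2004, 2005, 2006, 2007]
phds = [12, 14, 13, 17, 19]  # invented counts, not AGEP data

mean_x = sum(years) / len(years)
mean_y = sum(phds) / len(phds)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, phds))
         / sum((x - mean_x) ** 2 for x in years))
print(f"average change: {slope:+.1f} PhDs per year")  # +1.7 for these numbers
```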

10 Highlights (Nuggets)
- Send highlight to
- Include highlights in your annual reports to NSF (AGEP Supplemental Annual Report Questions)
- Send highlight to your communication office.
- Include highlights on your Web sites, brochures, & posters

11 Evaluation Capacity Tool Kit for AGEPs
- What do you want in the toolkit?
- Do you have sample evaluations that you want to include in the evaluation toolkit?

12 Thank You