CPATH Principal Investigators Meeting: Program Evaluation Update
March 26, 2010
Center for Education Policy, SRI International
Dr. Raymond McGhee and Dr. Nancy Adelman
© 2007 SRI International

Slide 2: Presentation Goals
- Provide background on the program evaluation, its leadership, and the evaluation team
- Describe the purposes of the evaluation
- Describe each component of the program evaluation and how it affects the projects and their leadership
- Share lessons learned from the evaluation design and data collection so far
- Answer questions to clarify any aspect of the evaluation

Slide 3: Evaluation Leadership and Experience
- Contracted by NSF to serve as an outside, third-party evaluation contractor
- Nancy Adelman – Project Director
- Ray McGhee – Deputy Director
- Team of twelve researchers and analysts
- Experience in large-scale program evaluations

Slide 4: Program Evaluation Purposes
- Describe and document program implementation
- Assess how the program is promoting change in undergraduate computing
- Support NSF in its reporting to the Academic Competitiveness Council (ACC)

Slide 5: Project Evaluation Purposes
- Provide formative feedback to facilitate project refinements
- Describe initial implementation activities, successes, and challenges
- Monitor the status of project activities
- Collect evidence of project successes and outcomes

Slide 6: Relation between Program and Project Evaluation
- Data from projects will be incorporated into the program evaluation's monitoring efforts
- Special project evaluators will provide data used to document the program's impact
- Data from the program evaluation will be reported in the aggregate and disseminated to all projects

Slide 7: Evaluation Data Sources 1. Monitoring Survey
- Data on project implementation for all projects
- Data will be used by NSF to monitor and track project activities
- Data analyses will be made available to projects in the aggregate
- Timeline of first data collection: February 2010 – mid-April 2010
- Analysis and reporting: late May 2010
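As a minimal sketch of what "reporting in the aggregate" can look like in practice, the example below summarizes hypothetical monitoring-survey responses at the cohort level rather than the project level. The column names, values, and use of pandas are illustrative assumptions, not taken from the CPATH survey instrument or the evaluation team's actual tooling.

```python
# Hypothetical sketch: summarizing monitoring-survey responses in the aggregate,
# so findings shared back with projects never expose a single project's row.
# Column names and values are invented for illustration.
import pandas as pd

responses = pd.DataFrame({
    "project_id":     ["P01", "P02", "P03", "P04"],
    "cohort_year":    [2009, 2009, 2010, 2010],
    "activity_count": [5, 8, 3, 6],
    "pct_goals_met":  [0.60, 0.75, 0.40, 0.55],
})

# Report only cohort-level summaries (counts and means), never project-level rows.
aggregate = responses.groupby("cohort_year").agg(
    n_projects=("project_id", "count"),
    mean_activities=("activity_count", "mean"),
    mean_pct_goals_met=("pct_goals_met", "mean"),
)
print(aggregate)
```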

Slide 8: Evaluation Data Sources 2. Site Visits
- Data collected from a cross-section of projects in alternating years to document project implementation in context, from the participants' perspectives
- Purposive sample with case selection criteria: project type, cohort year, location, institution type
- Data will be analyzed across projects and reported in the aggregate to NSF and the projects
- Timeline: April – May (some visits in summer)
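To make the idea of a purposive sample concrete, here is a small hypothetical sketch that keeps one site-visit case per combination of selection criteria. The project records, field names, and criteria values are invented; in practice case selection is a judgment made by the evaluation team, not a script.

```python
# Hypothetical sketch: purposive case selection for site visits, keeping one
# project per combination of criteria named on the slide (project type,
# cohort year, institution type). Records and values below are invented.
from itertools import groupby
from operator import itemgetter

projects = [
    {"id": "P01", "type": "Type I",  "cohort": 2009, "institution": "research university"},
    {"id": "P02", "type": "Type I",  "cohort": 2009, "institution": "teaching college"},
    {"id": "P03", "type": "Type II", "cohort": 2010, "institution": "research university"},
    {"id": "P04", "type": "Type I",  "cohort": 2010, "institution": "research university"},
]

criteria = itemgetter("type", "cohort", "institution")
# Sort by the criteria, then keep the first project in each criteria group.
cases = [next(iter(group)) for _, group in groupby(sorted(projects, key=criteria), key=criteria)]
for case in cases:
    print(case["id"], criteria(case))
```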

Slide 9: Evaluation Data Sources 3. Analysis of Special Evaluations
- The America COMPETES Act (H.R. 2272) mandates rigorous designs for evaluating STEM education programs
- SRI is supporting 24 projects conducting evaluations with quasi-experimental designs and will incorporate their findings into the program-level analysis
- Coordination and communication between the evaluator and the PI/project team will be important
- Timeline: April – August 2010
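The slide does not say which quasi-experimental designs the 24 projects use; purely as an illustration, the sketch below computes a difference-in-differences estimate, one common quasi-experimental comparison of a participating group against a non-equivalent comparison group. All numbers are made up and do not come from any CPATH project.

```python
# Hypothetical sketch of one common quasi-experimental analysis
# (difference-in-differences). The outcome means below are invented.
pre_treated, post_treated = 62.0, 71.0          # mean outcome, participating group
pre_comparison, post_comparison = 60.0, 64.0    # mean outcome, comparison group

# The effect estimate is the participating group's gain minus the comparison
# group's gain, which nets out trends shared by both groups over time.
effect = (post_treated - pre_treated) - (post_comparison - pre_comparison)
print(f"Difference-in-differences estimate: {effect:+.1f} points")
```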

Slide 10: Evaluation Data Sources 4. Faculty Survey
- Sampling faculty from all projects
- Focus on faculty attitudes and practices within departments promoting change in computing
- Survey development and piloting will occur in fall 2010
- Data will be presented to NSF and projects in the aggregate
- Timeline: February – April, 2011 and 2013

Slide 11: Additional Resources from SRI
- CPATH Monitor Web site
- Hosting meetings of the special evaluators to support quasi-experimental designs
- Assistance in completing the monitoring survey

Slide 12: Lessons Learned
- The evolving nature of computing education reform and the diverse reform activities being implemented
- The role of institutional and regional context
- Avoiding premature assessment of program impact
- Growing interest among different stakeholders
- Shared interests and benefits related to evaluation (for the CPATH program and the projects)

Slide 13: Summary of Next Steps in Evaluation
- Finalizing monitoring survey data collection and analysis
- Conducting site visits and analyzing the data
- Meeting with special evaluators and collecting impact data
- Faculty survey development, piloting, clearance, and administration
- Annual reporting to NSF and the ACC

Slide 14: Questions & Answers / Wrap-Up