Presentation by MDRC for the Completion by Design Cross-Cadre Retreat, Charlotte, NC, February 2013.

Presentation transcript:

 MDRC Deep Dive on Institutional Change, Student Experience, and Cost
 Student Progress Measures (KPIs)
 Convenings: Cadre-Wide and Cross-Cadre
 Quarterly College Reflection Meetings
 College- and Foundation-Generated Reports
 CCRC Research & Tools on Redesigning Community Colleges

 Feedback from the Cross-Cadre Advisory Committee
◦ Focus on institutional change (including culture change)
◦ Less focus on fast trials of individual interventions
 Excellent work by colleges to set realistic, evidence-based improvement targets on Key Performance Indicators (KPIs)
 Feedback from experts on how to research institutional change in higher education

Principle: Focus on the most distinctive aspects, to learn about the change process and the new student experiences.
Change: MDRC will not conduct a student outcome analysis; the cost study and qualitative interviews with students remain.

Principle: Maximize relevance to the field by providing other colleges with the practical information they need when considering whether to adopt CBD-like changes (e.g., what barriers will my college face, and how much does it cost?).
Change: Enrich the change process study by adding more interviews with administrators and faculty and more observation; maintain the cost study; increase interviews with students.

Principle: Focus at the initiative level.
Change: To afford more in-depth qualitative data collection, go deep on a representative sample of colleges that reflects the diversity of the field.

 Provide the higher education field with practical information to consider when aspiring to CBD-like transformation
◦ E.g., the time, money, and skills needed to make these types of changes, and how students experience the changes
 Build knowledge about institutional change in higher education in general
 The study is NOT designed or intended to evaluate each individual college

1. How does the systemic change envisioned by CBD occur?
◦ What changes do the colleges make in pursuing the CBD goals?
◦ What factors facilitate or inhibit change? What is the role of cross-college fertilization?
2. What does change cost?
◦ What is the cost-effectiveness of the new student pathway compared to the pre-CBD pathway?
◦ How do the colleges cover these start-up and ongoing costs?
◦ What are the revenue implications?
3. How do students experience the changes?

 To develop a rich and nuanced understanding of what happens when colleges make these types of changes, we plan to study the process of change in depth in a subset of the CBD colleges
 Sample colleges will be chosen to represent the diversity of community colleges nationwide, with variation in size, region, student mix, degree mix, and data availability
 Case study colleges will not be identified in reports; readers should see these colleges as archetypes of community colleges, not particular institutions

 Track the process of change by:
◦ Talking to administrators, staff, and faculty multiple times a year
◦ Observing key activities
◦ Reviewing documents
 Understand student experiences by:
◦ Talking to students multiple times a year
◦ Reviewing changes in the KPIs
 Track costs (start-up and maintenance) and revenue implications by:
◦ Reviewing budget and expenditure data
◦ Talking to key staff and administrators

 KPIs from all colleges will be used to put the case study colleges' experiences in context
 All colleges will review early findings and provide input on how the case study colleges' experiences compare to their own
 The cross-cadre evaluation advisory committee will continue to provide feedback, as well as advice on sharing findings with the field

 Jean Grossman: Project Director
 Sue Scrivener: Project Manager and Co-Lead of the Institutional Change Study
 Janet Quint: Co-Lead of the Institutional Change Study
 Adriana Kezar: Senior Advisor, Rossier School of Education, University of Southern California