Program Quality Assessment Duane K. Larick North Carolina State University Council Of Graduate Schools New Deans Institute July, 2007.

Assessment and Review  Outline of Presentation Why review/assess graduate programs A two-phase review process

Why Review/Assess Graduate Programs  External Considerations National Image – high quality programs yield superior graduates To help satisfy calls for accountability  Especially at the State level Requirement for regional accreditation, licensure, etc.

Why Review/Assess Graduate Programs  External Considerations – Current Debate Between Margaret Spelling - Commission on the Future of Higher Education Senate Education Committee - Higher Education Act Council for Higher Education Accreditation

Why Review/Assess Graduate Programs  Internal Considerations Meet long-term (strategic) institutional/departmental goals  Funding allocation/reallocation Meet short-term (tactical) objectives or targets  Program quality improvement  Helps chart new program directions Advanced understanding of factors influencing graduate education  Aids in identification of common programmatic needs or issues (retention/attrition)

Program Quality Assessment – A Two-Phase Process  External Review – completed on a periodic basis  Internal Review – continuous, ongoing outcomes-based assessment

Marilyn J. Baker; revised and updated by Margaret King, Duane Larick, and Michael Carter, NC State University

Key Features of Periodic Reviews  Objective  Evaluative, not descriptive  Forward-looking: focus on improvement of program  Based on program’s academic strengths and weaknesses, not just ability to attract funding  Action-oriented: clear, concrete recommendations to be implemented

Questions Answered by Periodic Review  Is the program advancing the state of the discipline or profession?  Is its teaching and training of students effective?  Does it meet institutional goals?  Does it respond to the profession’s needs?  How is it assessed by experts in the field?

Issues to be Resolved Before Beginning  Locus of control  Graduate-only or comprehensive program review  Counting—and paying—the costs  Master’s and doctoral programs  Coordination with accreditation reviews  Scheduling the reviews  How to handle interdisciplinary and interdepartmental programs

Key Elements of a Successful Periodic Review  Clear, Consistent Guidelines The purpose of graduate program review The process to be followed A generic agenda for the review The use to which results will be put

Key Elements of a Successful Periodic Review  Administrative Support Central administrative support for the larger review process Adequate and accurate institutional data, consistent across programs Departmental resources: time, funding, secretarial help, etc.

Key Elements of a Successful Periodic Review  Program Self-Study Provide a clear outline of what should be included Engage the program faculty in a thoughtful evaluation of:  The program’s purpose(s)  The program’s effectiveness in achieving these purposes  The program’s overall quality  The faculty’s vision for the program  How they propose to get there

Key Elements of a Successful Periodic Review  Review Committee On-Campus Representation  A representative of the Graduate School  Internal reviewer from a field that gives him/her some understanding of the program(s) being reviewed External Reviewer(s)  Number of reviewers depends on scope and kind of review  Selection process can vary – programs can have input but should not make the final decision

Key Elements of a Successful Periodic Review  Final Report by Review Team (outcome 1) Brief overview of program Strengths of program Areas for improvement Specific recommendations for implementation

Key Elements of a Successful Periodic Review  Program Faculty’s Response to Report (outcome 2) Clear up errors or misunderstandings Respond to the recommendations (have implemented, will implement, will consider implementing, cannot implement and why)

Key Elements of a Successful Periodic Review  Implementation (outcome 3) Discussion of the recommendations with program faculty for buy-in One or more meetings of key administrators (department, college, graduate school) to discuss recommendations An Action Plan or memorandum of understanding drawn up and agreed on by all participants A meeting with key university administrators to discuss action items that cannot be addressed at the program/college level Integration of the action plan into the institution’s long-range planning and budget process

Key Elements of a Successful Periodic Review  Follow Up An initial report on progress toward implementation of the Action Plan (1 or 2 years out) Follow-up reports until the Action Plan is implemented or priorities change Discussion of recommendations and implementation in the self-study for the next review

Outcomes-Based Assessment  What is outcomes assessment?  Benefits of outcomes assessment  Issues to be addressed before beginning  Outcomes assessment: a six-step implementation process

What is Outcomes Assessment?  It is a process that engages program faculty in asking three questions about their programs What are our expectations for the program? To what extent is our program meeting our expectations? How can we improve our program to better meet our expectations?

What is Outcomes Assessment?  It is a process that provides program faculty the means to answer these questions By creating outcomes for their program By gathering and analyzing data to determine how well the program is meeting the outcomes By applying the results of their assessment toward improving their program

Features of Successful Outcomes Assessment  It should give faculty full ownership of the process  It should make clear who will see assessment results and how they will be used  It should focus on program improvement

Benefits of Outcomes Assessment  It gives faculty a greater sense of ownership of their programs  It provides stakeholders a clearer picture of the expectations of programs  It helps institutions meet accreditation requirements

Issues to be Addressed Before Beginning  Locus of control  Coordination with other assessment efforts How will it be integrated with Periodic Review?  Administrative resources Planning Training Data management Reporting Operational support Program improvement support

Outcomes Assessment: 6-Step Process 1. Establish objectives and outcomes 2. Identify data for assessing outcomes 3. Create and implement an assessment plan 4. Evaluate assessment data and implement change 5. Report results of assessment at regular (relatively short) intervals 6. Modify/revise outcomes and assessment plan as needed

Summary: Keys to Success of Program Review  The faculty should want to engage in the process  The faculty must use the information collected  Demonstrate change as a result of findings  The institution must use the information collected  Results should be tied to resource decisions

Questions & Discussion

Managerial Tools Created for Program Review – Review Document Management

Three Common Objectives  Developing students as successful professionals in the field  Developing students as effective researchers in the field  Maintaining/enhancing the overall quality of the program

Example for Outcome 1 – Professional Development 1. To enable students to develop as successful professionals for highly competitive positions in industry, government, and academic departments, the program aims to provide a variety of experiences that help students to: a. achieve the highest level of expertise in XXXX, mastery of the knowledge in their fields and the ability to apply associated technologies to novel and emerging problems b. present research to local, regional, national, and international audiences through publications in professional journals and conference papers given in a range of venues, from graduate seminars to professional meetings c. participate in professional organizations, becoming members and attending meetings d. broaden their professional foundations through activities such as teaching, internships, fellowships, and grant applications