Assessment & Review of Graduate Programs - Doctoral Duane Larick, NC State University Michael Carter, NC State University Margaret King, NC State University.


Assessment & Review of Graduate Programs - Doctoral Duane Larick, NC State University Michael Carter, NC State University Margaret King, NC State University Council of Graduate Schools Pre-Meeting Workshop December 7, 2005

Guidelines for This Presentation Please feel free to raise questions at any time during the presentation. We have included discussion questions along the way. We are very interested in your participation through questions and sharing of experiences from your campus. We will also leave time at the end for general discussion.

Agenda Introduction and Objectives Overview of Graduate Program Review Reasons for Graduate Assessment General Process of Program Review Process or Processes for Development of a Program Review Procedure External program review Outcome based – continuous & ongoing review Comparative Data Sources Summary and Discussion

Objectives Discuss various motivators for undertaking graduate assessment Increase overall awareness of recent trends in Graduate Program Review Demonstrate practical experience/knowledge gained related to development and implementation of external reviews and outcome-based continuous and ongoing procedures for Graduate Program Review Illustrate examples of data and managerial tools developed/utilized to improve the efficiency of the process

Background Information About Our Audience How many of you are responsible for graduate program review at your institutions? How many of you have this as a new responsibility? How many of you have recently (or are considering) changing your procedure?

Why Assess Graduate Programs?

The primary purpose should be to improve the quality of graduate education on our campuses By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-seeing, and as apolitical as possible

Why Assess Graduate Programs? External Drivers: To help satisfy calls for accountability Especially at the State level

State Mandated Evaluation of New Programs All new degree program proposals must include an evaluation plan that includes: the criteria to be used to evaluate the quality and effectiveness of the program measures to be used to evaluate the program expected levels of productivity of the proposed program for the first four years of operation (number of graduates) a plan and schedule to evaluate the proposed new degree program prior to the completion of its fifth year

State-Mandated 5th-Year Review - Issues  Statewide Productivity Assessment of Graduate Programs  Capacity in Relation to Student Demand  Capacity in Relation to Occupational Demand  Centrality in Relation to Instructional Mission  Success of Graduates  Program Costs

Low Productivity Analysis - Elements of Statewide Analysis for Each Program Area to be Reviewed Trends in enrollment and degrees granted Student characteristics Program costs Occupational demand Recommendations for expansion or elimination of programs on a statewide basis

Why Assess Graduate Programs? External Drivers: Requirement for regional accreditation, licensure, etc.

Regional Accreditation Agencies Southern Association of Colleges and Schools Western Association of Schools and Colleges Northwest Association of Colleges and Schools North Central Association New England Association of Schools and Colleges Middle States Commission on Higher Education

SACS Principles of Accreditation Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”

SACS Criterion for Accreditation Section 3 – Comprehensive Standards - #16 “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

Western Association of Schools & Colleges Accreditation Standards 1.2. Educational objectives are clearly recognized throughout the institution and are consistent with stated purposes. The institution has developed indicators and evidence to ascertain the level of achievement of its purposes and educational objectives. The institution employs a deliberate set of quality assurance processes at each level of institutional functioning, including new curriculum and program approval processes, periodic program review, ongoing evaluation, and data collection. These processes involve assessments of effectiveness, track results over time, and use the results of these assessments to revise and improve structures and processes, curricula, and pedagogy.

Intent of Accreditation Agency Effort The intent of the regional accrediting agencies is to “encourage” institutions to create an environment of planned change for improving the educational process

Other Accreditation Agencies Education, Architecture, Engineering, etc. Often focused on the minimum standards required The department’s approach to developing the self-study and the review focuses on demonstrating achievement of those standards – not necessarily on program improvement

Why Assess Graduate Programs? Internal Drivers: Meet short-term (tactical) objectives or targets Enrollment Growth & Funding Meet long-term (strategic) institutional or departmental goals Funding allocation/reallocation Understand sources of retention/attrition among students and faculty Funded project evaluation (GAANN, IGERT)

Discussion Questions What other external and internal drivers exist on your campuses?

So The Questions We Need To Ask Ourselves Are What are we currently doing? Why are we currently doing it? Is what we are currently doing accomplishing the external goals described above? Is what we are currently doing accomplishing the internal goals described above? Is there a better way? Who defines better?

Procedure(s) for Review of Doctoral Graduate Programs External program review conducted on a 5 – 10 year cycle Standard practice at most Institutions Outcome-based continuous and ongoing program review Being implemented by many in response to regional and state accreditation requirements and institution needs

Key Characteristics of External Program Reviews Program review is evaluative, not just descriptive More than merely a compilation of data, it requires academic judgment of the data Review of graduate programs is forward looking It is directed toward improvement of the program, not simply assessment of its current status

Key Characteristics of External Program Reviews - continued Programs should be reviewed on the basis of academic strengths and weaknesses, not on their ability to generate funding Finances and funding should be relevant only as they affect the quality of the academic program To the extent possible, program review should be an objective process

Key Characteristics of External Program Reviews - continued Graduate program review should be an independent process, distinct from any other review Efficiency can be gained by incorporating graduate program review with other internal or external reviews but, to be effective, graduate program review must lead to its own set of conclusions and direct its recommendations to the faculty and administrators who have the power to improve the graduate program

Key Characteristics of External Program Reviews - continued Most importantly, program review MUST result in action Based on the self-study, reviewers’ comments and recommendations, and faculty and administrator response to the review report, the institution develops and agrees on a plan to implement the desired changes This plan must be linked to the institution’s planning and budget process

Successful Graduate Program Review Answers the Following Questions Is the program advancing the state of the discipline or profession? Is its teaching and training of students effective? Does the program meet the institution’s goals? Does it respond to the profession’s needs? How is it assessed by experts in the field?

General Process for External Reviews Operational Procedures: multi-year review cycle Components: Internal self-study report External team review Review team’s report Program’s response Administrative meeting

Issues to be Resolved Before Beginning Program Reviews Locus of Control – Administration of Review Process Comprehensive reviews are often coordinated by the office of the college or school dean or the chief academic officer Graduate program reviews are often coordinated by the graduate dean

Issues to be Resolved Before Beginning Program Reviews - continued Regardless of who controls the review, the following principles should apply: All reviews should involve the college or school administration The graduate dean should play a major leadership role in all graduate reviews The essential participants in any graduate program review are the chief academic officer, college administration, graduate dean, department chair, graduate program administrator, graduate program faculty, review team(s) and graduate students in the program

Issues to be Resolved Before Beginning Program Reviews - continued Counting – and Paying – the Costs A realistic estimate of the costs must be made and an agreement must be reached regarding who will pay them Costs include: Travel, accommodations and meals for reviewers, honoraria for reviewers, etc. Costs for developing and reproducing review documents, etc.

Issues to be Resolved Before Beginning Program Reviews - continued Graduate Versus Overall Program Review? Advantages to graduate-only review Allows for a thorough, in-depth review of the graduate program Attention focused on quality indicators unique to graduate education No risk of the graduate program review being “overwhelmed” by the size of the undergraduate program

Issues to be Resolved Before Beginning Program Reviews - continued Graduate Versus Overall Program Review? Advantages to comprehensive review Potential savings in time and money Does not subject departments to multiple separate reviews Graduate and undergraduate programs, as well as research and outreach activities, are interdependent Matters like faculty teaching loads, program and departmental budgets, facilities and quality of teaching and research experience may be more adequately addressed

Scheduling Reviews More well-meaning plans for graduate program review have foundered on an unworkable timetable than on any other obstacle! The recommendation is a 5 – 7 year cycle This depends on the number of programs and the resources available Programs may be grouped by department, college, etc. for review. The review “unit” should be established prior to scheduling
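For a 5 – 7 year cycle, the scheduling problem is essentially one of spreading the review units evenly across the years of the cycle. The sketch below is a minimal illustration of that idea only; the unit names, cycle length, and starting year are hypothetical placeholders, not part of any institution's actual procedure.

```python
# Minimal sketch: spread review "units" (departments, colleges, etc.) across
# the years of a review cycle so each year carries a similar workload.
# Unit names, cycle length, and start year are illustrative placeholders.
from itertools import cycle

def build_schedule(units, cycle_years=7, start_year=2006):
    """Assign each review unit to a year, round-robin, to balance the workload."""
    years = cycle(range(start_year, start_year + cycle_years))
    return {unit: next(years) for unit in sorted(units)}

if __name__ == "__main__":
    units = ["Chemistry", "History", "Plant Biology", "Statistics", "Textiles"]
    for unit, year in build_schedule(units).items():
        print(f"{unit}: review scheduled for {year}")
```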

Issues to be Resolved Before Beginning Program Reviews - continued Scheduling Reviews – continued Factors to consider in determining the order of programs for review: Length of time since last review Compelling financial problems or resource needs Major proposals for curricular change Upcoming accreditation or external reviews Faculty or administration desire for review

Issues to be Resolved Before Beginning Program Reviews - continued Scheduling Reviews – continued The schedule MUST be published far in advance Programs generally need 9-12 months to prepare the self-study, etc. Once established, every effort should be made to maintain the schedule, BUT things happen!

Issues to be Resolved Before Beginning Program Reviews - continued Coordination With Accreditation Reviews Graduate program reviews should be a separate process from accreditation reviews, but much can be gained by conducting them in tandem, sequentially, or at least in the same academic year: Efficiency of data collection Graduate program review team can benefit from the expertise and report of the accreditation team When done in tandem, it is extremely important that the accreditation team acknowledge the difference(s) in the nature of the two reviews

Issues to be Resolved Before Beginning Program Reviews - continued Masters Versus Doctoral Programs Whether it leads to a doctoral program or not, a master’s degree should have its own academic integrity At those institutions with research-oriented master’s and doctoral programs in the same department, programs at both levels should be reviewed simultaneously The institution should examine the unique characteristics of each master’s program and develop criteria of evaluation appropriate for that program

Issues to be Resolved Before Beginning Program Reviews - continued Research Based Versus Practitioner Graduate Program Reviews Traditional research-based and practitioner programs often exist within the same department Despite the differences in their educational goals, they should be reviewed together It is essential that they be reviewed using different criteria Should not rely on the use of professional accreditation review in place of internal review

Issues to be Resolved Before Beginning Program Reviews - continued Interdisciplinary Programs Truly interdisciplinary programs cause special problems for review Faculty and students are often arranged into academic departments Those academic departments often control resources, faculty hiring, student admissions, course offerings, etc. In spite of the administrative convenience of working through existing departments, interdisciplinary programs should be reviewed independently

Issues to be Resolved Before Beginning Program Reviews - continued Integration of Formal Review with Continuous Outcomes Assessment It is important that formal review and continuous and ongoing assessment be seen as part of the same whole, with a common goal of improving graduate education To accomplish this, they should somehow be coordinated and integrated We will discuss how we do that at NC State later in the presentation

Discussion Questions What other issues have you had to resolve on your campuses? How have you resolved them?

Clear, Consistent Guidelines These guidelines should describe: The purpose of graduate program review The process to be followed Guidelines for materials to be included in each phase A generic agenda for the review The use to which results will be put These guidelines should be posted on the Graduate School or Academic Affairs web page Key Elements of a Successful Program Review

Administrative Support Adequate staffing and general administrative support are vital to the success of any program review Departments can provide their own support for the self-study The larger review process should be staffed centrally Key Elements of a Successful Program Review

Administrative Support - continued Successful reviews depend on accurate institutional data This data should be developed and maintained centrally but should be reviewed and evaluated by the program faculty A standard report format using a single set of definitions should be developed in advance The best information often comes from a combination of central and departmental sources Key Elements of a Successful Program Review

Managerial Tools Created for Program Review - Website

Managerial Tools Created for Program Review - Profile Data

Managerial Tools Created for Program Review - Website

Departmental Self-Study The self-study is prepared by the faculty and is descriptive, evaluative, and aspirational It is the department’s opportunity to scrutinize itself, publicize its accomplishments, examine its flaws, and focus on future directions Key Elements of a Successful Program Review

Key Self-Study Components Departmental mission & organization Program purpose Program Assessment Plan Department size – faculty, staff, students, budgets, etc. Faculty profile Faculty accomplishments – research & scholarly activity, contributions to graduate program Key Elements of a Successful Program Review

Key Self-Study Components - continued Student profile Professional development opportunities – faculty and students Financial support for graduate students Facilities Curriculum Student productivity Key Elements of a Successful Program Review

Key Self-Study Components - continued Programmatic climate Collateral support – interaction with other programs Profile of graduates Future directions Overall evaluation of program – strengths, weaknesses, national reputation, etc. Key Elements of a Successful Program Review

Surveys/Questionnaires Surveys from current students, alumni, and employers can provide valuable information Factors to be considered: Time and expense to develop, distribute & collect responses Likely response rate Uniqueness of information to be gained It is generally preferable to have such surveys developed and administered at the institutional level Key Elements of a Successful Program Review

Student Participation Graduate students should participate in the program review process Serve on the review committee Be interviewed collectively and individually by the review committee Key Elements of a Successful Program Review

Review Team Make-up On-Campus Representation Often a Graduate School and/or Graduate Faculty Representative If possible, they should be from fields that give them some understanding of the program(s) being reviewed One or more off-campus external experts Depends on scope of program(s) being reviewed Will add to expense – honorarium plus travel expenses Selection process can vary – programs can have input but should not make the final decision Key Elements of a Successful Program Review

Review Team Report Generally includes some form of an analysis of the strengths, weaknesses, and opportunities for and needs of the graduate program from the perspective of their peers Should include recommendations for change and improvement Key Elements of a Successful Program Review

Program Faculty’s Response It is important to keep the program faculty informed about the findings and to give them a chance to comment on the evaluation This gives the faculty a chance to correct any factual errors and to reply to any specific criticisms or recommendations This also gives faculty a chance to outline their proposed actions as a result of the findings Key Elements of a Successful Program Review

Implementation The most important step in program review is not to produce the report but to implement its recommendations! Turning recommendations into actions involves the following steps: One or more meetings of key administrators (department, college, graduate school, and university administration) to discuss the recommendations An action plan or memorandum of understanding drawn up and agreed on by all participants Discussion of the recommendations with program faculty for implementation Integration of the action plan into the institution’s long- range planning and budget process Key Elements of a Successful Program Review

Follow Up Since most improvements take time, it is essential to establish a procedure to monitor progress towards completion of the action plan This is generally done at one- or two-year intervals Key Elements of a Successful Program Review

Discussion Questions What other key elements are missing from the process of formal (external) reviews I described?

Discussion Questions continued How many of your institutions have a graduate program review process similar to what was just described? What are some of the variations that exist? How often or what is the frequency of review – remember the words “continuous improvement”

Graduate Program Review at NC State – External Review Until 02-03, we basically followed the formal review process described. At that point we started to develop and implement a continuous and ongoing, outcome-based review process to complement our external reviews

Motivations For Change Growing culture of program improvement on our campus – general education, undergraduate, graduate Undergraduate Student Affairs had implemented an outcomes-based review program that was operational SACS was just around the corner

SACS Principles of Accreditation Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research- based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”

SACS Criterion for Accreditation Section 3 – Comprehensive Standards - #16 “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

Questions We Began to Ask Ourselves Does each of our degree programs have clearly defined outcomes? Are they measurable? Do our programs gather data to assess the achievement of program outcomes? Do they use assessment results to improve programs? Do we document that we use assessment results to improve programs?

Ultimate Question for NC State Became How could we create a hybrid that combined the benefits of periodic program review and outcomes-based program assessment? Accomplish administrative goals regarding evaluation of quality related to funding and institutional goals Accomplish graduate school goals related to program improvement The ultimate goal is to improve educational programs, not fill out reports to demonstrate accountability

Studying & Revising the Process  Graduate Dean Appointed a Task Force  Made up of stakeholders  Relied on on-campus expertise  Focus groups with administrators, faculty, students, etc.  Could not utilize Undergraduate Program Review personnel  Workload issue  New perspectives  Bottom Line – The opportunity for change is at the faculty level, so we want the process to address improvement at that level.

What We Decided to Do Continue the traditional external review program on an 8-year schedule Continue to partner with external reviews already conducted for accreditation or other purposes Emphasize development of program-specific outcomes and assessment procedures to determine if they are being achieved

What We Decided to Do- continued In addition to the external program review we will require each program to: Develop program-specific objectives and outcomes Develop an assessment plan outlining the assessment activities they will conduct Collect and analyze data on a regular basis Complete biennial assessment reports that are submitted online

What We Decided to Do - continued Provide the training and support necessary for programs to implement these changes Phase I: Creating Assessment Plans Identify diverse programs for pilot Work with pilot programs to create assessment plans Offer all DGPs workshops based on pilot materials Provide support for creating assessment plans (individual work, workshops, online management tool) Phase II: Implementing Assessment Plans Identify at least one pilot program in each college Work with programs to collect, analyze, and report data Offer all DGPs workshops based on pilot materials Provide support for implementing assessment plans

What We Decided to Do - continued Increase efforts relative to follow-up after the graduate program review – assess progress on recommendations Tie the annual assessment and biennial reports to the external review by incorporating the changes made as a result of assessment into the self-study Emphasize an “Action Plan” Agreed upon by University, Graduate School, College and Department Administration

What is Outcomes-Based Assessment? It entails a shift in emphasis from inputs to outcomes It is continuous rather than periodic It involves regular reports of program assessment to the institution Its results are used by the program and institution for gauging improvement and for planning

What is Outcomes-Based Assessment? It is a process that engages program faculty in asking three questions about their programs What are our expectations for the program? To what extent is our program meeting our expectations? How can we improve our program to better meet our expectations? It is a process that provides program faculty the means to answer these questions By creating objectives and outcomes for their program By gathering and analyzing data to determine how well the program is meeting the objectives and outcomes By applying the results of their assessment toward improving their program

Potential Benefit of Assessment Planning Process It is a faculty-driven process that Gives faculty a voice in defining the program and thus a greater stake in the program Gives faculty an investment in assessing the program Provides faculty-approved indicators for gauging and improving the effectiveness of the program

What Are Objectives?  Program objectives are the general goals that define what it means to be an effective program.

Three Common Objectives Developing students as successful professionals in the field Developing students as effective researchers in the field Maintaining/enhancing the overall quality of the program

What Are Outcomes?  Program outcomes are specific faculty expectations for each objective that define what the program needs to achieve in order to meet the objectives.

Example for Outcome 1 – Professional Development 1. To enable students to develop as successful professionals for highly competitive positions in industry, government, and academic departments, the program aims to provide a variety of experiences that help students to: a. achieve the highest level of expertise in XXXX, mastery of the knowledge in their fields and the ability to apply associated technologies to novel and emerging problems b. present research to local, regional, national, and international audiences through publications in professional journals and conference papers given in a range of venues, from graduate seminars to professional meetings c. participate in professional organizations, becoming members and attending meetings d. broaden their professional foundations through activities such as teaching, internships, fellowships, and grant applications

Example for Outcome 2 – Effective Researchers 2. To prepare students to conduct research effectively in XXXX in a collaborative environment, the program aims to offer a variety of educational experiences that are designed to develop in students the ability to: a. read and review the literature in an area of study in such a way that reveals a comprehensive understanding of the literature b. identify research questions/problems that are pertinent to a field of study and provide a focus for making a significant contribution to the field c. gather, organize, analyze, and report data using a conceptual framework appropriate to the research question and the field of study d. interpret research results in a way that adds to the understanding of the field of study and relates the findings to teaching and learning in science Etc.

Example for Outcome 3 – Quality of Program 3. To maintain and improve the program’s leadership position nationally and internationally, the program aims to: a. continue to be nationally competitive by attracting high-quality students b. provide effective mentoring that encourages students to graduate in a timely manner c. place graduates in positions in industry and academics d. maintain a nationally recognized faculty that is large enough and appropriately distributed across XXXX disciplines to offer students a wide range of fields of expertise

Four Questions for Creating an Assessment Plan 1. What types of data should we gather for assessing outcomes? 2. What are the sources of the data? 3. How often are the data to be collected? 4. When do we analyze and report the data?
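One way to see how the four questions fit together is to treat each outcome's assessment plan entry as a single record with one field per question. The sketch below is illustrative only; the field names and example values are assumptions, not the actual plan format used at NC State.

```python
# Illustrative sketch of one outcome's entry in an assessment plan,
# with one field per question above. Field names and values are assumed.
from dataclasses import dataclass
from typing import List

@dataclass
class OutcomeAssessment:
    outcome: str                  # the outcome being assessed
    data_types: List[str]         # Q1: what data to gather
    sources: List[str]            # Q2: where the data come from
    collection_frequency: str     # Q3: how often the data are collected
    analysis_year: int            # Q4: when the data are analyzed and reported

plan = [
    OutcomeAssessment(
        outcome="Students present research at professional meetings",
        data_types=["student progress reports", "student CVs"],
        sources=["students", "Graduate Program Directors"],
        collection_frequency="annually",
        analysis_year=1,
    ),
]
```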

Types of Data Used 1. Take advantage of what you are already doing  Preliminary exams  Proposals  Theses and dissertations  Defenses  Student progress reports  Student course evaluations  Faculty activity reports  Student exit interviews

Types of Data Used 2. Use Resources of Graduate School and Your Institutional Analysis Group  Enrollment statistics  Time-to-degree statistics  Student exit data  Ten-year profile reports  Alumni surveys

Types of Data Used 3. Use your imagination to find other types of data Dollar amount of support for faculty Student CVs Faculty surveys

Data: Two Standards to Use in Identifying Data 1. Appropriateness: Data should provide information that is suitable for assessing the outcome 2. Accessibility: Data should be reasonable to attain (time, effort, ability, availability, resources)

Four Questions for Creating an Assessment Plan 1. What data should we gather for assessing outcomes? 2. What are the sources of the data? 3. How often are the data to be collected? 4. When do we analyze and report the data?

Sources of Data Students Faculty Graduate School Graduate Program Directors Department Heads Registration and Records University Information Technology University Planning and Analysis

Four Questions for Creating an Assessment Plan 1. What data should we gather for assessing outcomes? 2. What are the sources of the data? 3. How often are the data to be collected? 4. When do we analyze and report the data?

Frequency of Data Collection Every semester Annually Biennially When available from individual graduate students At the preliminary exam At the defense At graduation

Four Questions for Creating an Assessment Plan 1. What data should we gather for assessing outcomes? 2. What are the sources of the data? 3. How often are the data to be collected? 4. When do we analyze and report the data?

Creating a Timeline for Analyzing Assessment Data According to objective: year 1 - objective 1; year 2 - objective 2; year 3 - objective 3; year 4 - objective 1; etc. (3-year cycle) More pressing outcomes earlier and less pressing ones later Outcomes easier to assess earlier and outcomes requiring more complex data gathering and analysis later Approximately the same workload each year of the assessment cycle
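The objective-by-objective rotation in the first option is simple modular arithmetic; the sketch below is a minimal illustration, assuming a three-objective program.

```python
# Minimal sketch of the 3-year rotation: objective 1 in year 1, objective 2
# in year 2, objective 3 in year 3, then the cycle repeats.
def objective_for_year(cycle_year: int, num_objectives: int = 3) -> int:
    """Return which objective (1-based) is analyzed in a given cycle year."""
    return (cycle_year - 1) % num_objectives + 1

for year in range(1, 9):
    print(f"Year {year}: analyze objective {objective_for_year(year)}")
```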

Creating a Timeline for Reporting Assessment Data Standard practice appears to be to call for a short annual or biennial assessment report Longer reporting cycles undermine the continuous and ongoing nature of the process  When possible, combine with a pre-existing external review program; including assessment reports as part of the self-study is recommended

Four Questions for Creating an Assessment Plan 1. What data should we gather for assessing outcomes? 2. What are the sources of the data? 3. How often are the data to be collected? 4. When do we analyze and report the data?

Questions to Guide Biennial Assessment Report What outcomes did you plan to assess for the most recent reporting period? What outcomes assessments were completed? What data did you collect and from what sources? What did you learn about your program and/or your students from your assessments? As a result of your assessment, what initiatives, if any, did you implement or do you propose to implement to address areas of concern? How will you measure the success of these initiatives? What outcomes assessments are you planning for the upcoming reporting period?
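Because every program answers the same guiding questions, the biennial report can be collected in a consistent structure. The sketch below is a hypothetical template built directly from the questions above; it is not NC State's online reporting tool.

```python
# Hypothetical template: the guiding questions above as a fixed list, with an
# empty response slot per question for each program's biennial submission.
BIENNIAL_REPORT_QUESTIONS = [
    "What outcomes did you plan to assess for the most recent reporting period?",
    "What outcomes assessments were completed?",
    "What data did you collect and from what sources?",
    "What did you learn about your program and/or your students from your assessments?",
    "What initiatives, if any, did you implement or propose to address areas of concern?",
    "How will you measure the success of these initiatives?",
    "What outcomes assessments are you planning for the upcoming reporting period?",
]

def blank_report(program_name: str) -> dict:
    """Create an empty biennial report keyed by the guiding questions."""
    return {"program": program_name,
            "responses": {question: "" for question in BIENNIAL_REPORT_QUESTIONS}}
```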

Training Workshops Provided Graduate Program Review – Where we are, Where we are headed, and why? Assessing the Mission of Doctoral Research Universities (a workshop on outcomes-based assessment put on by outside experts) Creating Outcomes and Objectives Creating an Assessment Plan Utilizing the Graduate School Managerial Tools Developing an Institutional Database for Assessment of Graduate Programs – to be developed

Managerial Tools Created for Program Review - Website

Managerial Tools Created for Program Review - Profile Data

Managerial Tools Created for Program Review – Review Document Management

Revised Review Process Implemented at NC State Initial Year 1 (Start-Up) Development of objectives, outcomes and assessment tools Identification of data sources and beginning of data collection Cycle Year 2 (also 4 and 6) Ongoing assessment & self-study by grad faculty Programmatic changes Brief biennial assessment report Cycle Year 3 (also 5 and 7) Continued data collection pertinent to outcomes and assessment measures Compact Initiatives Cycle Year 8 (program review) Self-study report External review Review report Program response Action plan
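The eight-year cycle can also be read as a simple lookup from cycle year to activities, with years 4 and 6 repeating year 2 and years 5 and 7 repeating year 3. The sketch below restates the slide in that form; it is an illustration, not an implementation used at NC State.

```python
# Restatement of the 8-year cycle as a year-to-activities lookup table.
# Years 4 and 6 repeat year 2; years 5 and 7 repeat year 3 (per the slide).
CYCLE_ACTIVITIES = {
    1: ["Develop objectives, outcomes, and assessment tools",
        "Identify data sources and begin data collection"],
    2: ["Ongoing assessment and self-study by graduate faculty",
        "Programmatic changes",
        "Brief biennial assessment report"],
    3: ["Continued data collection on outcomes and assessment measures",
        "Compact initiatives"],
    8: ["Self-study report", "External review", "Review report",
        "Program response", "Action plan"],
}

def activities_for_year(year: int) -> list:
    """Map a cycle year to its activities, folding repeat years onto 2 and 3."""
    if year in (4, 6):
        return CYCLE_ACTIVITIES[2]
    if year in (5, 7):
        return CYCLE_ACTIVITIES[3]
    return CYCLE_ACTIVITIES.get(year, [])
```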

What We Have Learned/ Discussion Points The process of change takes time We have been at this for almost four years (since the start of the Task Force) and have just started collecting the first biennial reports Communication is the key to success Clearly communicated goals and expectations are important It is important to pilot assessment processes before taking them to all graduate programs.

What We Have Learned/ Discussion Points continued This kind of review process must be ground-up (faculty driven), not top-down (administration driven) Even then, faculty may be skeptical about workload versus value – they must be able to see that the process is both meaningful and manageable This kind of review process requires significant human resources Training, data collection, analysis, and interpretation, etc. A key to our success is how much of this can be institutionalized

Discussion Questions How many of your institutions have an outcomes-based graduate program review process? How many of you are considering implementing such a review program? What do your programs (in place or under consideration) look like? What are some of the variations that exist across universities?

Discussion Questions continued What kinds of faculty training have you provided? How successful is it? What kinds of accountability have you instituted? If reports, how often are they due? What are some of the problems you have encountered, or fear that you will encounter, in establishing outcomes-based assessment? What has been the level of campus buy-in?