Tribal Colleges & Universities Chief Academic Officers 3rd Annual Meeting
Wisdom Sharing: Assessment and Academic Program Review
Dr. Koreen Ressler, Vice President of Academics, Sitting Bull College

HLC Criterion Four. Teaching and Learning: Evaluation and Improvement
The institution demonstrates responsibility for the quality of its educational programs, learning environments, and support services, and it evaluates their effectiveness for student learning through processes designed to promote continuous improvement.
Core Components
4.A. The institution demonstrates responsibility for the quality of its educational programs.
1. The institution maintains a practice of regular program reviews.

Program Review Purpose
1. Report for degree programs and certificates.
2. Purpose of report:
– Analysis
– Evaluation
– Improvement

Information in Program Review
– The role of the program within the institution
– Staff/faculty
– Student information: program numbers, retention, persistence, graduation rates, graduate employment data
– Revenue and budget information
– Future needs: who is involved in planning for the program (e.g., the advisory committee)

SBC Program Review Process

Curriculum at SBC
FUNCTION: Recommend academic and instructional policy to the Board of Trustees.
SCOPE: Covers all matters of instructional policy, programs, and activities as they relate to the curriculum.
Goal #1: To provide and refine a systematic evaluation of current academic and technical programs through:
– Objective 1: Assign programs to the annual review for the year.
– Objective 2: Review and revise curricular components of the college catalog.
Goal #2: To explore and evaluate future academic and technical programs through:
– Objective 1: Evaluate and review potential new courses.
– Objective 2: Evaluate and review potential new programs.
– Objective 3: Explore online/hybrid delivery of course and/or program offerings.

SBC Program Review Evaluation Criteria
Evaluation completed by the Curriculum Committee, with a recommendation to:
– Maintain a program
– Enhance a program
– Reconfigure a program
– Reduce or phase out a program
All programs are reviewed on a five-year cycle, unless the Curriculum Committee recommends completing a review within a designated time.

HLC Criteria
4.B. The institution demonstrates a commitment to educational achievement and improvement through ongoing assessment of student learning.
1. The institution has clearly stated goals for student learning and effective processes for assessment of student learning and achievement of learning goals.
2. The institution assesses achievement of the learning outcomes that it claims for its curricular and co-curricular programs.
3. The institution uses the information gained from assessment to improve student learning.
4. The institution’s processes and methodologies to assess student learning reflect good practice, including the substantial participation of faculty and other instructional staff members.

Diagram for Assessment of Student Learning
Establish Learning Goals → Provide Learning Opportunities for Students → Assess Student Learning → Use the Results to Implement Change (and repeat the cycle)

Goals versus Outcomes
Goals (Intended): What do you want your students to know upon completion? Goals need to connect to the mission at three levels:
– Institutional
– General Education
– Program
Outcomes (Achieved): Describe essential learning that students have achieved and can reliably demonstrate at the end of a program.

Establishment of Program Outcomes
– Are program outcomes based on industry standards? (Advisory Committee input)
– Are the program outcomes precise, specific, and measurable?

Measurement of Program Outcomes
What do students complete throughout the program that will provide evidence of mastery of program outcomes?
– Pre- and post-tests
– National tests: National Center for Construction Education and Research (NCCER), Health Education Systems Incorporated (HESI)
– Internships/practicums
– Self-assessments
– Projects
– Portfolios

Principal Indicators for Assessment
Sitting Bull College’s assessment is broken down into four areas: institution-wide, pre-entry and freshman level, general education, and program.
1. Institution-Wide Assessment
a. Enrollment trends
b. Persistence and retention rates
c. Tracking of student withdrawals
d. Student Satisfaction Survey (Noel-Levitz) or Community College Survey of Student Engagement (every other year for each survey)
e. Student services satisfaction graduate survey
f. Satisfaction with institutional outcomes graduate survey
g. Graduation rates (IPEDS/AKIS)
h. Employer survey
i. Alumni survey
2. Pre-Entry and Freshman Assessment
a. COMPASS placement (pre) scores
b. First-year freshman advising
c. First-year experience course
d. Freshman orientation evaluation
e. Enrollment trends
3. General Education Assessment
a. General Education Outcomes Assessment Plan: English, speech, computers, Native American language, science, math
b. Post-COMPASS results
c. Completion rates
4. Program Assessment
a. Program assessment plans and one-page papers
b. Program reviews
c. Retention/persistence (reported in program review)
d. Graduation rates (reported in program review)
e. Employer survey

Assessment at SBC
FUNCTION: Review, report, and make recommendations concerning student learning and institutional effectiveness for continual quality improvement for all our stakeholders.
SCOPE: Oversee all institutional data collection and recommend new data that will measure institutional effectiveness.
Goal #1: To review academic and student support data that demonstrate institutional effectiveness through:
– Objective 1: Annually review program assessment data that support the continued improvement of student learning.
– Objective 2: Annually review essential learning outcomes (general education) data that support the continued improvement of student learning.
– Objective 3: Meet monthly during the academic year to review assessment data that may be available at the time and/or plan for needed data collection to assist in data-driven decisions.
– Objective 4: Annually review Student Support Services data, including the Enrollment Management Plan, which supports the continued improvement of student learning.

Annual Plan (Program/General Education)
Template columns:
– Program Outcomes
– Measurement Tool (who, what, how, when?)
– Measurement Goal (expected results)
– Findings (actual results)
– Analysis of Data (what students learned and what they didn't learn)
– Action or Recommendation

Rubric for Annual Review of Program/General Education Plans
Each performance criterion is rated No Evidence (0), Emerging (1), Developing (2), or Achieving (3), with space for comments.
Program Outcomes
– Emerging (1): Competencies/program outcomes are unclear.
– Developing (2): Over 50% of discipline/program outcomes are clear and understandable.
– Achieving (3): Over 75% of competencies/program outcomes are clear and understandable.
Measurement Tools
– Emerging (1): Measurement tool is not clear on answering the “Who, What, How, and When.”
– Developing (2): Measurement tool is over 75% clear on answering the “Who, What, How, and When.”
– Achieving (3): Measurement tool clearly answers the “Who, What, How, and When.”
Measurement Styles
– Emerging (1): Competencies/outcomes have only indirect measures.
– Developing (2): Competencies/outcomes have only direct measures.
– Achieving (3): Competencies/outcomes have both direct and indirect measures.
(*Faculty: please note that you will NOT be scored on this criterion this year, but it will be applicable for the academic year!)
Measurement Goal (Expected Results)
– Emerging (1): Measurement goal is not clearly stated and is not obtainable.
– Developing (2): Measurement goal is either not obtainable or not clearly stated.
– Achieving (3): Measurement goal is clearly stated and obtainable.
Findings (Actual Results)
– Emerging (1): There are no actual results for the measurement goals.
– Developing (2): Actual results exist for over 50% of the measurement goals.
– Achieving (3): Actual results exist for over 75% of the measurement goals.

Rubric Continued
Analysis of the Results
– Emerging (1): Analysis states the relationship between actual and expected results.
– Developing (2): Analysis states the relationship between actual and expected results and describes what it means.
– Achieving (3): Analysis states the relationship between actual and expected results and describes what it means; strengths and opportunities for improvement are identified.
Recommended Action(s)
– Emerging (1): Outcomes have actions identified.
– Developing (2): Outcomes showing concerns have recommended actions listed.
– Achieving (3): Outcomes showing concerns have detailed recommended actions assigned to individuals, to be accomplished by a given date; data analysis is interpreted to justify recommended actions.
Results of Last Year’s Recommended Actions
– Emerging (1): Some actions implemented.
– Developing (2): All actions implemented.
– Achieving (3): All actions implemented as assigned and completed on time; analysis of effectiveness included.
The rubric closes with space for overall Strengths and Opportunities.

Direct Measures
Instruments in which students demonstrate what they have achieved or learned related to explicitly stated learning outcomes. All involve the evaluation of actual student performance vis-à-vis stated learning outcomes.
– Standardized tests
– Locally developed tests
– Essay tests
– Projects
– Juried exhibits
– Oral presentations
– Performance in internships

Indirect Measures
Measures that rely on perceptions or opinions about student learning.
– Surveys (employer, alumni, student)
– Exit interviews
– Focus groups
– Global indicators of student achievement (graduation rates, job placement rates)

Examples: Good and Average
Program
– Nursing: high rating
– Energy Technology: lower rating (2.23)
General Education
– English & Communication: high rating
– Science: lower rating (2.42)
Activity: Select a partner and discuss the program review and assessment processes at your institutions.
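The numeric ratings above (e.g., 2.23, 2.42) appear to be averages of the 0–3 rubric scores across the performance criteria. As a minimal illustrative sketch only, the scores below are invented, not SBC's actual data, and the criterion names follow the rubric slides:

```python
# Hypothetical example: deriving an overall program rating from the
# 0-3 annual review rubric by averaging per-criterion scores.
# All scores here are invented for illustration.
criteria_scores = {
    "Program Outcomes": 2,
    "Measurement Tools": 3,
    "Measurement Goal (Expected Results)": 2,
    "Findings (Actual Results)": 2,
    "Analysis of the Results": 2,
    "Recommended Action(s)": 2,
    "Results of Last Year's Recommended Actions": 3,
}

# Overall rating is the mean score on the 0-3 scale.
overall = sum(criteria_scores.values()) / len(criteria_scores)
print(f"Overall rating: {overall:.2f}")  # prints "Overall rating: 2.29"
```

A program averaging near 3 would read as a high rating; one averaging near 2, like the 2.23 and 2.42 examples, would read as a lower (developing) rating.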

Thank You
For additional information, contact Dr. Koreen Ressler.