Jennifer Reeves, Ph.D., Barbara Packer-Muti, Ed.D., & Candace Lacey, Ph.D. Nova Southeastern University, Fort Lauderdale, FL

- History and Background
- Purpose and Goals
- Collaboration
- Development
  - Initial questions
  - Input from all constituents
  - Additional questions and accreditation issues
  - Final approval
- Pilot Test
- Survey Implementation

Nova Southeastern University (NSU) Facts:
- The largest independent institution of higher education in the Southeast and the 6th largest in the nation
- Accredited by SACS
- Awards associate’s, bachelor’s, master’s, specialist, doctoral, and first-professional degrees in a wide range of fields
- 14 academic centers with over 28,000 students (and 60,000 “active” alumni)

The desire to be preeminent and the push from accrediting bodies have led NSU to establish an Academic Review process. Assessment of Student Learning Outcomes covers:
- History and Background of Program
- Graduation Requirements, Rates, and Persistence
- Assessment Plan and Results on Student Learning
- Comparison of Student Achievement by Location and/or Modality
- Follow-Up of Graduates of the Program
- Strategies for Improving Student Learning

Purpose:
- Consistency for ASLO (Assessment of Student Learning Outcomes)
- Minimize survey fatigue
- Increase response rate
- Include input and expertise across centers
- Measure longitudinal trends
Goals:
- Focus on learning outcomes in the program
- Keep the survey to 15 minutes

- Four different proposals
- Performa Higher Education
- Small task force: administrators (2), researchers (2), faculty (1), alumni development officer (1)

Started with questions from:
- Performa’s database (over 400)
- Previous ASLO reports
- Educational Testing Service (ETS)
Grouped the questions by topic (e.g., satisfaction, curriculum, faculty, academic support)
Removed duplicates and questions not directly related to learning outcomes (sketched below)
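A minimal sketch, in Python, of what the grouping and de-duplication step might look like if automated; the topic labels, the function name clean_question_bank, and the sample questions are assumptions for illustration, not the actual procedure or data used by the task force.

import re
from collections import defaultdict

def clean_question_bank(questions):
    """questions: iterable of (topic, text) pairs pooled from all sources."""
    grouped, seen = defaultdict(list), set()
    for topic, text in questions:
        # Normalize to lowercase letters/digits so near-identical wording is caught.
        key = re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()
        if key in seen:  # drop duplicates pooled from more than one source
            continue
        seen.add(key)
        grouped[topic].append(text)
    return dict(grouped)

bank = [
    ("curriculum", "The curriculum reflected current practice."),
    ("curriculum", "The curriculum reflected current practice!"),  # near-duplicate
    ("faculty", "Faculty were accessible outside of class."),
]
print(clean_question_bank(bank))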

- Deans appointed representative(s)
- Task force members individually rank-ordered the questions using a rubric (see the sketch after this list)
- Questions with the highest ratings were included in the initial draft
- Performa developed the electronic survey with the 24 questions
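A minimal sketch, in Python, of the rating-and-selection step, assuming each reviewer's rubric scores are simply averaged and the 24 highest-rated questions are kept; the function name select_top_questions, the averaging rule, and the sample data are illustrative assumptions rather than the task force's actual rubric.

from statistics import mean

def select_top_questions(ratings, top_n=24):
    """ratings maps each candidate question to the rubric scores it received."""
    averaged = {q: mean(scores) for q, scores in ratings.items()}
    # Rank questions from highest to lowest average rating and keep the top n.
    ranked = sorted(averaged.items(), key=lambda item: item[1], reverse=True)
    return [question for question, _ in ranked[:top_n]]

# Example with made-up scores from three reviewers:
sample = {
    "The program prepared me to apply research in my field.": [4, 5, 4],
    "Campus parking was adequate.": [2, 1, 2],
}
print(select_top_questions(sample, top_n=1))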

Multiple meetings were held with the representatives to review and provide feedback on verbiage, organization, and sequence. After 17 drafts(!), the core questions were finalized and approved.

The plan allowed up to 6 additional center-specific questions. This presented a problem for some: there are 21 different professional accreditation bodies (not including SACS) across the University, and individual programs had separate surveys to collect the information necessary for accreditation. It was decided that each program/school could include as many additional questions as it needed, center-specific and, in some cases, program-specific (a wide range: 0–32!).

Performa finalized the electronic survey to include all questions. The survey was forwarded for final approval.

Approximately 25 alumni participated (1-3 from each center). Participants were asked:
- How long did it take?
- Were any questions confusing or difficult to understand?
- Any suggested changes?

- The survey should be sent from the director of the program (NOT Performa or the President): it is more personable, and the results will be valued
- Include definitions for buttons in the survey (e.g., Next)
- Make it clear that respondents are answering about their most recent degree program (as some have earned 3 degrees)
- Verbiage suggestions

Feedback and suggestions from the pilot were reviewed and implemented. Final surveys were sent to all constituents for final approval. The Alumni Outcomes Questionnaire will be disseminated (by Performa) to all alumni in February. Future questionnaires will possibly be disseminated annually to all alumni.

Jennifer Reeves, Barbara Packer-Muti, Candace Lacey
Performa Higher Education: Ryan Morabito, Senior Consultant, Director of Marketing