Assessing Academic Programs at IPFW


Assessing Academic Programs at IPFW
Overview of process, reporting structure, and support for programmatic assessment

Assessment Reporting Process

Academic program reports:
1. The academic program submits its Academic Department Assessment Report to the College Assessment Committee.
2. The College Level Assessment Committee reviews the Academic Department Assessment Report.
3. The Assessment Council reviews College Level Reports and prepares an Annual Report of Assessment Progress for each College.

General Education reports:
1. The academic program prepares an Assessment Summary by course and submits it to the General Education Subcommittee.
2. The General Education Subcommittee reviews course level assessments to prepare the Academic Assessment Report.
3. The Assessment Council reviews the Academic Assessment Report using the IPFW Assessment Progress Worksheet.

Annual Assessment Report Outline

Section 1: Student Learning Outcomes for the Program (SD 15-6, Appendix D, Section I)

Section 2: Curricular Maps
a. Map of programmatic SLOs to the Baccalaureate Framework (Appendix D, Section II)
b. Map of programmatic SLOs to identified "core courses" in the curriculum (Appendix D, Section III)

Annual Assessment Report Outline

Section 3: Assessment Plan
a. Description of the department's assessment model
b. Measures used
c. Rubrics or evaluation metric descriptions
d. Dissemination plan

Annual Assessment Report Outline

Section 4: Assessment Results
a. Current year assessment findings
b. Proposed changes to address findings
c. Prior year findings and assessment of changes
d. Assessment findings for curricular changes made

Section 5: Conclusions, next steps, and reporting plan

Department/Program Level Assessment Activities and Results
Assessing, Making Interventions, and Reassessing to Improve Programmatic Capacity for Improving Student Learning

SLOs, Maps, and the Assessment Plan

- Sections 1, 2, and 3 should be fairly static (require minimal changes) over time.
- Section 4 reporting is informed by the PLAIR model, emphasizing a process of "assess-intervene-reassess" (Fulcher, Good, Coleman, and Smith, 2015).
- Section 5 draws conclusions to demonstrate the improving quality of the program, suggests how the faculty plan to continue improving program support of student learning, and communicates to internal and external constituents what students are learning.

Part 2: College Level Review of Departmental Assessment
Rubrics for Evaluating Academic Program Review, from Appendix D of the Proposed Revision to SD 98-22

Appendix D-I: Clearly Stated Student Learning Outcomes
Rubric levels: Exemplary, Acceptable, Developing, Not Present (no descriptors accompany the Not Present level).

Clarity and specificity
- Exemplary: All SLOs are stated with clarity and specificity, including precise verbs and rich descriptions of the knowledge, skill, and value domains expected of students upon completing the program.
- Acceptable: SLOs generally contain precise verbs and rich descriptions of the knowledge, skill, and value domains expected of students.
- Developing: SLOs are inconsistently defined for the program; descriptions of the knowledge, skill, and value domains are present but lack consistent precision.

Student-Centered
- Exemplary: All SLOs are stated in student-centered terms (i.e., what a student should know, think, or do).
- Acceptable: Most SLOs are stated in student-centered terms.
- Developing: Some SLOs are stated in student-centered terms.

Expectation Level
- Exemplary: SLOs exceed the basic expectations established by the University and other necessary approving organizations required of the submitting unit.
- Acceptable: SLOs meet the basic expectations established by the University and other necessary approving organizations required of the submitting unit.
- Developing: SLOs meet only a portion of the expectations established by the University or other necessary approving organizations required of the submitting unit.

Appendix D-II: Alignment with Baccalaureate Framework
Rubric levels: Exemplary, Acceptable, Developing, Not Present.

IPFW Baccalaureate Framework Alignment
- Exemplary: Specific, clearly defined, student-centered program-level SLOs are aligned to all foundation areas of the IPFW Baccalaureate Framework.
- Acceptable: Generally defined, student-centered program-level SLOs are aligned to all foundation areas of the IPFW Baccalaureate Framework.
- Developing: Program-level SLOs are aligned to some foundation areas of the IPFW Baccalaureate Framework.

Appendix D-III: Programmatic Curriculum Map
Rubric levels: Exemplary, Acceptable, Developing, Not Present.

Content Alignment
- Exemplary: All SLOs are mapped to common classes or learning activities expected of all students completing the program.
- Acceptable: Most SLOs are mapped to common classes or learning activities expected of all students completing the program.
- Developing: Common classes or learning activities are identified for all students completing the program, but most SLOs are not clearly mapped to classes or activities.

Student Learning Development of SLOs (Learning Benchmarks)
- Exemplary: The curricular map clearly identifies the progression of student learning relative to all SLOs at specific points in the curriculum.
- Acceptable: The curricular map identifies levels of expected learning relative to most SLOs at specific points in the curriculum.
- Developing: The curricular map identifies expected levels of learning for some SLOs at specific points in the curriculum.

Student Engagement
- Exemplary: Classes and/or activities engage students in the work outlined in the SLOs.
- Acceptable: Classes and/or activities engage students in the work outlined by most of the SLOs.
- Developing: Classes and/or activities do not consistently engage students in the work outlined by most of the SLOs.

Appendix D-IV: Assessment Plan
Rubric levels: Exemplary, Acceptable, Developing, Not Present.

Relationship between assessments and SLOs
- Exemplary: Detail is provided regarding the SLO-to-measure match. Specific items included on the assessment are linked to SLOs, and the match is affirmed by faculty subject experts.
- Acceptable: The description of how SLOs relate to the assessment is general but sufficient to show alignment.
- Developing: The description of how SLOs relate to the assessment is incomplete or too general to provide sufficient information for determining progress toward the SLO.

Types of Measures
- Exemplary: All SLOs are assessed using at least two measures, including at least one direct measure.
- Acceptable: Most SLOs are assessed using at least one direct measure.
- Developing: Most SLOs are either assessed using only indirect measures or are not assessed.

Established Results
- Exemplary: Statements of desired results (data targets) provide useful comparisons and detailed timelines for completion.
- Acceptable: Statements of desired results provide a basic data target and a general timeline for completion.
- Developing: Statements of desired results are missing or unrealistic for completion.

Data Collection and Design Integrity
- Exemplary: The data collection process is sound, clearly explained, and appropriately specific to be actionable.
- Acceptable: Enough information is provided to understand the data collection process, with limited methodological concerns.
- Developing: Limited information is provided about the data collection process, or the process includes flaws sufficient to nullify any conclusions drawn from the data.

Evidence of Reliability of Measures
- Exemplary: Methods used to ensure reliability of findings are clearly explained and consistently support drawing meaningful conclusions.
- Acceptable: Methods used to ensure reliability of findings are stated and generally support drawing meaningful conclusions.
- Developing: Methods used to ensure reliability of findings are insufficient for drawing meaningful conclusions.

Appendix D-V: Reporting Results
Rubric levels: Exemplary, Acceptable, Developing, Not Present.

Presentation of Results
- Exemplary: Results are clearly presented and directly related to SLOs. Results consistently demonstrate student achievement relative to stated SLOs and are derived from generally accepted practices for student learning outcomes assessment.
- Acceptable: Results are present and related to SLOs. Results generally demonstrate student achievement relative to stated SLOs and are derived from generally accepted practices for student learning outcomes assessment.
- Developing: Results are provided but do not clearly relate to SLOs. Results inconsistently demonstrate student achievement relative to stated SLOs, and the use of generally accepted practices for student learning outcomes assessment is unclear.

Historical Results
- Exemplary: Past iterations of results are provided for most assessments to provide context for current results.
- Acceptable: Past iterations of results are provided for the majority of assessments to provide context for current results.
- Developing: Limited or no prior results are provided.

Interpretation of Results
- Exemplary: Interpretations of results are reasonable given the SLOs, desired levels of student learning, and methodology employed. Multiple faculty interpreted the results, including an interpretation of how classes/activities might have affected the results.
- Acceptable: Interpretations of results are reasonable given the SLOs, desired levels of student learning, and methodology employed. Multiple faculty interpreted the results.
- Developing: The interpretation of results does not adequately refer to stated SLOs or identify expectations for student learning relative to SLOs, and it does not include multiple faculty.

Appendix D-VI: Report Dissemination and Collaboration
Rubric levels: Exemplary, Acceptable, Developing, Not Present.

Documents and results are shared with faculty
- Exemplary: Information is routinely provided to all faculty, with multiple opportunities for collaboration to build meaningful future plans.
- Acceptable: Information is provided to all faculty through an effective mode and with sufficient detail to be meaningful.
- Developing: Information is not distributed to all faculty or provides insufficient detail to be meaningful.

Documents and results are shared with other stakeholders
- Exemplary: Information is routinely provided to stakeholders (beyond faculty), with multiple opportunities for collaboration to build meaningful future plans.
- Acceptable: Information is shared with stakeholders (beyond faculty) through an effective mode and with sufficient detail to be meaningful.
- Developing: Information is not distributed to stakeholders (beyond faculty) or provides insufficient detail to be meaningful.

College Level Report Template for Assessment Council

Overview of Report
Section 1: Summary of findings detailing the scores of all academic departments in the college.
Section 2: Summary of recommendations made to academic departments based on their assessment findings.
Section 3: Summary of the results of changes made or actions taken in response to prior year findings, including results for student learning and a summary of impact (positive or negative).
Section 4: Conclusions providing an overall evaluation of assessment in the College and a description of planned changes in process to improve the quality of student learning.

Example of College Level Reporting, Part 1a: Summary of Findings Within a Department

For each department, the committee rates the Appendix D-I rubric criteria above (Clarity and specificity, Student-Centered, and Expectation Level) as Exemplary, Acceptable, or Developing, then records a summary and recommendation.

Recommendation for Department 1: Overall, the quality of the Student Learning Outcomes is very good. However, not all SLOs were stated in student-centered terms. For example… In addition, the expected levels of learning for programmatic SLO 1 and SLO 3 were stated at the "understand" level of the Bloom lexicon. Given the expected progression of student learning, we would expect students to advance to a level where they can apply and analyze. For example…

Example of College Level Reporting, Part 1b: Summary of Findings Across Departments

Using the same rubric criteria, with a College Summary column in place of the per-department summary, the College Assessment Committee provides a summary of overall college performance for each rubric area.

Part 3: Supporting Programmatic Assessment
The Office of Assessment, IPFW Assessment Academy, and Assessment Council as the Departmental Support Team

Overview of Institutional Assessment Team

- Office of Assessment: coordinates and provides support for assessment activities.
- Assessment Council: a shared-governance group of faculty charged with recommending policy and providing oversight of the assessment process.
- IPFW Assessment Academy Leadership Team: a faculty group charged with creating and supporting learning communities focused on developing and improving assessment at IPFW.

Support Available to Academic Departments

- 2016-17 Workshop Series: delivered in general sessions and available as a departmentally focused series.
- IPFW Assessment Academy: a cohort-based community that supports departments through a full cycle of assessment.
- Assessment Director: provides ongoing and on-demand support, resources, and leadership for programmatic and course-level assessment.