Assessment Plan Tune-up

Assessment Plan Tune-up
Fall 2016 Assessment Academy Workshop 1
D. Kent Johnson, PhD, Director of Assessment

Today's Objectives
- Review the outline for the Assessment Report (Part 3 of the Annual Assessment Report)
- Discuss how to ensure the assessment plan measures the progression of student learning relative to programmatic SLOs as students progress from matriculation to graduation
- Examine how the present assessment plan leads to actionable assessment findings that can be used to improve student success through programmatic interventions and innovations
- Plan an assessment schedule to cycle through assessing, intervening, and reassessing student achievement of all SLOs in a 3- to 4-year cycle
- Briefly discuss how cumulative reporting contributes to the completion of the Academic Program Review (time permitting)

HLC Assessment Expectation Statements

"For student learning, a commitment to assessment would mean assessment at the program level that proceeds from clear goals, involves faculty at all points in the process, and analyzes the assessment results; it would also mean that the institution improves its programs or ancillary services or other operations on the basis of those analyses." (Higher Learning Commission, "The Criteria for Accreditation: Guiding Values," 2015)

"Qualified faculty should also be aware of whether and how much students learn through the ongoing collection and analysis of appropriate data, because an institution should be able to demonstrate its commitment to educational achievement and improvement through ongoing assessment of student learning." (Higher Learning Commission, "Determining Qualified Faculty through HLC's Criteria for Accreditation and Assumed Practices: Guidelines for Institutions and Peer Reviewers," October 2015)

A Review of Department/Program Level Assessment Activities and Results
Assessing, Making Interventions, and Reassessing to Improve Programmatic Capacity for Improving Student Learning

IPFW Undergraduate Program Assessment Model

Annual Academic Program Assessment Report Outline (SD 15-6)
- Overview of Programmatic Student Learning Outcomes (Appendix D, Section I)
- Curricular Maps: Institutional to Baccalaureate Framework, and Programmatic to Core Courses or Specific Curricular Mileposts (Appendix D, Sections II and III)
- Assessment Plan (Appendix D, Section IV)
- Assessment Results (Appendix D, Section V)
- Conclusions, Next Steps, and Communication (Appendix D, Sections VI and VII)

Assessment Reporting
- Sections 1, 2, and 3 should be fairly static (require minimal changes) over time.
- Sections 3 and 4 are informed by the PLAIR model, emphasizing a process of "assess-intervene-reassess" (Fulcher, Good, Coleman, and Smith, 2015).
- Sections 5 and 6 draw conclusions to demonstrate the improving quality of the program, suggest how the faculty plan to continue improving program support of student learning, and communicate to internal and external constituents what students are learning.

Assessment Plan Outline*
- Description of the department's Assessment Model: the department's plan for assessing student learning (how, when, and where) relative to programmatic SLOs (assess), using assessment results to improve student learning in the program (intervene), and evaluating how changes impacted student learning (reassess) (Fulcher et al., 2015).
- Identification and description of the assessment measures the department plans to use (e.g., direct assessment of student learning embedded in common course assignments, direct assessment of student portfolios, externally validated professional exams, indirect assessments of student perceptions of learning gains). Each plan needs at least two measures, with at least one being a direct measure.
- Description of how assessment products will be evaluated (e.g., rubrics, primary trait analysis, metrics).
- Description of how assessment findings will be used to improve student learning.

* For support, see the Assessment Handbook and Workbook (ipfw.edu/assessment)

Developing the Assessment Plan: Designing the Assessment Plan – Part 1
- Review the curriculum map for your program: does it identify the levels of learning expected of students as they progress through the curriculum? If not, ask whether your programmatic SLOs are written on the basis of expectations at graduation.
- Choose one outcome. One way to assess achievement at the programmatic level is to assess somewhere early (a 100- or 200-level course), in the middle (a 300-level course), and toward the end (perhaps a capstone course or experience).
- This type of strategy provides the evidence you need to identify where learning might be improved.
- For the most reliable assessment, think in terms of the types of products students should produce as evidence of learning; this will begin to determine the measures you will use.

Developing the Assessment Plan: Designing the Assessment Plan – Part 2
- Direct measures assess student achievement relative to the outcome, based on the student demonstrating achievement.
- Indirect measures assess a student's or faculty member's perception of achievement relative to the outcome.
- Determining the type of measure forms the foundation for selecting or designing the assessment instrument, or the type of product students will produce to be measured.
- The expected level of learning might be expressed as a rubric, matrix, objective statement, etc.; this will largely depend on the conventions of your discipline. The plan should define how this will be measured.
- Describe how and when you will measure student learning, and how you plan to use findings to guide programmatic change (e.g., discussions in a department meeting leading to curricular changes), that is, planning interventions and innovations.
- Finally, describe how you will reassess the interventions or innovations made.

Developing the Assessment Plan: Planning the Assessment Cycle

Programmatic SLO | Year Assessed | Intervention Planned (Y/N) | Year Reassessed
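The cycle table above can be sketched as a simple data structure. A minimal illustration in Python follows; the SLO names and years are hypothetical, not taken from any actual program plan:

```python
# Hypothetical sketch of the assess-intervene-reassess schedule table.
# Each entry mirrors a row of the planning table:
# (Programmatic SLO, Year Assessed, Intervention Planned, Year Reassessed)
slo_cycle = [
    ("SLO 1: Written communication", 2016, True, 2018),
    ("SLO 2: Quantitative reasoning", 2017, True, 2019),
    ("SLO 3: Disciplinary knowledge", 2017, False, 2020),
]

def slos_in_year(cycle, year):
    """Return the SLOs scheduled for assessment or reassessment in a given year."""
    return [slo for (slo, assessed, _planned, reassessed) in cycle
            if year in (assessed, reassessed)]

# Print the workload for each year of the 3- to 4-year cycle.
for year in range(2016, 2021):
    print(year, slos_in_year(slo_cycle, year))
```

A table like this makes it easy to check that every SLO is assessed and reassessed within the 3- to 4-year window, and that no single year is overloaded.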

Part 2: Reviewing Your Plan
- Begin with the Assessment Plan Evaluation Rubric from Appendix D.
- Work back from it to the SLOs.
- Work forward to your assessment reporting.

Appendix D-IV: Assessment Plan (scored 3 = Exemplary, 2 = Acceptable, 1 = Developing)

Relationship between assessments and SLOs
- Exemplary (3): Detail is provided regarding the SLO-to-measure match. Specific items included on the assessment are linked to SLOs. The match is affirmed by faculty subject experts.
- Acceptable (2): Description of how SLOs relate to assessment is general but sufficient to show alignment.
- Developing (1): Description of how SLOs relate to assessment is incomplete or too general to provide sufficient information for use in determining progress toward the SLO.

Types of Measures
- Exemplary (3): All SLOs are assessed using at least two measures, including at least one direct measure.
- Acceptable (2): Most SLOs are assessed using at least one direct measure.
- Developing (1): Most SLOs are either assessed using only indirect measures or are not assessed.

Established Results
- Exemplary (3): Statements of desired results (data targets) provide useful comparisons and detailed timelines for completion.
- Acceptable (2): Statements of desired results provide a basic data target and a general timeline for completion.
- Developing (1): Statements of desired results are missing or unrealistic for completion.

Data Collection and Design Integrity
- Exemplary (3): The data collection process is sound, clearly explained, and appropriately specific to be actionable.
- Acceptable (2): Enough information is provided to understand the data collection process, with limited methodological concerns.
- Developing (1): Limited information is provided about the data collection process, or it includes sufficient flaws to nullify any conclusions drawn from the data.

Evidence of Reliability of Measures
- Exemplary (3): Methods used to ensure reliability of findings are clearly explained and consistently support drawing meaningful conclusions.
- Acceptable (2): Methods used to ensure reliability of findings are stated and generally support drawing meaningful conclusions.
- Developing (1): Methods to ensure reliability of findings are insufficient for drawing meaningful conclusions.
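Each Appendix D rubric criterion is scored on the 3/2/1 scale and the scores are totaled in the Score column. A minimal sketch of that tally, with hypothetical criterion scores (the scale follows the rubric; the scores themselves are invented for illustration):

```python
# Illustrative scoring against the Appendix D-IV Assessment Plan rubric.
# The 3/2/1 scale (Exemplary / Acceptable / Developing) follows the rubric;
# the criterion scores assigned below are hypothetical examples.
LEVELS = {3: "Exemplary", 2: "Acceptable", 1: "Developing"}

plan_scores = {
    "Relationship between assessments and SLOs": 3,
    "Types of Measures": 2,
    "Established Results": 2,
    "Data Collection and Design Integrity": 3,
    "Evidence of Reliability of Measures": 1,
}

total = sum(plan_scores.values())
maximum = 3 * len(plan_scores)  # best possible: Exemplary on every criterion

print(f"Assessment Plan score: {total}/{maximum}")
for criterion, score in plan_scores.items():
    print(f"  {criterion}: {score} ({LEVELS[score]})")
```

A tally like this makes it easy to see at a glance which criteria pull a plan's score down and where revision effort should go first.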

Appendix D-III: Programmatic Curriculum Map (scored 3 = Exemplary, 2 = Acceptable, 1 = Developing)

Content Alignment
- Exemplary (3): All SLOs are mapped to common classes or learning activities expected of all students completing the program.
- Acceptable (2): Most SLOs are mapped to common classes or learning activities expected of all students completing the program.
- Developing (1): Common classes or learning activities are identified for all students completing the program, but most SLOs are not clearly mapped to classes or activities.

Student Learning Development of SLOs (Learning Benchmarks)
- Exemplary (3): The curricular map clearly identifies the progression of student learning relative to all SLOs at specific points in the curriculum.
- Acceptable (2): The curricular map identifies levels of expected learning relative to most SLOs at specific points in the curriculum.
- Developing (1): The curricular map identifies expected levels of learning for some SLOs at specific points in the curriculum.

Student Engagement
- Exemplary (3): Classes and/or activities engage students in the work outlined in the SLOs.
- Acceptable (2): Classes and/or activities engage students in the work outlined by most of the SLOs.
- Developing (1): Classes and/or activities do not consistently engage students in the work outlined by most of the SLOs.

Appendix D-II: Alignment with Baccalaureate Framework (scored 3 = Exemplary, 2 = Acceptable, 1 = Developing)

IPFW Baccalaureate Framework Alignment
- Exemplary (3): Specific, clearly defined, student-centered Program-Level SLOs are aligned to all foundation areas of the IPFW Baccalaureate Framework.
- Acceptable (2): Generally defined student-centered Program-Level SLOs are aligned to all foundation areas of the IPFW Baccalaureate Framework.
- Developing (1): Program-Level SLOs are aligned to some foundation areas of the IPFW Baccalaureate Framework.

Appendix D-I: Clearly Stated Student Learning Outcomes (scored 3 = Exemplary, 2 = Acceptable, 1 = Developing)

Clarity and Specificity
- Exemplary (3): All SLOs are stated with clarity and specificity, including precise verbs and rich descriptions of the knowledge, skill, and value domains expected of students upon completing the program.
- Acceptable (2): SLOs generally contain precise verbs and rich descriptions of the knowledge, skill, and value domains expected of students.
- Developing (1): SLOs are inconsistently defined for the program; descriptions of the knowledge, skill, and value domains are present but lack consistent precision.

Student-Centered
- Exemplary (3): All SLOs are stated in student-centered terms (i.e., what a student should know, think, or do).
- Acceptable (2): Most SLOs are stated in student-centered terms.
- Developing (1): Some SLOs are stated in student-centered terms.

Expectation Level
- Exemplary (3): SLOs exceed the basic expectations established by the University and other necessary approving organizations required of the submitting unit.
- Acceptable (2): SLOs meet the basic expectations established by the University and other necessary approving organizations required of the submitting unit.
- Developing (1): SLOs meet only a portion of the expectations established by the University or other necessary approving organizations required of the submitting unit.

Appendix D-V: Reporting Results (scored 3 = Exemplary, 2 = Acceptable, 1 = Developing)

Presentation of Results
- Exemplary (3): Results are clearly presented and directly related to SLOs. Results consistently demonstrate student achievement relative to stated SLOs. Results are derived from generally accepted practices for student learning outcomes assessment.
- Acceptable (2): Results are present and related to SLOs. Results generally demonstrate student achievement relative to stated SLOs. Results are derived from generally accepted practices for student learning outcomes assessment.
- Developing (1): Results are provided but do not clearly relate to SLOs. Results inconsistently demonstrate student achievement relative to stated SLOs. Use of generally accepted practices for student learning outcomes assessment is unclear.

Historical Results
- Exemplary (3): Past iterations of results are provided for most assessments to provide context for current results.
- Acceptable (2): Past iterations of results are provided for the majority of assessments to provide context for current results.
- Developing (1): Limited or no iterations of prior results are provided.

Interpretation of Results
- Exemplary (3): Interpretations of results are reasonable given the SLOs, desired levels of student learning, and methodology employed. Multiple faculty interpreted the results, including an interpretation of how classes/activities might have affected the results.
- Acceptable (2): Interpretations of results are reasonable given the SLOs, desired levels of student learning, and methodology employed. Multiple faculty interpreted the results.
- Developing (1): Interpretation of results does not adequately refer to stated SLOs or identify expectations for student learning relative to SLOs. The interpretation does not include multiple faculty.

Appendix D-VI: Report Dissemination and Collaboration (scored 3 = Exemplary, 2 = Acceptable, 1 = Developing)

Documents and results are shared with faculty
- Exemplary (3): Information is routinely provided to all faculty, with multiple opportunities for collaboration to build meaningful future plans.
- Acceptable (2): Information is provided to all faculty through an effective mode and with sufficient detail to be meaningful.
- Developing (1): Information is not distributed to all faculty or provides insufficient detail to be meaningful.

Documents and results are shared with other stakeholders
- Exemplary (3): Information is routinely provided to stakeholders (beyond faculty), with multiple opportunities for collaboration to build meaningful future plans.
- Acceptable (2): Information is shared with stakeholders (beyond faculty) through an effective mode and with sufficient detail to be meaningful.
- Developing (1): Information is not distributed to stakeholders (beyond faculty) or provides insufficient detail to be meaningful.

College Level Report Template: Assessment Report
- Section 1: Summary of findings detailing scores of all academic departments/programs of the college.
- Section 2: Summary of recommendations made to academic departments/programs based on their assessment findings.
- Section 3: Summary of results of changes made or actions taken as a result of prior-year findings, including results of student learning and a summary of impact (positive or negative).
- Section 4: Conclusions providing an overall evaluation of assessment in the College and a description of changes in process planned to improve the quality of student learning.

Part 3: Supporting Programmatic Assessment
The Office of Assessment, IPFW Assessment Academy, and Assessment Council as the Departmental Support Team

Overview of the Institutional Assessment Team
- Office of Assessment: coordinates and provides support for assessment activities.
- The Assessment Council: shared-governance group of faculty charged with recommending policy and providing oversight of the assessment process.
- The IPFW Assessment Academy Leadership Team: faculty group charged with creating and supporting learning cohorts, programs, workshops, and other activities focused on developing and improving assessment at IPFW.

Support Available to Academic Departments
- IPFW Assessment Academy: a cohort-based community to support departments through a full cycle of assessment.
- 2016-17 Workshop Series: delivered in general sessions and available as a department/academic-program-focused series, "on demand" and "just in time," based on department/program needs.
- Blackboard Course and Assessment Website: supports the Workshop Series and can stand alone.
- Assessment Director: provides ongoing and on-demand support, resources, and leadership for programmatic and course-level assessment.