WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment Institutional Research & Effectiveness Neil M. Patel, Ph.D.


WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment Institutional Research & Effectiveness Neil M. Patel, Ph.D. Juan Ramirez, Ph.D.

Meeting Roadmap

The goals are to understand:
– Why assessment needs to take place
– Who should be involved in assessment
– What needs to be assessed
– How to assess the learning outcomes
– When assessment reports are due

Why does assessment need to take place?

Context: WASC recommendations; “Nine colleges in search of a University”; the landscape of education.

Why do we assess?
– To measure learning
– To identify challenges related to instruction, curriculum, or assignments
– To improve learning

Methods must be in place to assess properly. Information should be shared widely and used to inform decision-making.

Who should be involved in assessment?

The program:
– Deans
– Faculty
– Curriculum committees
– Assessment committees
– Assessment specialists
– Preceptors

Assessment & Program Review Committee:
– Contains a representative from each college

Institutional Research & Effectiveness:
– Director
– Senior Assessment Analyst
– Assessment Analyst

What needs to be assessed? INSTITUTIONAL LEARNING OUTCOMES (by phase)
– Phase: Evidence-based practice; Interpersonal communication skills
– Phase: Critical thinking; Collaboration skills
– Phase: Breadth and depth of knowledge in the discipline / Clinical competence; Ethical and moral decision-making skills
– Phase: Life-long learning; Humanistic practice

What needs to be assessed? (cont.): We cannot assess everything!

Direct assessment of signature assignments:
– Signature assignments have the potential to help us know whether student learning reflects “the ways of thinking and doing of disciplinary experts”
– Course-embedded assessment
– Aligned with LOs
– Authentic in terms of process/content (“real-world application”)

Indirect assessment (i.e., student perceptions):
– First-year survey
– Graduating survey
– Alumni surveys
– Student evaluations of courses

ILO Assessment Template Western University of Health Sciences

Assessment Template

Timeline:
– For programs
– For the Assessment Committee

Section I: Progress Report
Section II: Learning Outcome Alignment
Section III: Methodology, Goals & Participation
Section IV: Results
Section V: Discussion & Implications

Section I: Progress Report

Instructions: Please list any programmatic actions that have taken place as a result of last year’s assessment addressing the same Institutional Learning Outcome.

Goal: To document what occurred as a result of the assessment.

Section II: Learning Outcome Alignment

Instructions: Please list all program learning outcomes (PLOs) that align with the institutional learning outcome.

Section III: Methodology, Goals & Participation
– Name of assignment
– Type of assessment (direct; indirect)
– Full description of assignment (narrative)
– PLOs (from the aforementioned list) the assignment assesses
– Quantifiable assessment goal(s) for the assignment
– Type of scoring mechanism used
– Attachment of the scoring tool, highlighting what is being assessed
– Participation: list of titles and assessment roles for those who participated in the assessment process

Section III components

PLOs (from the aforementioned list) the assignment assesses:
– It is possible that not all PLOs will be assessed by the assignment
– Goal: To determine, over time, which PLOs are and are not being assessed

Quantifiable assessment goal(s) for the assignment:
– To determine how many students are achieving at a specific level/score
– To determine if differences in scores exist between two or more groups
– To determine if scores from one assignment predict scores on another assignment

Section III components (cont.)

Type of scoring mechanism used:
– Scoring guide, rubric, Scantron, professional judgment

Attachment of the scoring tool highlighting what is being assessed:
– Example: rubric

Participation:
– Faculty, faculty committee, program assessment committee, deans, Institutional Research & Effectiveness
– Goal: To track and demonstrate program participation

Section IV: Results

Name of assignment

Analytical approach (should align with the assessment goal!):
– To determine how many students are achieving at a specific level/score: frequency distribution
– To determine if differences in scores exist between two or more groups: chi-square, t-test, or ANOVA
– To determine if scores from one assignment predict scores on another assignment: regression

Sample size:
– Number of students assessed

Statistical results:
– Frequency table
– Central tendency
– Standard deviation
– Test statistic
– Degrees of freedom
– p value
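As a rough illustration of how the first two goal types map onto an analysis, the sketch below computes a frequency distribution and a hand-rolled Pearson chi-square for two groups of 0/1 scores. The cohort data are hypothetical, and a real analysis would more likely use a statistics package such as scipy.stats; this is only a minimal sketch of the logic.

```python
from collections import Counter

def frequency_distribution(scores):
    """Goal 1: how many students achieve each level/score (count, percent)."""
    counts = Counter(scores)
    n = len(scores)
    return {level: (count, 100.0 * count / n)
            for level, count in sorted(counts.items())}

def chi_square_2x2(group_a, group_b):
    """Goal 2: do 0/1 score distributions differ between two groups?
    Pearson chi-square for a 2x2 table (no continuity correction)."""
    table = [
        [sum(1 for s in group_a if s == 1), sum(1 for s in group_a if s == 0)],
        [sum(1 for s in group_b if s == 1), sum(1 for s in group_b if s == 0)],
    ]
    n = sum(sum(row) for row in table)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = sum(table[i]) * (table[0][j] + table[1][j]) / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical 0/1 ratings for two cohorts
cohort_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
cohort_b = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
overall = frequency_distribution(cohort_a + cohort_b)
difference = chi_square_2x2(cohort_a, cohort_b)
```

The test statistic would then be compared against the chi-square distribution with 1 degree of freedom to obtain the p value reported in Section IV.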

Section V: Discussion & Implications
– Name of assignment
– Restate the assignment goal
– Was the goal reached? (Yes/No)
– How do the results relate back to the ILO? (narrative)
– How are the results being used? (narrative)

Example

Scenario: Following a discussion among faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Evidence-Based Practice will be assessed using 4th-year preceptor evaluations.

Question: What do we need in order to assess this assignment?

Example: 4th-year preceptor evaluations to assess Evidence-Based Practice

Things to consider:
– Which PLO does this assignment address?
– How is the assignment graded?
– Who has the data?
– What is/are the assessment goal(s)? (standards of success)
– How do we analyze the data?

Example: 4th-year preceptor evaluations to assess Evidence-Based Practice

Assignment: Preceptor evaluations of students occur at various time points within the 4th-year rotations. For assessment purposes, the program has decided to use each student’s last preceptor evaluation. The preceptor is asked to indicate, in a Yes/No format, whether the student has been observed demonstrating each of a list of certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form. The data are sent directly to the 4th-year Director. To assess Evidence-Based Practice, a single item within the checklist is used: “The student displays evidence-based practice.”

Example: 4th-year preceptor evaluations to assess Evidence-Based Practice

Assessment goal: 90% of students will demonstrate evidence-based practice skills.

Why did we come up with 90%?
– For grading, students need a score of 70% or higher; each “Yes” = 1 point, so 14 points out of 20 are required to pass.
– Because only 14 of the 20 items are required, it is possible for students to pass while scoring 0 on the EBP item.
– For assessment purposes, we are striving for 90% of students to display EBP skills in their last rotation within the curriculum.
– Remember the signature-assignment approach.
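The arithmetic behind the slide’s two thresholds (all numbers from the example; the helper function name is ours) can be sketched as:

```python
# Thresholds from the example: a 20-item Yes/No preceptor evaluation.
TOTAL_ITEMS = 20          # Yes/No items on the evaluation form
PASS_FRACTION = 0.70      # grading: 70% or higher to pass the rotation
ASSESSMENT_GOAL = 0.90    # assessment: 90% of students should show EBP

# Grading threshold: 0.70 * 20 = 14 "Yes" responses needed to pass.
pass_threshold = PASS_FRACTION * TOTAL_ITEMS

def ebp_goal_met(ebp_scores):
    """ebp_scores: one 0/1 rating per student on the single EBP item.
    Returns True if at least 90% of students demonstrated EBP."""
    return sum(ebp_scores) / len(ebp_scores) >= ASSESSMENT_GOAL
```

This makes the slide’s point concrete: passing the rotation (14 of 20 items) does not require a “Yes” on the EBP item, so the program tracks that item separately against the 90% goal.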

Example: Data from 4th-year preceptor evaluations to assess Evidence-Based Practice

Coding: EBP Score: 0 = No, 1 = Yes; Gender: 1 = Male, 2 = Female

(Data table: one row per student, with columns Student, EBP Score, and Gender.)

Example

Name of assignment: 4th-year preceptor evaluation
Type of assessment (Direct; Indirect): Direct
Full description of the assignment: Preceptors indicate, in a Yes/No format, whether students are observed demonstrating a list of certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form.
Which PLO(s) from the list in Section II (above) will this assignment assess?: PLO 2
Assessment goal(s) / quantifiable standard(s) of success: 90% of students will demonstrate evidence-based practice skills.
When does the assignment take place in the curriculum? (Year in program, semester): The very last preceptor evaluation, during the 4th-year Spring semester.
Type of scoring mechanism used: Yes/No scoring guide for the item “The student displays evidence-based practice.”
Participants: Faculty, Curriculum Committee, Assessment Committee, and Dean selected the assignment; 4th-year preceptors evaluated students; the 4th-year program director collected the data; the Assessment Committee analyzed the data.

Example: Results

Name of assignment: 4th-year preceptor evaluation
Analytical approach: Frequency distribution
Sample size: N = 20

Statistical result:
        Frequency   Percent
No      9           45.0%
Yes     11          55.0%
Total   20          100.0%

Example: Discussion & Implications

Assignment 1: 4th-year preceptor evaluation

Restated assessment goal: 90% of students will demonstrate evidence-based practice skills.
Was the goal reached? (Yes/No): No; only 55% of students demonstrated evidence-based practice skills.
How do the results relate back to the ILO? Only a slight majority of students demonstrate evidence-based practice skills during the final phase of their education within the curriculum.
How are the findings being used? The program is determining: 1. if preceptors know what to look for when evaluating students; 2. if there are predictors of student success for this assignment; 3. if previous 4th-year evaluations lead to a different conclusion; 4. rigor?

GROUP WORK TIME!!!

Timeline

Timeline for Programs:
– Distribute template: April 3, 2013
– Section I: Progress Report; Section II: Institutional Learning Outcome & Program Learning Outcome Alignment; Section III: Methodology, Assessment Goals, & Participation: May 3, 2013
– Section IV: Results: June 7, 2013
– Assessment report due: July 31, 2013

Questions? Concerns?