WesternU Assessment Kick-off Meeting: The why's, who's, what's, how's, and when's of assessment
Institutional Research & Effectiveness
Neil M. Patel, Ph.D.
Juan Ramirez, Ph.D.
Meeting Roadmap
The goals are to understand:
- Why assessment needs to take place
- Who should be involved in assessment
- What needs to be assessed
- How to assess the learning outcomes
- When assessment reports are due
Why does assessment need to take place?
- WASC recommendations
- "Nine colleges in search of a University"
- Landscape of education
Why do we assess?
- To measure learning
- To identify challenges related to instruction, curriculum, or assignments
- To improve learning
Methods must be in place to properly assess. Information should be shared widely and used to inform decision-making.
Who should be involved in assessment?
The program:
- Deans
- Faculty
- Curriculum committees
- Assessment committees
- Assessment specialists
- Preceptors
Assessment & Program Review Committee:
- Contains a representative from each college
Institutional Research & Effectiveness:
- Director
- Senior Assessment Analyst
- Assessment Analyst
What needs to be assessed?
Institutional Learning Outcomes:
- Phase 1 (2012-13): Evidence-based practice; Interpersonal communication skills
- Phase 2 (2013-14): Critical thinking; Collaboration skills
- Phase 3 (2014-15): Breadth and depth of knowledge in the discipline / Clinical competence; Ethical and moral decision-making skills
- Phase 4 (2015-16): Life-long learning; Humanistic practice
What needs to be assessed? (cont.): We cannot assess everything!
Direct assessment of Signature Assignments:
- Signature assignments have the potential to help us know whether student learning reflects "the ways of thinking and doing of disciplinary experts"
- Course-embedded assessment
- Aligned with LOs
- Authentic in terms of process/content, "real world application"
Indirect assessment, i.e., student perceptions:
- First-year survey
- Graduating survey
- Alumni surveys
- Student evaluation of course
ILO Assessment Template
Western University of Health Sciences
Assessment Template
Timeline:
- For programs
- For Assessment Committee
Sections:
- Section I: Progress Report
- Section II: Learning Outcome Alignment
- Section III: Methodology, Goals & Participation
- Section IV: Results
- Section V: Discussion & Implications
Section I: Progress Report
Instructions: Please list any programmatic actions that have taken place as a result of last year's assessment addressing the same Institutional Learning Outcome.
Goal: To document what occurred as a result of the assessment.
Section II: Learning Outcome Alignment
Instructions: Please list all program learning outcomes (PLOs) that align with the institutional learning outcome.
Section III: Methodology, Goals & Participation
- Name of assignment
- Type of assessment (Direct; Indirect)
- Full description of assignment (narrative)
- PLOs (from the aforementioned list) the assignment assesses
- Quantifiable assessment goal(s) for the assignment
- Type of scoring mechanism used
- Attachment of scoring tool highlighting what is being assessed
- Participation: list of titles and assessment roles for those who participated in the assessment process
Section III components
PLOs (from the aforementioned list) the assignment assesses:
- It is possible that not all PLOs will be assessed by the assignment
- Goal: To determine, over time, which PLOs are and are not being assessed
Quantifiable assessment goal(s) for the assignment:
- To determine how many students are achieving at a specific level/score
- To determine if differences in scores exist between two or more groups
- To determine if scores from one assignment predict scores of another assignment
Section III components (cont.)
Type of scoring mechanism used:
- Scoring guide, rubric, Scantron, professional judgment
Attachment of scoring tool highlighting what is being assessed:
- Example: a rubric
Participation:
- Faculty, faculty committee, program assessment committee, Deans, Institutional Research & Effectiveness
- Goal: To keep track of and demonstrate program participation
Section IV: Results
- Name of assignment
- Analytical approach (should align with the assessment goal!):
  - To determine how many students are achieving at a specific level/score: frequency distribution
  - To determine if differences in scores exist between two or more groups: chi-square, t-test, or ANOVA
  - To determine if scores from one assignment predict scores of another assignment: regression
- Sample size: number of students assessed
- Statistical results: frequency table, central tendency, standard deviation, test statistic, degrees of freedom, p value
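To make the mapping from assessment goal to analysis concrete, here is a minimal sketch in Python. The scores and variable names are hypothetical, scipy is assumed to be available, and none of this is part of the template itself:

```python
from collections import Counter

from scipy import stats

# Hypothetical scores for two groups of students (illustrative only).
scores_a = [78, 85, 90, 72, 88, 95, 70, 83]
scores_b = [65, 80, 75, 70, 68, 77, 74, 81]

# Goal 1, achievement at a specific level: frequency distribution.
levels = ["pass" if s >= 70 else "fail" for s in scores_a + scores_b]
print(Counter(levels))  # e.g. Counter({'pass': 14, 'fail': 2})

# Goal 2, differences between two groups: independent-samples t-test
# (chi-square or ANOVA would be used for categorical data or 3+ groups).
t_stat, p_value = stats.ttest_ind(scores_a, scores_b)
print(f"t = {t_stat:.2f}, df = {len(scores_a) + len(scores_b) - 2}, p = {p_value:.3f}")

# Goal 3, one assignment predicting another: simple linear regression.
fit = stats.linregress(scores_a, scores_b)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.3f}")
```

Whichever test is reported, the sample size, test statistic, degrees of freedom, and p value from the output above are exactly the fields the Results section asks for.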
Section V: Discussion & Implications
- Name of assignment
- Restate the assignment goal
- Was the goal reached (Yes/No)?
- How do the results relate back to the ILO? (narrative)
- How are the results being used? (narrative)
Example
Scenario: Following a discussion between faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Evidence-Based Practice will be assessed using 4th-year preceptor evaluations.
Question: What do we need to assess this assignment?
Example: 4th-year preceptor evaluations to assess Evidence-Based Practice
Things to consider:
- Which PLO does this assignment address?
- How is the assignment graded?
- Who has the data?
- What are the assessment goals (standards of success)?
- How do we analyze the data?
Example: 4th-year preceptor evaluations to assess Evidence-Based Practice
Assignment: The preceptor evaluation of students occurs at various time points within the 4th-year rotations. For the purpose of assessment, the program has decided to use the students' last preceptor evaluation. The preceptor is asked to indicate, using a Yes/No format, whether a student has been observed demonstrating a list of certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form. The data are sent directly to the 4th-year Director. To assess Evidence-Based Practice, a single item within the checklist is used: "The student displays evidence-based practice."
Example: 4th-year preceptor evaluations to assess Evidence-Based Practice
Assessment Goal: 90% of students will demonstrate evidence-based practice skills.
Why did we come up with 90%?
- For grading, students need to achieve a score of 70% or higher, and each evaluation of "Yes" = 1 point; thus 14 points out of 20 are required to pass.
- It is possible for all students to score 0 on the EBP item.
- For assessment purposes, we are striving for 90% of students to display EBP skills in their last rotation within the curriculum.
Remember the signature assignment approach.
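The arithmetic behind these two thresholds can be checked in a few lines. This is a minimal sketch; the cohort size of 20 is taken from the data in this example:

```python
import math

# Grading threshold: a score of 70% or higher on the 20 yes/no items,
# with each "Yes" worth 1 point, means 14 points are needed to pass.
total_items = 20
points_to_pass = round(0.70 * total_items)
print(points_to_pass)  # 14

# Assessment goal: 90% of students displaying EBP. For a cohort of 20,
# that means at least this many students must be scored "Yes" on the item:
cohort = 20
students_needed = math.ceil(0.90 * cohort)
print(students_needed)  # 18
```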
Example: Data of 4th-year preceptor evaluations to assess Evidence-Based Practice
EBP Score: 0 = no, 1 = yes
Gender: 1 = male, 2 = female

Student  EBP Score  Gender
1        1          2
2        1          2
3        1          1
4        0          1
5        1          2
6        1          1
7        0          1
8        1          1
9        1          1
10       0          1
11       0          1
12       0          1
13       1          1
14       0          2
15       1          2
16       0          1
17       0          2
18       0          1
19       1          2
20       1          2
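The frequency distribution for this dataset can be computed directly; a minimal sketch using only the standard library:

```python
# Frequency distribution of the EBP item for the 20 students listed above
# (0 = no, 1 = yes); this reproduces the frequency table in the Results slide.
ebp_scores = [1, 1, 1, 0, 1, 1, 0, 1, 1, 0,   # students 1-10
              0, 0, 1, 0, 1, 0, 0, 0, 1, 1]   # students 11-20

n = len(ebp_scores)
yes = sum(ebp_scores)
no = n - yes
print(f"No:  {no:2d} ({no / n:.1%})")    # No:   9 (45.0%)
print(f"Yes: {yes:2d} ({yes / n:.1%})")  # Yes: 11 (55.0%)
print(f"90% goal reached: {yes / n >= 0.90}")  # 90% goal reached: False
```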
Example
Name of assignment: 4th-year preceptor evaluation
Type of assessment (Direct; Indirect): Direct
Provide a full description of the assignment: Preceptors indicate, using a Yes/No format, whether students are observed demonstrating a list of certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form.
Which PLO(s) from the list in Section II (above) will this assignment assess? (Please list): PLO 2
Please state the assessment goal(s) for the assignment. What is the quantifiable standard(s) of success for this assignment?: 90% of students will demonstrate evidence-based practice skills.
When does the assignment take place in the curriculum? (Year in program, Semester): This is the very last preceptor evaluation, during the 4th-year Spring semester.
Type of scoring mechanism used: Yes/No scoring guide for the item "The student displays evidence-based practice"
Participants: Faculty, the Curriculum Committee, the Assessment Committee, and the Dean selected the assignment; 4th-year preceptors evaluated students; the 4th-year program director collected the data; the Assessment Committee analyzed the data.
Example: Results
Name of assignment: 4th-year preceptor evaluation
Analytical approach: Frequency distribution
Sample size: N = 20
Statistical result:

         Frequency  Percent
  No         9       45.0%
  Yes       11       55.0%
  Total     20      100.0%
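The same dataset could also illustrate the group-difference approach from Section IV. A hedged sketch, assuming scipy is available; the gender comparison below is purely illustrative and not part of the program's stated analysis:

```python
# Group-difference sketch: does EBP demonstration differ by gender?
# Contingency table tallied from the 20 students in the data slide.
from scipy.stats import chi2_contingency

table = [[5, 7],   # male:   5 yes, 7 no
         [6, 2]]   # female: 6 yes, 2 no
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

Note that with only 20 students the cell counts are small, so in practice a Fisher exact test might be preferred over chi-square here.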
Example: Discussion & Implications
Assignment 1: 4th-year preceptor evaluation
Please restate the assessment goal(s): 90% of students will demonstrate evidence-based practice skills.
Was the goal reached? (Yes/No): No; only 55% of students demonstrated evidence-based practice skills.
How do the results relate back to the ILO?: Only a slight majority of students demonstrated evidence-based practice skills during the final phase of their education within the curriculum.
How are the findings being used?: The program is determining (1) whether preceptors know what to look for when evaluating students, (2) whether there are predictors of student success for this assignment, (3) whether previous 4th-year evaluations lead to a different conclusion, and (4) whether the assignment is sufficiently rigorous.
GROUP WORK TIME!!!
Timeline for Programs
- Distribute Template: April 3, 2013
- Section I: Progress Report; Section II: Institutional Learning Outcome & Program Learning Outcome Alignment; Section III: Methodology, Assessment Goals, & Participation: May 3, 2013
- Section IV: Results: June 7, 2013
- Assessment Report Due: July 31, 2013
Questions? Concerns?