OUTCOMES ASSESSMENT WORKSHOP
May 16, 2016
JRCERT Mission

The JRCERT promotes excellence in education and elevates the quality and safety of patient care through the accreditation of educational programs in radiography, radiation therapy, magnetic resonance, and medical dosimetry.
Program Statistics (May 2016)

Radiography: 609
Radiation Therapy: 73
Magnetic Resonance: 10
Medical Dosimetry: 19
2016 Year-to-Date Accreditation Actions

Total Considerations: 111
Interim Reports: 12
Initial: 4
Progress Reports & Substantive Changes: 44
Continuing: 50
Other: 1
2016 Year-to-Date Accreditation Actions

8 Year: 38
5 Year: 4
3 Year: 4
2 Year: 1
Probation: 3
Involuntary Withdrawal: 1
“Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what is easy, rather than a process of improving what we really care about.” (New Leadership Alliance, 2012)
What is Assessment?

A process that provides information to participants, allowing them to evaluate the process clearly, understand its overall quality, and identify areas for improvement. (New Leadership Alliance, 2012)
Assessment should reflect an understanding of student learning over time so as to reveal change, growth, and increasing degrees of integration. This approach aims for a more accurate picture of learning and therefore a firmer base for improving our students' educational experience.
What is Student Learning Outcomes Assessment?

The ongoing process of:
1. Establishing clear, measurable, expected SLOs
2. Systematically gathering, analyzing, and interpreting evidence to determine how well students' learning matches expectations
3. Using the resulting information to understand and improve student learning
4. Reporting on processes and results
Assessment Involves:

- Making your expectations explicit and public
- Using the resulting information to document, explain, and improve performance
Goal of Assessment?

Information-based decision making: "The end of assessment is action." Do not attempt to achieve the perfect research design… gather enough data to provide a reasonable basis for action. (Walvoord, 2010)
Pitfalls of Assessment

- Compliance with external demands
- Gathering data no one will use
- Making the process too complicated
Course Grades

- Course grades cannot pinpoint the concepts that students have or have not mastered
- Grading criteria often mix in attendance, participation, and bonus points
- Inter-rater reliability may be poor, or grading standards vague
- Not holistic

Do grades have a place in an assessment program?
Curriculum Map

Course   | SLO 1 | SLO 2 | SLO 3 | SLO 4
RAD 150  |       |       |       |
RAD 153  | I     | I     | I     |
RAD 154  | R     | I     | I     |
RAD 232  | R     | R     | R     | R
RAD 234  | R     | R     |       |
RAD 250  | M     | M     | M&A   | M
RAD 255  | M&A   |       |       | A

Key: "I" = introduce; "R" = reinforce, practice; "M" = mastery; "A" = assessed for program assessment
Alternative emphasis scale: 0 = no emphasis; 1 = minor emphasis; 2 = moderate emphasis; 3 = significant emphasis
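Where a program wants to audit a map like this mechanically, the grid can be treated as a small data structure. Below is a minimal sketch in Python; it assumes the reconstructed table above plus an illustrative completeness rule (every SLO should be introduced, reinforced, mastered, and assessed somewhere in the curriculum). It is one possible implementation, not a JRCERT-prescribed method.

```python
# Minimal sketch: encode the curriculum map above and flag SLOs that are
# never introduced (I), reinforced (R), mastered (M), or assessed (A).
# The completeness rule is illustrative, not a JRCERT requirement, and
# the cell placements follow the reconstructed table above.
curriculum_map = {
    "RAD 153": {"SLO 1": "I", "SLO 2": "I", "SLO 3": "I"},
    "RAD 154": {"SLO 1": "R", "SLO 2": "I", "SLO 3": "I"},
    "RAD 232": {"SLO 1": "R", "SLO 2": "R", "SLO 3": "R", "SLO 4": "R"},
    "RAD 234": {"SLO 1": "R", "SLO 2": "R"},
    "RAD 250": {"SLO 1": "M", "SLO 2": "M", "SLO 3": "M&A", "SLO 4": "M"},
    "RAD 255": {"SLO 1": "M&A", "SLO 4": "A"},
}
REQUIRED = {"I", "R", "M", "A"}

def coverage_gaps(cmap):
    """Return {SLO: missing levels} for every SLO lacking any of I, R, M, A."""
    seen = {}
    for marks in cmap.values():
        for slo, mark in marks.items():
            seen.setdefault(slo, set()).update(mark.split("&"))  # "M&A" counts as M and A
    return {slo: REQUIRED - levels
            for slo, levels in sorted(seen.items())
            if REQUIRED - levels}

for slo, missing in coverage_gaps(curriculum_map).items():
    print(f"{slo}: missing {sorted(missing)}")
# For the map above this prints:
#   SLO 2: missing ['A']
#   SLO 4: missing ['I']
```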
Types of Assessment

Student Learning (what students will do or achieve):
- Knowledge
- Skills
- Attitudes

Program Effectiveness (what the program will do or achieve):
- Certification Pass Rate
- Job Placement Rate
- Program Completion Rate
- Graduate Satisfaction
- Employer Satisfaction
Types of Assessment

Formative Assessment: gathering of information during the progression of a program; allows for student improvement prior to program completion.

Summative Assessment: gathering of information at the conclusion of a program.
Mission Statement

Our program is an integral part of the School of Allied Health Professions and shares its values. The program serves as a national leader in the education of students in the radiation sciences and provides learning opportunities that are innovative and educationally sound. In addition to exhibiting technical competence and the judicious use of ionizing radiation, graduates provide high quality patient care and leadership in their respective area of professional practice. Consideration is given to the effective use of unique resources and facilities. Strong linkages with clinical affiliates and their staff are vital to our success. Faculty and staff work in a cooperative spirit in an environment conducive to inquisitiveness and independent learning to help a diverse student body develop to its fullest potential. The faculty is committed to the concept of lifelong learning and promotes standards of clinical practice that will serve students throughout their professional careers.
Mission Statement

The mission of our program is to produce competent entry-level radiation therapists.
Goals

- Broad statements of student achievement that are consistent with the mission of the program
- Should address all learners and reflect clinical competence, critical thinking, communication skills, and professionalism
Goals should not:

- Contain assessment tools
- Contain increases in achievement
- Contain program achievements
Goals?

- The program will prepare graduates to function as entry-level ___.
- The faculty will assure that the JRCERT accreditation requirements are followed.
- Students will accurately evaluate images for diagnostic quality.
- 85% of students will practice age-appropriate patient care on the mock patient care practicum.
STUDENT LEARNING OUTCOMES
Student Learning Outcomes

- Specific
- Measurable
- Attainable
- Realistic
- Targeted
Student Learning Outcomes

Students will ______ ________.
(action verb)  (something)
Bloom's Taxonomy Action Verbs

KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline

COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate

APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write

ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test

SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write

EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

Lower-division course outcomes draw primarily on the lower levels (Knowledge, Comprehension, Application).
The same verb table applies to upper-division course outcomes and program outcomes, which draw primarily on the higher levels (Analysis, Synthesis, Evaluation).
Student Learning Outcomes?

- Students will be clinically competent.
- Students will complete 10 competencies with a grade ≥ 75% in RAD 227.
- Graduates will be prepared to evaluate and interpret images for proper evaluation criteria and quality.
- Students will demonstrate the ability to operate tube locks.
Assessment Measurements

The most important criterion when selecting an assessment method is whether it will provide useful information: information that indicates whether students are learning and developing in ways faculty have agreed are important. (Palomba & Banta, 2000)
Types of Assessment Measurements

Direct Assessment Measurements: demonstrate learning; performance-based activities allow students to demonstrate their skills.

Indirect Assessment Measurements: provide reflection about learning.
Measurements

Direct:
- Rubrics
- Unit or Final Exams
- Capstone Courses
- Portfolios
- Case Studies
- Embedded Questions

Indirect:
- Surveys (Graduate, Employer)
- Self-Evaluations
- Exit Interviews
- Focus Groups
- Reflective Essays
Change Your Focus on Assessment

From (unconsciously): making the program look good on paper.
To: How can I diagnose learning problems so that I can improve student learning?
Goal of Assessment

You cannot determine how to improve the program until you know how well the students have learned.
What should measurement tools do for you?

- MUST measure the outcome
- Should represent your students' achievements as accurately as possible
- Should assess not only whether the students are learning, but how well
Benchmarks

- A point of reference from which measurements may be made
- Something that serves as a standard by which others may be measured or judged
Benchmarks

- Standard of performance
- Realistic yet attainable
- External or internal
- No double quantifiers (qualifiers)
- Rating scale
Benchmarks (Examples)

Tool                                 | Benchmark                    | Timeframe
Capstone Course – Final Portfolio    | ≥ 94.5 pts (100-point scale) | 5th semester
Clinical Evaluation Form (Section 2) | ≥ 4.0 (5.0 scale)            | 5th semester
Debate Rubric                        | ≥ 31.5 points (35 possible)  | 4th semester
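Checking attainment against benchmarks like these is simple arithmetic once results are in hand. The following is a minimal Python sketch; the tool names and thresholds come from the example table, while the cohort means are invented for illustration.

```python
# Minimal sketch: compare each tool's cohort result to its benchmark.
# Tool names and thresholds follow the example table; the cohort means
# are hypothetical.
benchmarks = {
    "Capstone Course - Final Portfolio":    (94.5, 100.0),  # (threshold, scale)
    "Clinical Evaluation Form (Section 2)": (4.0, 5.0),
    "Debate Rubric":                        (31.5, 35.0),
}
results = {  # hypothetical cohort means on each tool
    "Capstone Course - Final Portfolio":    96.1,
    "Clinical Evaluation Form (Section 2)": 3.8,
    "Debate Rubric":                        32.0,
}

for tool, (threshold, scale) in benchmarks.items():
    mean = results[tool]
    status = "met" if mean >= threshold else "NOT met"
    print(f"{tool}: {mean} of {scale} (need >= {threshold}) -> benchmark {status}")
```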
Group Exercise 1 and Assessment Committee Work

Collection of Data and Data Analysis
Collect and Trend the Data

- Report the actual data
  - On the assessment plan
  - On a separate document
- Should facilitate comparison
  - Comparison of cohorts
  - Comparison of students attending a certain clinical setting
- Show dates
Data Analysis

- What do the data say about your students' mastery of subject matter, of research skills, or of writing and speaking?
- What do the data say about your students' preparation for taking the next career step?
- Do you see areas where performance is okay, but not outstanding, and where you'd like to see a higher level of performance?

(UMass-Amherst, OAPA: http://www.umass.edu/oapa/oapa/publications/)
How can assessment data be used?

Primary Uses:
- Curriculum Review
- Requests to Curriculum Committee
- Accreditation Reports and Reviews

Secondary Uses:
- Recruiting
- Alumni Newsletter
- Other publications
- Grants and other Funding

(UMass-Amherst, OAPA)
Data Analysis

- Identify benchmarks met
  - Sustained effort
  - Monitoring
  - Evaluate benchmarks
- Identify benchmarks not met
  - Targets for improvement
  - Study the problem before trying to solve it!
  - Evaluate benchmark
  - Identify 3 years of data (trend)
Data Collection and Analysis?

Outcome: Students will demonstrate radiation protection.
Benchmark: 85% of students will average a score of ≥ 5.0 (6.0 scale).
Results: 100% of students scored 5 or better.
Analysis/Action: Benchmark met.

Outcome: Students will select appropriate technical factors.
Benchmark: 75% of students will average a score of 85% or better.
Results: 100% of students scored 85% or better.
Analysis/Action: Continue to monitor.

Outcome: Employers will find our graduates proficient in radiation protection and safety.
Benchmark: 80% of employer surveys will rate graduates as Above Average or Excellent.
Results: 100% of employers rate our graduates as Above Average or Excellent in radiation protection skills.
Analysis/Action: No action needed.
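Benchmarks phrased as a share of students reaching a cutoff, like the first row above, reduce to counting the fraction of the cohort at or above that cutoff. A minimal Python sketch follows; the 5.0 cutoff and 85% requirement come from the example, and the individual scores are invented.

```python
# Minimal sketch: evaluate an "85% of students will score >= 5.0 (6.0 scale)"
# benchmark, as in the first row of the example. The scores are hypothetical.
scores = [5.2, 5.8, 4.9, 5.5, 6.0, 5.1, 5.7, 4.6]  # one entry per student
cutoff = 5.0           # each student should reach at least 5.0
required_share = 0.85  # ...and at least 85% of the cohort should do so

share = sum(score >= cutoff for score in scores) / len(scores)
status = "met" if share >= required_share else "NOT met"
print(f"{share:.0%} of students scored >= {cutoff} -> benchmark {status}")
```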
2009–2010 Data Collection and Analysis Example

Outcome: Students will demonstrate radiation protection.
Benchmark: ≥ 5.0 (6.0 scale)
Results: 5.28
Analysis/Action: Benchmark met. For the past 2 years this result has increased (07/08: 4.90; 08/09: 5.15). This may be attributed to an increased emphasis on radiation protection throughout the semester.

Outcome: Graduates will manipulate the 'typical' examination protocol to meet the needs of a trauma patient.
Benchmark: ≥ 4.0 (5.0 scale)
Results: 3.40
Analysis/Action: Benchmark not met. This result continually improves with each cohort (07/08: 3.25; 08/09: 3.33). The increased amount of lab time throughout the curriculum may account for the improvement. Continue to monitor.
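The trending described in the Analysis/Action column can also be computed rather than eyeballed. A minimal Python sketch follows, using the cohort values from the example above; the shortened outcome labels and the strict-improvement test are illustrative choices.

```python
# Minimal sketch: trend a result across cohorts and report whether the
# latest cohort met the benchmark and whether scores are strictly improving.
# Values are taken from the 2009-2010 example above; labels are shortened.
outcomes = {
    "Radiation protection (>= 5.0 of 6.0)": {
        "threshold": 5.0,
        "by_year": {"07/08": 4.90, "08/09": 5.15, "09/10": 5.28},
    },
    "Trauma protocol adaptation (>= 4.0 of 5.0)": {
        "threshold": 4.0,
        "by_year": {"07/08": 3.25, "08/09": 3.33, "09/10": 3.40},
    },
}

for name, data in outcomes.items():
    years = sorted(data["by_year"])  # "07/08" < "08/09" < "09/10" sorts correctly
    scores = [data["by_year"][y] for y in years]
    improving = all(a < b for a, b in zip(scores, scores[1:]))
    met = scores[-1] >= data["threshold"]
    trend = "improving" if improving else "not strictly improving"
    print(f"{name}: {scores} ({trend}); latest benchmark {'met' if met else 'NOT met'}")
```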
Analysis Group Exercise
NOW WHAT?
Assessment works best… when it is ongoing.
Ongoing Assessment

- Is cumulative
- Is fostered when assessment involves a linked series of activities undertaken over time
- May involve tracking progress of individuals or cohorts
- Is done in the spirit of continuous improvement
Closing the Cycle

The process of drawing conclusions should be open to all those who are likely to be affected by the results: the communities of interest. Analysis of the assessment data needs to be shared and formally documented, for example in meeting minutes from the Assessment or Advisory Committee.
Evaluation of the Assessment Plan (Objective 5.5)

- Evaluate the assessment plan itself to assure that assessment measures are adequate.
- Evaluation should assure that assessment is effective in measuring student learning outcomes.
- Document with meeting minutes.
Answer These Questions

- Is our mission statement still applicable for what the program is trying to achieve?
- Are we measuring valuable outcomes that are indicative of the graduate we are trying to produce?
- Do we like the plan? Does the plan provide us with the data that we are seeking?
- Are the student learning outcomes still applicable? Are the SLOs measurable?
Answer These Questions

- Do we want to use new tools to collect data for the SLOs?
- Are our benchmarks appropriate? Do our benchmarks need adjustment?
- Are the appropriate personnel collecting the data?
- Are the data collection timeframes appropriate?

Make recommendations based on the answers.
Keeping Your Documentation

For each year, keep:
1. A copy of the assessment plan
2. The actual tools for each one identified in the plan (do calculations on these tools)
3. A blank example of each tool
4. Meeting minutes that document analysis and sharing of the SLO and program effectiveness (PED) data
5. Documentation of examples of changes that were implemented as a result of data gleaned from the assessment process
6. Meeting minutes documenting that the assessment plan has been evaluated to assure that measures are adequate and that the process is effective in measuring SLOs
JRCERT Contact Information

mail@jrcert.org
www.jrcert.org
20 North Wacker Drive, Suite 2850, Chicago, IL 60606-3182
(312) 704-5300
THANK YOU for supporting excellence in education and quality patient care through programmatic accreditation!
References and Bibliography

Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing Company.

New Leadership Alliance. (2012). Assuring quality: An institutional self-assessment tool for excellent practice in student learning outcomes assessment. Washington, DC: New Leadership Alliance for Student Learning and Accountability.

Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.

Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.

Wonderlic. (n.d.). Direct assessment of student learning [Video]. YouTube. Retrieved May 6, 2014, from http://www.youtube.com/watch?v=JjKs8hsZosc