Presentation transcript: "The JRCERT promotes excellence in education and elevates quality and safety of patient care through the accreditation of educational programs." (November 9, 2015, Chicago, Illinois)

1

2 November 9, 2015, Chicago, Illinois

3 The JRCERT promotes excellence in education and elevates quality and safety of patient care through the accreditation of educational programs in radiography, radiation therapy, magnetic resonance, and medical dosimetry.

4 Laura S. Aaron, Ph.D., R.T.(R)(M)(QM), FASRT, Chair; Stephanie Eatmon, Ed.D., R.T.(R)(T), FASRT, 1st Vice Chair; Tricia Leggett, D.H.Ed., R.T.(R)(QM), 2nd Vice Chair; Darcy Wolfman, M.D., Secretary/Treasurer

5 Laura Borghardt, M.S., CMD; Susan R. Hatfield, Ph.D.; Bette A. Schans, Ph.D., R.T.(R); Jason L. Scott, M.B.A., R.T.(R)(MR), CRA, FAHRA; Loraine D. Zelna, M.S., R.T.(R)(MR)

6 Leslie F. Winter, CEO; Jay Hicks, Executive Associate Director; Traci Lang, Assistant Director

7 Tom Brown, Accreditation Specialist; Jacqueline Kralik, Accreditation Specialist; Brian Leonard, Accreditation Specialist

8 Radiography: 613; Radiation Therapy: 74; Magnetic Resonance: 10; Medical Dosimetry: 17

9 Total Considerations: 378 (Interim Reports: 151; Initial: 9; Progress Reports: 29; Continuing: 80; Other: 109)

10 8-Year: 59; Probation: 5; 5-Year: 13; 2-Year: 2; 3-Year: 6; Involuntary Withdrawal: 3

11

12 "Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what is easy, rather than a process of improving what we really care about." (New Leadership Alliance, 2012)

13 A process that provides information to participants, allowing clear evaluation of the process, an understanding of its overall quality, and the opportunity to identify areas for improvement. (New Leadership Alliance, 2012)

14

15 The ongoing process of:
1. Establishing clear, measurable, expected SLOs
2. Systematically gathering, analyzing, and interpreting evidence to determine how well students' learning matches expectations
3. Using the resulting information to understand and improve student learning
4. Reporting on processes and results

16
- Making your expectations explicit and public
- Using the resulting information to document, explain, and improve performance

17
- Information-based decision making
- "The end of assessment is action"
- Do not attempt to achieve the perfect research design; gather enough data to provide a reasonable basis for action. (Walvoord, 2010)

18
- Compliance with external demands
- Gathering data no one will use
- Making the process too complicated

19
- A course grade cannot pinpoint which concepts students have or have not mastered
- Grading criteria often include attendance, participation, and bonus points
- Inter-rater reliability problems or vague grading standards
- Not holistic
- Do grades have a place in an assessment program?

20 Curriculum map example ("I" = introduce; "R" = reinforce, practice; "M" = mastery; "A" = assessed for program assessment; emphasis scale: 0 = none, 1 = minor, 2 = moderate, 3 = significant):

Course    SLO 1  SLO 2  SLO 3  SLO 4
RAD 150   -      -      -      -
RAD 153   I      I      I      -
RAD 154   R      I      I      -
RAD 232   R      R      R      R
RAD 234   R      R      -      -
RAD 250   M      M      M&A    M
RAD 255   M&A    -      -      A
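To make a map like this machine-checkable, here is a minimal Python sketch. The course names and I/R/M/A markers follow the reconstructed table above (itself a best-effort reading of the flattened slide), and the "never assessed" check is my own illustration, not a JRCERT requirement:

```python
# Curriculum map from the slide, as a plain dict; cell placement is a
# best-effort reconstruction (I=introduce, R=reinforce, M=mastery,
# A=assessed for program assessment).
curriculum_map = {
    "RAD 150": {},
    "RAD 153": {"SLO 1": "I", "SLO 2": "I", "SLO 3": "I"},
    "RAD 154": {"SLO 1": "R", "SLO 2": "I", "SLO 3": "I"},
    "RAD 232": {"SLO 1": "R", "SLO 2": "R", "SLO 3": "R", "SLO 4": "R"},
    "RAD 234": {"SLO 1": "R", "SLO 2": "R"},
    "RAD 250": {"SLO 1": "M", "SLO 2": "M", "SLO 3": "M&A", "SLO 4": "M"},
    "RAD 255": {"SLO 1": "M&A", "SLO 4": "A"},
}

def markers_for(slo):
    """Collect every marker a given SLO receives across all courses."""
    marks = set()
    for course in curriculum_map.values():
        marks.update(m for m in course.get(slo, "").split("&") if m)
    return marks

# Flag any SLO that is never assessed ("A") anywhere in the curriculum.
for slo in ("SLO 1", "SLO 2", "SLO 3", "SLO 4"):
    if "A" not in markers_for(slo):
        print(f"{slo} is never assessed for program assessment")
```

Run as written, this flags SLO 2, which in the reconstructed map is introduced, reinforced, and mastered but never marked "A".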

21 Student learning: what students will do or achieve (knowledge, skills, attitudes).
Program effectiveness: what the program will do or achieve (certification pass rate, job placement rate, program completion rate, graduate satisfaction, employer satisfaction).

22 Formative assessment: gathering information during the progression of a program; allows for student improvement prior to program completion.
Summative assessment: gathering information at the conclusion of a program.

23

24 Mission Statement: Our program is an integral part of the School of Allied Health Professions and shares its values. The program serves as a national leader in the education of students in the radiation sciences and provides learning opportunities that are innovative and educationally sound. In addition to exhibiting technical competence and the judicious use of ionizing radiation, graduates provide high quality patient care and leadership in their respective area of professional practice. Consideration is given to the effective use of unique resources and facilities. Strong linkages with clinical affiliates and their staff are vital to our success. Faculty and staff work in a cooperative spirit in an environment conducive to inquisitiveness and independent learning to help a diverse student body develop to its fullest potential. The faculty is committed to the concept of lifelong learning and promotes standards of clinical practice that will serve students throughout their professional careers.

25 Mission Statement: The mission of our program is to produce competent entry-level radiation therapists.

26 Goals:
- Broad statements of student achievement that are consistent with the mission of the program
- Should address all learners and reflect clinical competence, critical thinking, communication skills, and professionalism

27 Goals should not:
- Contain assessment tools
- Contain increases in achievement
- Contain program achievements

28 Goals (examples):
- The program will prepare graduates to function as entry-level ___.
- The faculty will assure that the JRCERT accreditation requirements are followed.
- Students will accurately evaluate images for diagnostic quality.
- 85% of students will practice age-appropriate patient care on the mock patient care practicum.

29

30

31 SMART: Specific, Measurable, Attainable, Realistic, Targeted

32 SLO template: "Students will [action verb] [something]."

33

34 Bloom's taxonomy verb table (lower-division course outcomes draw mainly from the left-hand levels):
Knowledge: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
Comprehension: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
Application: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
Analysis: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test
Synthesis: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
Evaluation: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
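As a quick illustration of how this verb table pairs with the "Students will [action verb] [something]" template from slide 32, here is a hypothetical Python sketch; only a handful of verbs per level are included, and the lookup logic is my own, not part of the presentation:

```python
# A small excerpt of the verb table above, keyed by taxonomy level.
BLOOM_VERBS = {
    "Knowledge":     {"cite", "define", "identify", "list", "state"},
    "Comprehension": {"classify", "discuss", "explain", "predict"},
    "Application":   {"apply", "calculate", "demonstrate", "operate"},
    "Analysis":      {"analyze", "compare", "differentiate", "examine"},
    "Synthesis":     {"compose", "construct", "design", "formulate"},
    "Evaluation":    {"appraise", "criticize", "evaluate", "judge", "rank"},
}

def bloom_level(slo):
    """Guess the taxonomy level from the verb after 'Students will ...'.
    Some verbs appear at several levels; the first match wins here."""
    words = slo.lower().replace("students will", "", 1).split()
    verb = words[0] if words else ""
    for level, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return level
    return "unknown (verb not in the excerpt)"

print(bloom_level("Students will evaluate images for diagnostic quality."))
# -> Evaluation
```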

35 (Same Bloom's taxonomy verb table as slide 34; upper-division course and program outcomes draw mainly from the higher levels.)

36 SLOs (examples):
- Students will be clinically competent.
- Students will complete 10 competencies with a grade ≥ 75% in RAD 227.
- Graduates will be prepared to evaluate and interpret images for proper evaluation criteria and quality.
- Students will demonstrate the ability to operate tube locks.

37 "The most important criterion when selecting an assessment method is whether it will provide useful information: information that indicates whether students are learning and developing in ways faculty have agreed are important." (Palomba & Banta, 2000)

38 Direct assessment measurements: demonstrate learning; performance learning allows students to demonstrate their skills through activities.
Indirect assessment measurements: provide reflection about learning.

39 Direct: Rubrics, Unit or Final Exams, Capstone Courses, Portfolios, Case Studies, Embedded Questions
Indirect: Surveys (Graduate, Employer), Self-Evaluations, Exit Interviews, Focus Groups, Reflective Essays

40 From (unconsciously): making the program look good on paper.

41

42 You cannot determine how to improve the program until you know how well the students have learned.

43
- MUST measure the outcome
- Should represent your students' achievements as accurately as possible
- Should assess not only whether the students are learning, but how well

44 Benchmark:
- A point of reference from which measurements may be made
- Something that serves as a standard by which others may be measured or judged

45
- Standard of performance
- Realistic yet attainable
- External or internal
- No double quantifiers (qualifiers)
- Rating scale

46
Tool                                   Benchmark                     Timeframe
Capstone Course - Final Portfolio      ≥ 94.5 pts (100-pt scale)     5th semester
Clinical Evaluation Form (Section 2)   ≥ 4.0 (5.0 scale)             5th semester
Debate Rubric                          ≥ 31.5 points (35 possible)   4th semester
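A benchmark table like this translates directly into data plus a pass/fail check. A minimal sketch: the tools and thresholds come from the table above, but the cohort averages in `results` are invented for illustration:

```python
# Benchmarks from the slide's table; "minimum" is the required score.
benchmarks = [
    {"tool": "Capstone Course - Final Portfolio",    "minimum": 94.5, "scale": 100},
    {"tool": "Clinical Evaluation Form (Section 2)", "minimum": 4.0,  "scale": 5.0},
    {"tool": "Debate Rubric",                        "minimum": 31.5, "scale": 35},
]

# Hypothetical cohort averages for each tool, in the same order.
results = [95.1, 3.8, 32.0]

for spec, score in zip(benchmarks, results):
    status = "met" if score >= spec["minimum"] else "NOT met"
    print(f"{spec['tool']}: {score} of {spec['scale']} -> benchmark {status}")
```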

47

48

49

50
- Report the actual data (on the assessment plan or on a separate document)
- Reporting should facilitate comparison: of cohorts, and of students attending a given clinical setting (see the sketch below)
- Show dates
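For example, keeping the raw numbers in a dated, cohort-keyed structure makes both kinds of comparison easy to produce; a hypothetical sketch (all labels and values invented):

```python
# Invented example data: one record per cohort, with collection dates,
# so cohort-to-cohort comparison is a simple loop.
cohorts = {
    "Class of 2014": {"clinical_eval_mean": 4.1, "n": 18, "collected": "2014-05"},
    "Class of 2015": {"clinical_eval_mean": 4.3, "n": 20, "collected": "2015-05"},
}

for name in sorted(cohorts):
    c = cohorts[name]
    print(f"{name}: mean {c['clinical_eval_mean']} "
          f"(n={c['n']}, collected {c['collected']})")
```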

51
- What do the data say about your students' mastery of subject matter, of research skills, or of writing and speaking?
- What do the data say about your students' preparation for taking the next career step?
- Do you see areas where performance is okay, but not outstanding, and where you'd like to see a higher level of performance?
(UMass-Amherst, OAPA: http://www.umass.edu/oapa/oapa/publications/)

52 Primary uses: curriculum review; requests to the curriculum committee; accreditation reports and reviews.
Secondary uses: recruiting; alumni newsletter; other publications; grants and other funding.
(UMass-Amherst, OAPA)

53
- Identify benchmarks met: sustained effort; monitoring; evaluate benchmarks
- Identify benchmarks not met: targets for improvement; study the problem before trying to solve it; evaluate the benchmark
- Identify 3 years of data (trend)

54 Data Collection and Analysis?

Outcome: Students will demonstrate radiation protection.
Benchmark: 85% of students will average a score of ≥ 5.0 (6.0 scale).
Results: 100% of students scored 5 or better.
Analysis/Action: Benchmark met.

Outcome: Students will select appropriate technical factors.
Benchmark: 75% of students will average a score of 85% or better.
Results: 100% of students scored 85% or better.
Analysis/Action: Continue to monitor.

Outcome: Employers will find our graduates proficient in radiation protection and safety.
Benchmark: 80% of employer surveys will rate grads as Above Average or Excellent.
Results: 100% of employers rate our grads as Above Average or Excellent in proficiency of radiation protection skills.
Analysis/Action: No action needed.

55 2009-2010 Data Collection and Analysis Example

Outcome: Students will demonstrate radiation protection.
Benchmark: ≥ 5.0 (6.0 scale).
Results: 5.28.
Analysis/Action: Benchmark met. For the past 2 years this result has increased (07/08: 4.90; 08/09: 5.15). This may be attributed to an increased emphasis on radiation protection throughout the semester.

Outcome: Graduates will manipulate the 'typical' examination protocol to meet the needs of a trauma patient.
Benchmark: ≥ 4.0 (5.0 scale).
Results: 3.40.
Analysis/Action: Benchmark not met. This result continually improves with each cohort (07/08: 3.25; 08/09: 3.33). The increased amount of lab time throughout the curriculum could be contributing to the improvement. Continue to monitor.
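The radiation-protection row above can be reproduced programmatically; a sketch using the slide's three years of numbers, where the met/trend summary logic is my own assumption rather than anything prescribed by the JRCERT:

```python
# Three years of mean scores from the slide's first row (6.0 scale).
history = {"2007-08": 4.90, "2008-09": 5.15, "2009-10": 5.28}
benchmark = 5.0  # benchmark: mean >= 5.0

years = sorted(history)  # chronological, since these keys sort lexically
latest = history[years[-1]]
improving = all(history[a] < history[b] for a, b in zip(years, years[1:]))

print(f"Latest mean: {latest} ({'met' if latest >= benchmark else 'not met'})")
print(f"Three-year trend: {'improving' if improving else 'mixed'}")
# -> Latest mean: 5.28 (met) / Three-year trend: improving
```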

56

57

58 When it is ongoing.

59
- Is cumulative
- Is fostered when assessment involves a linked series of activities undertaken over time
- May involve tracking progress of individuals or cohorts
- Is done in the spirit of continuous improvement

60
- The process of drawing conclusions should be open to all those likely to be affected by the results: the communities of interest.
- Analysis of the assessment data needs to be shared and formally documented, for example in meeting minutes from the Assessment or Advisory Committee.

61 (Objective 5.5)
- Evaluate the assessment plan itself to assure that assessment measures are adequate.
- Evaluation should assure that assessment is effective in measuring student learning outcomes.
- Document with meeting minutes.

62
- Is our mission statement still applicable for what the program is trying to achieve?
- Are we measuring valuable outcomes that are indicative of the graduate we are trying to produce?
- Do we like the plan?
- Does the plan provide us with the data we are seeking?
- Are the student learning outcomes still applicable?
- Are the SLOs measurable?

63
- Do we want to use new tools to collect data for the SLOs?
- Are our benchmarks appropriate?
- Do our benchmarks need adjustment?
- Are the appropriate personnel collecting the data?
- Are the data collection timeframes appropriate?
Make recommendations based on the answers.

64 Keeping Your Documentation. For each year:
1. Copy of the assessment plan
2. Actual tools for each one identified in the plan (do calculations on this tool)
3. Example of each tool (blank)
4. Meeting minutes that document analysis and sharing of the SLO and PED data
5. Documentation of examples of changes implemented as a result of data gleaned from the assessment process
6. Meeting minutes documenting that the assessment plan has been evaluated to assure that measures are adequate and that the process is effective in measuring SLOs
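One way to keep this checklist honest is to scaffold the same folder set at the start of every cycle. A hypothetical helper; the folder names are my own invention mirroring the six items above:

```python
from pathlib import Path

# One folder per checklist item (names invented for illustration).
CHECKLIST = [
    "01_assessment_plan",
    "02_completed_tools",          # actual tools with calculations done on them
    "03_blank_tool_examples",
    "04_analysis_minutes",         # minutes documenting SLO/PED data analysis
    "05_changes_implemented",
    "06_plan_evaluation_minutes",
]

def scaffold(year, root="assessment_records"):
    """Create the per-year documentation folders if they do not exist."""
    for item in CHECKLIST:
        Path(root, year, item).mkdir(parents=True, exist_ok=True)

scaffold("2015-2016")
```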

65 mail@jrcert.org | www.jrcert.org | 20 North Wacker Drive, Suite 2850, Chicago, IL 60606-3182 | (312) 704-5300

66 Thank you for supporting excellence in education and quality patient care through programmatic accreditation.

67 References:
- Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing Company, Inc.
- New Leadership Alliance. (2012). Assuring quality: An institutional self-assessment tool for excellent practice in student learning outcomes assessment. Washington, DC: New Leadership Alliance for Student Learning and Accountability.
- Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.
- Wonderlic direct assessment of student learning. (n.d.). YouTube. Retrieved May 6, 2014, from http://www.youtube.com/watch?v=JjKs8hsZosc
- Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.

