Developing Learning Assessment Plans and Outcomes Assessments: Challenges and Opportunities in Medical Education Research Robert A. DiTomasso, Ph.D., ABPP.

Developing Learning Assessment Plans and Outcomes Assessments: Challenges and Opportunities in Medical Education Research
Robert A. DiTomasso, Ph.D., ABPP
Professor & Chairman, Department of Psychology
Director of Institutional Outcomes Assessment, PCOM

Three Critical Concepts
Learning – a relatively permanent change in behavior that occurs as a result of a student's experience and is not due to other causes
Assessment – a formalized, systematic process of determining the extent to which learning has occurred
Outcome – reliable and valid evidence that learning has occurred

Use of Institutional Research
Evidence of accomplishing the mission
Continuing self-assessment process
Connecting the strategic plan to learning outcomes assessment
Assessing student achievement, program effectiveness, and opportunities for improvement
Measuring clinical competencies

Accreditation Standards
Outcomes = the results of the COM's efforts in attaining its mission and goals

Learning Assessment Plan
An accurate, practical, comprehensive, detailed, efficient, effective, valid, and reliable assessment plan or scheme specifically designed to evaluate the nature and extent of the student learning being achieved

Learning Assessment Plan
Delineates the relationship between institutional, programmatic, and course-level goals and the expected student learning outcomes
Documents the congruence between the aims of an educational organization and its administration, programs, faculty, students, and other consumers

Learning Assessment Plan
Identifies:
– learners
– teachers
– sources of learning
– types of learning experiences
– the nature, timing, sequence, and organization of these learning experiences
– formative and summative learning outcome assessments

Learning Assessment Plan
Unique plan for the DO program
Addresses what is most important to know about outcomes
Easy to implement and manage
Easy to understand and explain
Non-threatening
Well organized and methodical

Learning Assessment Plan
Time-sensitive
Feedback-based
Utilizes multiple assessments, modalities, informants, and time points
Relies on institutional commitment

10 Important Reasons for Developing a Learning Assessment Plan
1. Failing to plan is planning to fail
2. It's better to know than to assume
3. In this case, absence of evidence is evidence of absence
4. If you don't ask the important questions, someone else surely will
5. The best defense is a good offense

10 Important Reasons
6. Only the strong survive
7. Some questions may never be truly answered without resorting to data
8. To ask is to know
9. There is strength in numbers
10. Information is power

Functions Served by LAPs
Linking
Informational
Directive
Pinpointing
Reinforcement
Mapping

Advantages of LAPs
Relates numerous processes to multiple critical outcomes
Serves as the thread that connects or binds diverse educational goals and activities
Guides the educational improvement process

Advantages of LAPs
Relates:
– what needs to be learned
– how it is to be learned
– how we know it is learned or not learned
– what we can do to make it better

Disadvantages of LAPs
You may find out what you don't want to know (but need to know)
May demolish some cherished beliefs or practices

Important Considerations for Success
Outcomes assessment must become part of the culture of the institution
Outcomes assessment is not done for accreditors and then forgotten about until the next review
Outcomes assessment is a critical pathway along the road to excellence

Important Considerations for Success
Outcomes assessment is necessary and sufficient for:
– documenting educational quality
– maintaining educational quality
– improving educational quality

Components of LAPs
1. Delineation of the core competencies students are expected to acquire
2. Identification of all courses and course-related educational experiences directly related to each competency
3. Identification of all formative and summative assessment measures

Components of LAPs
4. Identification of the point in the program when each assessment is conducted
5. Specification of the exact outcome information obtained
6. Specification of feedback loops
(A data-structure sketch of these six components follows below.)
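As a rough, hypothetical illustration (not part of the original slides), components 1–6 can be captured in a small machine-readable structure so that each competency carries its related courses, assessment measures, time points, outcome information, and feedback recipients. The field names, competency, and course titles below are assumptions made for the sketch only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LAPEntry:
    """One row of a hypothetical Learning Assessment Plan.

    Maps a single core competency (component 1) to its related courses and
    experiences (2), assessment measures (3), the points in the program when
    assessment occurs (4), the exact outcome information collected (5), and
    the feedback-loop recipients (6).
    """
    competency: str
    courses: List[str] = field(default_factory=list)
    formative_assessments: List[str] = field(default_factory=list)
    summative_assessments: List[str] = field(default_factory=list)
    assessment_timepoints: List[str] = field(default_factory=list)
    outcome_information: List[str] = field(default_factory=list)
    feedback_recipients: List[str] = field(default_factory=list)

# Hypothetical example entry; all names are illustrative only.
entry = LAPEntry(
    competency="Patient communication",
    courses=["Clinical Skills I", "Standardized Patient Lab"],
    formative_assessments=["OSCE checklist (practice)"],
    summative_assessments=["OSCE checklist (graded)", "Preceptor rating"],
    assessment_timepoints=["End of Year 1", "End of Year 3 clerkship"],
    outcome_information=["Checklist scores", "Preceptor Likert ratings"],
    feedback_recipients=["Curriculum committee", "Course director", "Student"],
)
print(entry.competency, "->", len(entry.summative_assessments), "summative measures")
```

In practice a spreadsheet or curriculum-mapping database can serve the same purpose; the point of the sketch is that every competency row answers all six component questions.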

10-Step Process
1. Obtain institutional support and involve key players in the process
2. Delineate the specific core competencies you expect your students to acquire
3. Identify all courses and related educational experiences directly related to each competency

10-Step Process
4. Identify all formative and summative assessment measures

10-Step Process
5. Identify the point in the program when each assessment is conducted
6. Ascertain that the data to be obtained will reliably and validly measure the competency (a reliability-check sketch follows below)
7. Assume a collaborative focus with an eye toward excellence
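Step 6 concerns whether the planned measures are reliable and valid. A minimal sketch of one common internal-consistency check, Cronbach's alpha, assuming item-level scores are already available as a students-by-items matrix; the data, variable names, and the 0.70 rule of thumb are illustrative assumptions, not requirements stated in the slides.

```python
import numpy as np

def cronbachs_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a students x items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical checklist data: 6 students x 4 OSCE checklist items.
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
alpha = cronbachs_alpha(scores)
print(f"Cronbach's alpha = {alpha:.2f}")  # values below ~0.70 are often flagged for review
```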

10-Step Process
8. Design a system for data collection that is efficient, minimally intrusive, and embedded within the program
9. Provide assistance with data entry, analysis, interpretation, and reporting
10. Develop a system for feeding information back to each program, faculty, administration, committees, and students (a feedback-summary sketch follows below)
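Steps 8–10 describe embedded data collection and feedback loops. A minimal sketch of how collected results might be grouped by competency and summarized for feedback to programs, faculty, and committees; the record format, competency names, and pass threshold are hypothetical assumptions for the sketch.

```python
from collections import defaultdict

# Hypothetical embedded records: (competency, student_id, score out of 100).
records = [
    ("Patient communication", "S001", 88),
    ("Patient communication", "S002", 64),
    ("Clinical reasoning", "S001", 92),
    ("Clinical reasoning", "S002", 75),
    ("Clinical reasoning", "S003", 58),
]

PASS_THRESHOLD = 70  # illustrative cutoff only

# Group scores by competency so each program or committee receives its slice.
by_competency = defaultdict(list)
for competency, student_id, score in records:
    by_competency[competency].append(score)

# Summary suitable for feeding back to programs, faculty, committees, and students.
for competency, comp_scores in by_competency.items():
    mean_score = sum(comp_scores) / len(comp_scores)
    pass_rate = sum(s >= PASS_THRESHOLD for s in comp_scores) / len(comp_scores)
    print(f"{competency}: n={len(comp_scores)}, mean={mean_score:.1f}, pass rate={pass_rate:.0%}")
```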

THE END