Quality Improvement in Osteopathic Medical Schools


Presented by:
Dr. Mary Pat Wohlford-Wessels, Director of Academic Quality & Curricular Affairs
Dr. Diane Hills, Associate Dean, Academic Affairs
Dr. David Garloff, Associate Dean, Clinical Affairs – Site Development

Des Moines University College of Osteopathic Medicine

Contents
- Introduction
- Definition of quality improvement
- Matrix of requirements
- DMU-COM structure
- DMU-COM process
- DMU-COM data sources
- Report structure

Overview

Virtually everyone in higher education is interested in improving the quality of education provided to students. Indeed, processes that support quality improvement must, at least to some degree, be in place to meet regional and specialty accreditation standards.

Presentation Rationale

Improving medical education requires systematic processes that support the review and assessment of the work we do. This presentation demonstrates how Des Moines University's College of Osteopathic Medicine has developed a comprehensive process that incorporates the work of existing committees and uses internal and external data sources to benchmark our work against best practice in osteopathic medical education.

Objectives

At the completion of this educational session, participants will be able to:
- Define academic quality
- Describe how an academic quality improvement process can support internal program review and the requirements of accrediting agencies
- Describe how to use internal and external data sources (AACOM, NCHEMS, AAMC) to benchmark practice

Definition of Academic Quality

Dr. Steve Spangehl defines a quality, or high-performance, organization as one that succeeds in satisfying its stakeholders' expectations by meeting or exceeding their needs. Quality becomes a journey, a search for better ways to understand the changing needs of an organization's stakeholders and for better ways to meet them. Since we can measure the performance of the various processes an organization uses to gauge and meet its stakeholders' needs, improvements are measurable, although quality itself is not. The size and regularity of those improvements testify to an organization's quality culture. Used this way, quality ought always to be an adjective, never a noun. Quality describes an organization that behaves in certain ways: it focuses on processes, bases decisions on facts and measurements, and looks at itself as an integrated system designed to achieve its ultimate mission and purposes. (http://www.ncacasi.org/)

Internal Assessment

Early in the development of the Quality Initiative at DMU, leadership assessed the internal and external requirements related to quality and outcomes. A matrix was developed and used to guide activities. The objective was NOT to add another layer of activities, but rather to support and enhance existing processes.

Quality and Outcomes Matrix

| Required component | AOA Standard | NCA Standard | DMU Outcomes Report Requirement | Proposed DMU Program Evaluation Requirement |
|---|---|---|---|---|
| Mission, goals, objectives | Standard One | Criterion 1; GIR 1, 2, 3 & 4 | Student outcomes must support goals and objectives | Elements I & II |
| Governance, administration, and finance | Standard Two | Criterion 2; GIR 5, 6 & 7; GIR 19, 20 & 21 | GIR 19, 20 & 21 (finances) | Element VI; academic program costs |

DMU-COM Performance Improvement Process: Inputs

DMU-COM Process

The COM Performance Improvement Committee meets monthly and is chaired by a faculty member. The business of the group is directed by a Gantt chart of monthly activities that drives continuous assessment and supports the development of an annual report. The committee chair reports to the faculty at large monthly.

Gantt Chart of Committee Activities

Performance Improvement Work Plan – 2005/2006

March 2005
- Approve the committee policy and procedure
- Approve committee membership and duties
- Review and provide input on 2005 report content

April
- Review content obtained from the retreat
- Review Faculty Development Survey results and make recommendations
- Review Organizational Profile results and make recommendations

May
- Review report of improvement activities related to the pre-clinical curriculum
- Review Dual Degree report
- Assess the use of technology in support of the curriculum

June
- Evaluate the research report
- Evaluate the community service report
- Evaluate student costs and tuition
- Evaluate student perceptions of student services

July
- Begin reviewing the early draft of the 2005 report
- Evaluate board score data
- Evaluate graduate feedback

August
- Review enrollment development and admissions data
- Initial report approved to be sent to the associate deans and department chairs

Data Utilization

Improving Medical Education through Data
- Institutional data
- AACOM
- NBOME
- Integrated Postsecondary Education Data System (IPEDS)
- AAMC
- National Center for Higher Education Management Systems (NCHEMS)
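As one illustration of how these sources can feed benchmarking, the sketch below joins an internal metric to an external benchmark series and computes the gap per year. It is a minimal pandas example under stated assumptions: the column names and figures are hypothetical placeholders, not actual DMU, IPEDS, or AACOM data.

```python
# Hypothetical sketch: join an internal metric to an external benchmark
# extract (e.g., an IPEDS or AACOM download) and compute the yearly gap.
# All column names and numbers here are illustrative placeholders.
import pandas as pd

internal = pd.DataFrame({"year": [2003, 2004, 2005],
                         "dmu_value": [70, 75, 84]})
benchmark = pd.DataFrame({"year": [2003, 2004, 2005],
                          "national_value": [68, 71, 80]})

merged = internal.merge(benchmark, on="year")
merged["gap"] = merged["dmu_value"] - merged["national_value"]  # DMU minus national
print(merged)
```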

Data Classified into Categories
- Enrollment development
- Admissions
- Costs & tuition
- Student perceptions
- Pre-clinical curriculum
- Attrition
- Clinical curriculum

Data Classified into Categories (continued)
- Board scores
- Internship & residency
- Research
- Scholarship
- Faculty development
- Community service

Data Use – Students

| Class | Cum. GPA | Science GPA | Non-science GPA | MCAT Verbal | MCAT Physical | MCAT Biological | MCAT Writing |
|---|---|---|---|---|---|---|---|
| 2007 | 3.50 | 3.44 | 3.52 | 8.21 | 8.15 | 8.72 | P |
| 2008 | 3.54 | 3.48 | 3.60 | 8.31 | 8.16 | 8.83 | Q |

Data Use

Data Use - Tuition

| Year | Tuition | Percent increase |
|---|---|---|
| 99/00 | $22,950 | – |
| 00/01 | $23,900 | 4% |
| 01/02 | $24,900 | 4% |
| 02/03 | $25,475 | 2% |
| 03/04 | $26,350 | 3% |
| 04/05 | $28,000 | 6% |
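The percent-increase column is straightforward year-over-year arithmetic. A minimal Python sketch of the calculation, using the tuition figures from the table above and rounding to whole percents as on the slide:

```python
# Year-over-year tuition increase, rounded to whole percents as on the slide.
tuition = {"99/00": 22950, "00/01": 23900, "01/02": 24900,
           "02/03": 25475, "03/04": 26350, "04/05": 28000}

years = list(tuition)
for prev, curr in zip(years, years[1:]):
    pct = (tuition[curr] - tuition[prev]) / tuition[prev] * 100
    print(f"{curr}: {pct:.0f}% increase")  # prints 4%, 4%, 2%, 3%, 6%
```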

Data Use – Student Satisfaction

| Question | 2003 | 2004 | 2005 | 2006 | 2005/2006 change |
|---|---|---|---|---|---|
| Quality of Campus Life | 69% | – | 71% | 81% | +10 |
| Quality of Student Centeredness | 57% | 49% | 75% | 82% | +7 |
| Quality of Academic Life | 77% | 67% | 87% | – | – |
| Quality of Administrative Services | 58% | 68% | – | 85% | +13 |
| Quality of the Student Service Office | 76% | – | – | 92% | +5 |
| Quality of Preclinical Education | – | – | 70% | 84% | +14 |
| Quality of Information About Rotations | 40% | – | 30% | 88% | +58 |
| Opportunities for Feedback | 65% | – | – | 60% | +11 |
| Effectiveness of Faculty Advisor | 41% | – | 35% | 52% | +17 |
| Quality of Financial Aid Service | – | – | – | 61% | +26 |
| Availability of Computer Technology | 66% | – | – | 72% | -4 |
| Effectiveness of Student Counseling | 74% | – | 83% | 90% | +7 |
| Quality of Student Health Service | – | – | – | 56% | +22 |

Data Use – Instructional Technology

Number of Blackboard sites for DO students, by school year:

| Class | 2001–02 | 2002–03 | 2003–04 | 2004–05 |
|---|---|---|---|---|
| DO 08 | – | – | – | 18 |
| DO 07 | – | – | 19 | 32 |
| DO 06 | – | 8 | 20 | 6 |
| DO 05 | 5 | 15 | – | – |
| DO 04 | 2 | – | – | – |
| DO 03 | – | – | – | – |
| Total BB sites | 12 | 25 | 44 | 56 |

Data Use – COMLEX I

Difference between DMU-COM scores and those of all peers:

| Discipline | June 01 | June 02 | June 03 | June 04 |
|---|---|---|---|---|
| Anatomy | -16 | +17 | -33 | +10 |
| Biochemistry | +22 | -2 | -10 | +21 |
| Physiology | +16 | +14 | +3 | +30 |
| Pharmacology | +19 | + | -21 | +15 |
| Pathology | -1 | +7 | -30 | +13 |
| Microbiology | -8 | -27 | +12 | – |

Data Use – COMLEX I

| Category | National Mean | DMU-COM Mean | Difference (rounded) |
|---|---|---|---|
| Physician skills: | | | |
| HP/DP | 548.66 | 596.09 | +47 |
| Hist. & Phy. | 531.37 | 541.18 | +10 |
| Diag. Tech. | 572.53 | 676.38 | +104 |
| Management | 519.29 | 539.30 | +20 |
| Science | 504.72 | 521.32 | +17 |
| Topic categories: | | | |
| Osteo. P&P | 518.03 | 549.03 | +31 |
| Gen. Osteo. | 506.39 | 523.08 | +17 |
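The difference column is simply the DMU-COM mean minus the national mean, rounded to the nearest point. A short sketch reproducing it from the figures on the slide:

```python
# Reproduce the "difference (rounded)" column: DMU-COM mean minus national mean.
comlex = {  # category: (national mean, DMU-COM mean), figures from the slide
    "HP/DP": (548.66, 596.09),
    "Hist. & Phy.": (531.37, 541.18),
    "Diag. Tech.": (572.53, 676.38),
    "Management": (519.29, 539.30),
    "Science": (504.72, 521.32),
    "Osteo. P&P": (518.03, 549.03),
    "Gen. Osteo.": (506.39, 523.08),
}

for category, (national, dmu) in comlex.items():
    print(f"{category}: {dmu - national:+.0f}")  # e.g., "HP/DP: +47"
```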

Data Use – Sources of Financial Support, 1992–2002 (reported by AACOM member institutions)

Data Use - Research

| Institution | Government Grants & Contracts | Private Gifts |
|---|---|---|
| A.T. Still University of Health Sciences | $4,194,080 | $2,922,386 |
| Des Moines University – Osteopathic Medical Center | $1,908,530 | $1,580,025 |
| Edward Via Virginia College of Osteopathic Medicine | $121,277 | $0 |
| Lake Erie College of Osteopathic Medicine | $1,011,447 | $92,512 |
| Midwestern University | $1,423,902 | $815,664 |
| – | $134,597 | $595,306 |
| New York Institute of Technology – Old Westbury | $1,734,769 | $2,280,070 |
| Philadelphia College of Osteopathic Medicine | $1,977,569 | $77,002 |
| Pikeville College | $1,345,682 | $4,123,065 |
| Touro College | $3,701,320 | $1,407,414 |
| University of Health Sciences – College of Osteopathic Medicine | $753,897 | $1,851,958 |
| University of New England | $3,897,588 | $2,770,343 |
| West Virginia School of Osteopathic Medicine | $268,193 | – |
| Western University of Health Sciences | $2,006,815 | $1,835,709 |

Comparison of Results Using Two Different Analysis Techniques - I

First, second, and third choice reasons for attending COMS (N = 101):

| Reason | First choice | Second choice | Third choice |
|---|---|---|---|
| Faculty/administration | 19 | 24 | 16 |
| Student body | 14 | 23 | 13 |
| Cost (tuition/living expenses) | 1 | 5 | 8 |
| Clinical rotations | – | – | – |
| University facilities | 2 | 4 | 3 |
| Reputation of program/faculty | 39 | 15 | – |
| Recommendation by friend/family | 10 | – | – |
| Geographic location | – | – | – |
| Received scholarship/grant | – | – | – |
| Blank responses | – | – | – |

Comparison of Results Using Two Different Analysis Techniques - II

The data above were recoded to rank the reasons students chose DMU. Responses were recoded to a four-point Likert scale:
4 = extremely important (3.50–4.00)
3 = very important (2.50–3.49)
2 = important (1.50–2.49)
1 = not important (1.00–1.49)

| Reason | N | Minimum | Maximum | Mean | Std. Deviation |
|---|---|---|---|---|---|
| Reputation of program and faculty | 101 | 1.00 | 4.00 | 2.5941 | 1.29752 |
| Faculty/administration | 101 | – | – | 2.1980 | 1.17490 |
| Student body | 101 | – | – | 1.9703 | 1.12655 |
| Recommendation by friend/family | 101 | – | – | 1.7129 | 1.08937 |
| Geographic location | 101 | – | – | 1.6337 | 1.01708 |
| Cost | 101 | – | – | 1.2079 | 0.57126 |
| Clinical rotations | 101 | – | – | 1.1782 | 0.55491 |
| University facilities | 101 | – | – | 1.1683 | 0.58428 |
| Received scholarship or grant | 101 | – | – | 1.0693 | 0.43029 |

Valid N (listwise) = 101. Reliability coefficient: Alpha = .93 (N of cases = 101.0).

The second technique yields additional information and helps target improvement activities. Tracking the gender and ethnicity of respondents will further assist the COM in making decisions about improvement.
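For readers who want to replicate the reliability figure reported above, below is a minimal sketch of Cronbach's alpha over a respondents-by-items matrix of recoded 1–4 ratings. The response matrix here is randomly generated filler, so it will not reproduce the reported Alpha = .93; the actual 101 response records would be needed for that.

```python
# Cronbach's alpha for a respondents x items matrix of 1-4 ratings.
# The data below are random placeholders; the slide's real responses
# (N = 101, 9 items) yielded alpha = .93.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]                           # number of items
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(101, 9))      # 101 respondents, 9 reasons
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```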

Performance Improvement Report
- Developed annually
- Distributed to program stakeholders
- 2004 report: organized around years 1–4 (admission, pre-clinical, clinical, residency)
- 2005 report: organized using the Baldrige Education Criteria

Baldrige Education Criteria
- Leadership
- Strategic planning
- Student, stakeholder, and market focus
- Measurement, analysis, and knowledge management
- Faculty and staff focus
- Process management
- Research (not an official Baldrige category)
- Performance results

(http://www.quality.nist.gov/)

Conclusion

Performance improvement at DMU has been a journey that began years ago with student outcomes assessment. The process has evolved into a comprehensive system that includes multiple stakeholders, all focused on improving the quality of the organizational culture and, ultimately, the performance of our graduates.

Who to Contact

If you have additional questions about the QI program and/or processes at DMU, please contact:

Dr. Mary Pat Wohlford-Wessels
Director, Academic Quality and Curricular Affairs
(515) 271-1636
Mary.Wohlford-Wessels@dmu.edu