Program Directors 101 Pando™ Workshop
Program Evaluation
Meredith Davison
Mike Roscoe
Objectives
- Differentiate between student outcomes assessment and program evaluation
- Describe the key components of an effective assessment or evaluation plan
- Identify instruments for efficient data collection
- Develop and implement an effective program evaluation plan
- Define the components of a needs assessment
- Develop and implement an effective needs assessment
- Define the elements of an ongoing self-study process
- Define and implement an effective ongoing self-study process
- Discuss data-driven program and curriculum changes
Program Evaluation
- CRITICAL to the success of PA programs
- An effective plan starts with a well-developed strategy for:
  - Data collection
  - Analysis
  - Action
- Evaluation (assessment) is one of the more difficult concepts program directors and faculty face
Three Components
Three major components of any assessment plan:
1) Data Collection
2) Data Analysis
3) Action Plan
A DYNAMIC process – NOT a static event handled as a separate/distinct process
Discussions about assessment plans often start with data collection…
BUT you have to know where you are going before you build a road
Triangulation
- Information from more than one source
- Allows for greater perspective
- More meaningful analysis = better action plan
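As a minimal illustration of triangulation, the sketch below views a single outcome through three independent sources. The indicator name ("clinical readiness") and every number are invented for illustration; a real program would substitute its own instruments and data.

```python
# Minimal sketch of triangulation: one outcome, three independent sources.
# All indicator names and values below are hypothetical.

sources = {
    "end_of_rotation_exam_mean": 0.82,    # direct measure
    "preceptor_evaluation_mean": 0.78,    # direct measure, different rater
    "student_self_assessment_mean": 0.90, # indirect measure
}

# Agreement across sources strengthens a conclusion; divergence (e.g.,
# students rating themselves higher than preceptors do) is itself a
# finding worth investigating before writing the action plan.
mean = sum(sources.values()) / len(sources)
spread = max(sources.values()) - min(sources.values())
print(f"Triangulated mean: {mean:.2f}, spread across sources: {spread:.2f}")
```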
Information for your assessment plan must be organic to YOUR program!
- Mission statement
- Goals
- Competencies
- Outcomes
Program Evaluation
Answers the question: Is everything that happens working to the extent “we” would like it to work?
- Who is “we”?
Framework:
- Relevance
- Measurability
- Improvability
Evaluation Purposes
- Program improvement
- Data sharing with partners
- Reporting and accountability
Definitions
Program-specific items:
- Mission statement
- Goals
- Objectives (programmatic)
Student-specific items:
- Competencies
- Learning outcomes
- Instructional objectives
Essential Elements/Tools
- Timeline assessment map(s)
  - Goals/program items
  - Outcomes/student items
  - Who/what/when/responsible
- “Problem” document (needs assessment/self-study)
  - Dynamic
  - Accessible to faculty/committee/PD
- Strategy to report and disseminate results
Example assessment map (excerpt):
Program Area (including person responsible): Mission, Philosophy, Goals, and Expected Outcomes – Philosophy, Principles, and Values of the Program (Program Director; Program Faculty; Advisory Committee)
Data Source and Methods: written curriculum plan; program philosophy, goals, and mission; University and school mission; student outcomes
Timelines for Data Collection: Spring and Summer faculty retreat; January external advisory committee meeting
Critical Indicators (Date Assessed: Met? Not Met?):
- Philosophy and values of the program are congruent with the University’s mission and philosophy. Indicators: evidence of contemporary health policy related to disparities, equity, and access embedded throughout the curriculum
- Philosophy and values are consistent with professional core values
- Graduate outcomes reflect the philosophy, principles, and values of the program. Indicators: 5-year rates of graduates employed in underserved settings and/or community-based practice
- Philosophy, principles, and values of the program are consistent with the needs and values of the local professional community. Indicators: number of faculty active in local professional organizations, community boards, and DC Board of Health committees
Recommend having:
- Assessment timeline document
- Problem document…
Example of what a timeline for goals assessment might look like (colors indicate collection, analysis, and action)
Example of what a “problem” document might look like
Developing an Assessment Plan (10 Steps)
Step 1: Determine WHAT is to be measured
- Based on mission and goals (think broad)
- What are some examples?
Step 2: Determine HOW to measure the items selected
Step 3: Determine if measures will be direct or indirect
- What is the difference? Examples?
Step 4: Determine data to be collected for each instrument
- This is specific – think objectives/outcomes
- Examples?
Step 5: Determine benchmarks
- Reason for threshold?
Step 6: Determine/describe how the data will be analyzed
- Examples?
Step 7: Determine where data will be kept and disseminated
Step 8: Describe/determine how changes (action) will occur
- Who is responsible, how recorded, assessment
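To make Steps 5 and 6 concrete, the sketch below compares collected outcome data against program-defined benchmarks. Every indicator name, value, and threshold is a hypothetical example; a real plan would draw these from the program's own assessment map.

```python
# Minimal sketch: checking collected outcome data against benchmarks.
# All indicators, values, and thresholds below are hypothetical.

# Benchmarks set in Step 5 (threshold each indicator must meet)
benchmarks = {
    "pance_first_time_pass_rate": 0.90,
    "graduate_satisfaction_mean": 4.0,  # 5-point scale
    "preceptor_rating_mean": 4.0,       # 5-point scale
}

# Data collected from the program's instruments (Step 4)
collected = {
    "pance_first_time_pass_rate": 0.93,
    "graduate_satisfaction_mean": 3.7,
    "preceptor_rating_mean": 4.2,
}

# Step 6: simple analysis -- flag indicators that fall below benchmark
# so they feed the action plan (Step 8)
for indicator, threshold in benchmarks.items():
    value = collected[indicator]
    status = "Met" if value >= threshold else "NOT met -> action plan"
    print(f"{indicator}: {value:.2f} (benchmark {threshold:.2f}) - {status}")
```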
Step 9: Check the plan
- Does the plan assist us in achieving our mission and goals?
Step 10: What are the barriers to implementing the action plan?
- Have stakeholders been considered?
- Culture, politics
- Buy-in
- Impact
Common Mistakes
- Reporting numbers = analysis
- Meeting goals = good outcomes (goals may be flawed)
- Outcomes assessment = program/process evaluation
- Side effects (impacts) = negative outcome
- Failure to triangulate (> 1 data source)
- Failure to evaluate the evaluation measures
- Failure to evaluate the process
- Failure to do formative & summative evaluation
- Confusing statistical significance with practical significance
- Summary without recommendations
- Bias towards favorable (or unfavorable) findings
- Not defining benchmarks (arbitrary)
- Lack of transparency
- Not including stakeholders
- Analysis without a follow-up action plan
Helpful Hints
- Start with what you have.
- Set realistic expectations.
- Sampling is OK for many assessments.
- Stagger for realistic workload.
- Set up tracking mechanisms for longitudinal analysis.
- Write definitions (e.g., remediation vs. retesting).
- Use results to make evaluation relevant.
Helpful Hints (continued)
- Consider external issues.
- Triangulate. 360 evaluation.
- Make decisions that are data-driven and mission-driven.
- Look at trends: the overall pattern of change in an indicator over time…this is the unit of measure!
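As a concrete take on "look at trends," the sketch below fits a simple least-squares slope to several years of a hypothetical indicator; the year-over-year direction, not any single year's value, is the unit of measure the slide refers to. The data points are invented.

```python
# Minimal sketch: trend in an indicator over time (hypothetical data).
# The slope of a least-squares line summarizes the overall pattern of
# change, which matters more than any single year's value.

years = [2019, 2020, 2021, 2022, 2023]
pass_rates = [0.88, 0.91, 0.89, 0.93, 0.95]  # invented first-time pass rates

n = len(years)
mean_x = sum(years) / n
mean_y = sum(pass_rates) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(years, pass_rates))
    / sum((x - mean_x) ** 2 for x in years)
)

direction = "improving" if slope > 0 else "declining or flat"
print(f"Trend: {slope:+.3f} per year ({direction})")
```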
Helpful Hints (continued)
- Assessment does not have to be complex.
- Make assessment “organic” – it is part of the learning process, not just summative.
- Find your help:
  - Institution
  - PAEA
  - Community
Acknowledgements
Portions of this lecture were adapted from information by:
- Robert Philpot
- Ruth Ballweg