Program Directors 101 Pando™ Workshop: Program Evaluation
Meredith Davison, Mike Roscoe
Objectives
- Differentiate between student outcomes assessment and program evaluation
- Describe the key components of an effective assessment or evaluation plan
- Identify instruments for efficient data collection
- Develop and implement an effective program evaluation plan
- Define the components of a needs assessment
- Develop and implement an effective needs assessment
- Define the elements of an ongoing self-study process
- Define and implement an effective ongoing self-study process
- Discuss data-driven program and curriculum changes
Program Evaluation
Program evaluation is CRITICAL to the success of PA programs.
An effective plan starts with a well-developed strategy for:
- Data collection
- Analysis
- Action
Evaluation (assessment) is one of the more difficult concepts program directors and faculty face.
Three Components
There are three major components of any assessment plan:
1) Data Collection
2) Data Analysis
3) Action Plan
This is a DYNAMIC process, NOT a static event that stands apart as a separate/distinct process.
Discussions about assessment plans often start with data collection…
BUT you have to know where you are going before you build the road.
Triangulation
Triangulation means gathering information from more than one source.
It allows for greater perspective and more meaningful analysis, which yields a better action plan.
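Triangulation can be operationalized as a per-indicator comparison across several independent sources before drawing any conclusion. A minimal Python sketch of the idea; the source names, values, and benchmark below are hypothetical, invented only for illustration:

```python
# Minimal triangulation sketch: view one indicator through several
# independent data sources before acting on it.
# All source names and numbers are hypothetical.

sources = {
    "pance_first_time_pass": 0.93,       # direct measure: exam outcome
    "graduate_survey_prepared": 0.78,    # indirect: self-reported readiness
    "preceptor_eval_entry_ready": 0.85,  # indirect: external observer rating
}

def triangulate(indicator: str, readings: dict[str, float], benchmark: float) -> str:
    """Summarize agreement across sources instead of trusting any single one."""
    met = {name: value >= benchmark for name, value in readings.items()}
    if all(met.values()):
        return f"{indicator}: all {len(readings)} sources at/above {benchmark:.0%}"
    if not any(met.values()):
        return f"{indicator}: no source reaches {benchmark:.0%} -- action plan needed"
    low = [name for name, ok in met.items() if not ok]
    return f"{indicator}: sources disagree; investigate {', '.join(low)}"

print(triangulate("Graduates are practice-ready", sources, benchmark=0.80))
```

Here the exam result alone would look fine; the disagreement with the survey source is what triggers a closer look, which is the point of triangulating.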
Information for your assessment plan must be organic to YOUR program!
- Mission statement
- Goals
- Competencies
- Outcomes
Program Evaluation
Answers the question: Is everything that happens working to the extent "we" would like it to work?
Who is "we"?
Framework:
- Relevance
- Measurability
- Improvability
Evaluation Purposes
- Program improvement
- Data sharing with partners
- Reporting and accountability
Definitions
Program-specific items:
- Mission statement
- Goals
- Objectives (programmatic)
Student-specific items:
- Competencies
- Learning outcomes
- Instructional objectives
Essential Elements/Tools
- Timeline assessment map(s)
  - Goals/program items
  - Outcomes/student items
  - Who/what/when/responsible
- "Problem" document (needs assessment/self-study)
- Dynamic
- Accessible to faculty/committee/PD
- Strategy to report and disseminate results
Example Assessment Map
Columns: Program Area (including person responsible) | Data Source and Methods | Timelines for Data Collection | Critical Indicators | Date Assessed: Met? Not Met?

Program area: Mission, Philosophy, Goals, and Expected Outcomes (Philosophy, Principles, and Values of the Program)
Persons responsible: Program Director; Program Faculty; Advisory Committee
Data sources and methods: Written curriculum plan; program philosophy, goals, and mission; University and school mission; student outcomes
Timelines for data collection: Spring and Summer faculty retreat; January external advisory committee meeting
Critical indicators:
- Philosophy and values of the program are congruent with the University's mission and philosophy. Indicators: evidence of contemporary health policy related to disparities, equity, and access embedded throughout the curriculum.
- Philosophy and values are consistent with professional core values.
- Graduate outcomes reflect the philosophy, principles, and values of the program. Indicators: 5-year rates of graduates employed in underserved settings and/or community-based practice.
- Philosophy, principles, and values of the program are consistent with needs and values of the local professional community. Indicators: # of faculty active in local professional organizations, community boards, and DC Board of Health committees.

Recommend having:
- Assessment timeline document
- "Problem" document
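One way to keep a map like this dynamic and accessible to faculty, committees, and the PD is to store each row as a structured record and generate the who/what/when view from it. A sketch under that assumption; the field names and example entries are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MapRow:
    """One assessment-map row: who collects what, from where, and when."""
    program_area: str
    responsible: str
    data_source: str
    timeline: str            # when the data are collected/reviewed
    critical_indicator: str
    met: bool | None = None  # filled in on the date assessed

rows = [
    MapRow("Philosophy, principles, and values", "Program Director",
           "Written curriculum plan", "Spring/Summer faculty retreat",
           "Health policy on disparities, equity, and access embedded in curriculum"),
    MapRow("Graduate outcomes", "Program Faculty",
           "Student outcomes data", "January advisory committee meeting",
           "5-year rate of graduates in underserved/community-based practice"),
]

# Who/what/when view, e.g., for a faculty retreat agenda.
for r in rows:
    status = "pending" if r.met is None else ("MET" if r.met else "NOT MET")
    print(f"{r.timeline:<40} {r.responsible:<18} {r.critical_indicator} [{status}]")
```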
Example of what a timeline for goals assessment might look like (colors mark collection, analysis, and action).
Example of what a "problem" document might look like.
Developing an Assessment Plan (10 Steps)
Step 1: Determine WHAT is to be measured
- Based on mission and goals (think broad)
- What are some examples?
Step 2: Determine HOW to measure the items selected
Step 3: Determine if measures will be direct or indirect
- What is the difference? Examples?
Step 4: Determine data to be collected for each instrument
- This is specific: think objectives/outcomes
- Examples?
Step 5: Determine benchmarks
- Reason for threshold? (A sketch follows this step.)
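Benchmarks are easiest to defend when each threshold is written down with its rationale and checked mechanically. A minimal sketch of Steps 4 and 5 together; the instruments, thresholds, rationales, and observed values are hypothetical:

```python
# Each benchmark pairs a threshold with the reason it was chosen (Step 5),
# so "why this number?" is answered inside the plan itself.
# All values below are invented for illustration.
benchmarks = {
    "PANCE first-time pass rate":  (0.90, "national 5-year average"),
    "Summative OSCE pass rate":    (0.85, "program's 3-year baseline"),
    "Graduate survey return rate": (0.60, "minimum for a representative sample"),
}

observed = {
    "PANCE first-time pass rate":  0.93,
    "Summative OSCE pass rate":    0.81,
    "Graduate survey return rate": 0.64,
}

for item, (threshold, rationale) in benchmarks.items():
    value = observed[item]
    flag = "met" if value >= threshold else "NOT met -> action plan"
    print(f"{item}: {value:.0%} vs {threshold:.0%} ({rationale}) -- {flag}")
```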
Step 6: Determine/describe how the data will be analyzed
- Examples?
Step 7: Determine where data will be kept and disseminated
Step 8: Describe/determine how changes (action) will occur
- Who is responsible, how it is recorded, and how it will be assessed (a minimal sketch follows)
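Steps 7 and 8 amount to making actions traceable: where the finding lives, who owns the change, and when follow-up happens. A minimal sketch of one "problem document" entry; the fields and the example entry are hypothetical:

```python
import csv, io

# One "problem document" entry: a finding, the action taken, the owner,
# and when the action itself will be re-assessed (closing the loop).
# The entry below is invented for illustration.
entry = {
    "finding": "OSCE pass rate below 85% benchmark",
    "data_source": "Summative OSCE, Class of 2016",
    "action": "Add two formative OSCE stations in semester 2",
    "responsible": "Director of Clinical Education",
    "reassess_on": "2017-06-01",
    "status": "open",
}

# Kept somewhere faculty and committees can reach (Step 7),
# e.g., a shared CSV on the program drive.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=entry.keys())
writer.writeheader()
writer.writerow(entry)
print(buffer.getvalue())
```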
Step 9: Check the plan
- Does the plan assist us in achieving our mission and goals?
Step 10: What are the barriers to implementing the action plan?
- Have stakeholders been considered?
- Culture, politics
- Buy-in
- Impact
Common Mistakes
- Treating reporting numbers as analysis
- Treating meeting goals as good outcomes (goals may be flawed)
- Treating outcomes assessment as program/process evaluation
- Treating side effects (impacts) as a negative outcome
- Failure to triangulate (> 1 data source)
- Failure to evaluate the evaluation measures
Common Mistakes (continued)
- Failure to evaluate the process
- Failure to do formative & summative evaluation
- Confusing statistical significance with practical significance (a worked example follows)
- Summary without recommendations
- Bias toward favorable (or unfavorable) findings
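The statistical-vs-practical point can be made with a worked example: with large samples, a trivial difference comes out "significant." A sketch using a hand-computed Welch t-test with a normal approximation for the p-value; the cohort scores are invented:

```python
import math

# Hypothetical PANCE-style scores for two large cohorts: the 2-point mean
# difference is statistically significant but practically meaningless.
n1, mean1, sd1 = 2000, 500.0, 20.0   # cohort A
n2, mean2, sd2 = 2000, 498.0, 20.0   # cohort B

# Welch t statistic; at these sample sizes a normal approximation
# to the two-sided p-value is reasonable.
se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
t = (mean1 - mean2) / se
p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

# Effect size tells the practical story the p-value hides.
cohens_d = (mean1 - mean2) / math.sqrt((sd1**2 + sd2**2) / 2)

print(f"t = {t:.2f}, p = {p:.4f}")    # "significant" at p < 0.05
print(f"Cohen's d = {cohens_d:.2f}")  # ~0.1: a trivial effect in practice
```

Reporting p = 0.002 without the effect size would invite an action plan for a 2-point difference no one would notice.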
Common Mistakes (continued)
- Not defining benchmarks (arbitrary)
- Lack of transparency
- Not including stakeholders
- Analysis without a follow-up action plan
Helpful Hints
- Start with what you have.
- Set realistic expectations.
- Sampling is OK for many assessments.
- Stagger for a realistic workload.
- Set up tracking mechanisms for longitudinal analysis (see the sketch after this list).
- Write definitions (e.g., remediation vs. retesting).
- Use results to make evaluation relevant.
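A tracking mechanism for longitudinal analysis can start as simply as one record per cohort per indicator, appended every cycle. A sketch under that assumption; the file name and records are hypothetical:

```python
import csv
from pathlib import Path

# Append-only log: one row per cohort per indicator, written each cycle.
# Over the years this one file supports every trend analysis.
LOG = Path("outcomes_log.csv")  # hypothetical location on a shared drive

def record(cohort_year: int, indicator: str, value: float) -> None:
    """Append one observation, writing a header if the file is new."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["cohort_year", "indicator", "value"])
        writer.writerow([cohort_year, indicator, value])

record(2016, "pance_first_time_pass", 0.93)
record(2016, "attrition_rate", 0.04)
```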
Helpful Hints (continued)
- Make decisions that are data-driven and mission-driven.
- Consider external issues.
- Triangulate.
- Use 360-degree evaluation.
- Look at trends: the overall pattern of change in an indicator over time is the unit of measure! (A sketch follows this list.)
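"The pattern of change over time is the unit of measure" can be made concrete with a least-squares slope across cohort years, rather than reacting to any single year. A sketch with invented pass-rate data:

```python
# Trend, not any single year's value, is the unit of measure:
# fit a least-squares slope across cohort years.
# The data below are hypothetical.
years = [2012, 2013, 2014, 2015, 2016]
pass_rate = [0.96, 0.94, 0.93, 0.91, 0.90]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(pass_rate) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(years, pass_rate))
    / sum((x - mean_x) ** 2 for x in years)
)

# A single 0.90 year might still clear a benchmark; the steady
# ~1.5-point-per-year decline is the finding that drives the action plan.
print(f"Trend: {slope:+.3f} per year ({slope * 100:+.1f} points/year)")
```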
Helpful Hints (continued)
- Assessment does not have to be complex.
- Make assessment "organic": it is part of the learning process, not just summative.
- Find your help: your institution, PAEA, the PA education community.
Acknowledgements
Portions of this lecture were adapted from information by Robert Philpot and Ruth Ballweg.
PAEA Copyright 2016