Program Assessment 2019
POD Institute for New Faculty Developers
Todd Zakrajsek, UNC Chapel Hill
Ben Peterson, UNC Greensboro
Abstract
If you are going to deliver programs, it is imperative to know whether those programs bring about your intended outcomes. Developing an effective program assessment plan will help with allocating resources, asking for additional funding, justifying the importance of the educational development efforts/office, and providing extremely valuable data for accreditation reports. Good outcome data is worth its weight in gold. In this session we will look at ways to assess program effectiveness and how the resulting data can best be put to use on your campus.
Overview
The Familiarity of Assessment
Program Assessment and Strategic Planning
The Kirkpatrick Model for Assessing Impact
Potential Pitfalls
Working with Assessment
Additional Resources
The Familiarity of Assessment
Activity: The Familiarity of Assessment
You’ve been chosen to serve on the Provost’s Task Force for Improving Faculty Promotion and Tenure, the PTFIFPT. You may want your first task to be changing the name of the task force, but the chair insists that you get straight to business. Here are the first two questions you are asked to consider:
Activity: The Familiarity of Assessment
What standards do we/could we use to assess faculty work?
What data do we/could we use to measure those standards?
Program Assessment and Strategic Planning
Center Size and Assessment Scope
From large centers to “centers of one”
From self-assessment to evidence of best practices
Message: Collect what you can, and stop.
Strategic Planning via Center Missions and Values
Know or find out what is important to YOUR institution
Move beyond the “sum of parts” approach
What is the story of your center’s part in what is important?
“Using your Strategic Plan to Guide Everyday Work and Drive CTL Evaluation,” Angela Linse, tomorrow at 10:45 in Alexander
Message: Support YOUR campus; find out what is important to your institution.
Strategic Planning Directed at Your Constituency
For whom will the programming matter?
For whom will the assessment matter?
Who might be “hidden” constituencies?
How can you use your data to tell a story?
How can your assessment help to promote center awareness?
Message: Who can help you, and whom can you help with your data?
SMART Goals Support Assessment
As with helping faculty, the emphasis is on backward design for assessment:
Clearly stated outcomes
Relevant data methods: quantitative, qualitative, or mixed
Possibility of direct and indirect measures
Message: Be mindful when setting the goals on which you will collect data. You can’t fix with data what you bungle by design.
Activity: Assessing Effectiveness
Grid on what your center is doing (handout), with columns “Ways to Assess” and “Things to Assess” and numbered rows
Pair and share: feedback on one cell
Share an interesting example from a partner’s grid
The Kirkpatrick Model for Assessing Impact
The Common Metrics
Reaction-level assessment: tracking attendance, satisfaction surveys
Message: Kirkpatrick slides
Improving the Common Metrics
Attendance
Intentional demographic data (department, rank, number of courses taught)
Context for counts and types of programming
Satisfaction
Context for reporting data
Consistency across all types of service (events, consultations, resources, etc.)
Timelines and longitudinal data
Beyond the Common Metrics
Beyond the Common Metrics: Programming
Standards of quality: alignment, research-based, recursive
Cohorts and longitudinal studies
Institutional teaching culture
Examples of extended, mixed-methods, longitudinal studies
Beyond the Common Metrics: Learning and Impact
Formative assessments
Artifacts of faculty development
Classroom observations
Beyond the Common Metrics: Results and Teaching Culture
Center reach and social network analysis
Timelines and longitudinal changes in use
Qualitative data on impacts on performance
Beyond the Common Metrics: Student Learning
Beyond faculty perceptions of student learning
The student course evaluation bugaboo
Direct analysis by outside specialists
Comparative artifacts and metrics
Standardized questionnaires
Reminder: Center Size and Scope
Resource implications and the need for an institutional framework
Not every center can move beyond these metrics, and no one can measure everything
But have a clear sense of what you can do and how to make the most of it
Your assessment strategy needs to be something you can do reliably
Message (restated): Collect what you can and stop.
Potential Pitfalls
Potential Pitfalls
Collection without a clear goal
Collecting too much data: abandoned and unloved
Survey fatigue
People don’t read your reports
Lack of clarity about what data to collect for different stakeholders
Difficulty in measuring what we do (perhaps an ROI example)
Working with Assessment
Assessment for Your Contexts
Levels of Assessment (planning grid, rows 1–3):
Center Programs/Events | Reaction | Learning | Behavior | Results
Assessment for Your Contexts
Levels of Assessment (worked example):
Center Programs/Events | Reaction | Learning | Behavior | Results
1. Workshop on Active Learning: paper survey, follow-up
2. Newsletter: semester survey via Google Forms
3. New Faculty Orientation
Additional Resources
Miller’s Pyramid
Brinkerhoff’s Success Case Method
Considering “outliers” at both ends for insights into program development:
What was the most successful case that resulted from your program?
What was the least successful case that resulted from your program?
Helps address two questions: “When the program works, how well does it work?” and “What is working, and what is not?”
ACE/POD Matrix
Rubric for different levels of center development: Beginning/Developing, Proficient/Functioning, Accomplished/Exemplary
Tool for goal-setting and assessment