Program Assessment 2019 POD Institute for New Faculty Developers


Program Assessment 2019 POD Institute for New Faculty Developers Todd Zakrajsek, UNC Chapel Hill Ben Peterson, UNC Greensboro

Abstract If you are going to deliver programs, it is imperative that you know whether those programs bring about your intended outcomes. Developing an effective program assessment plan will help with allocating resources, asking for additional funding, justifying the importance of the educational development efforts/office, and providing extremely valuable data for accreditation reports. Good outcome data is worth its weight in gold. In this session we will look at ways to assess program effectiveness and how the data might best be put to effective use on your campus.

Overview
The Familiarity of Assessment
Program Assessment and Strategic Planning
The Kirkpatrick Model for Assessing Impact
Potential Pitfalls
Working with Assessment
Additional Resources

The Familiarity of Assessment

Activity: The Familiarity of Assessment You’ve been chosen to be on the Provost’s Task Force for Improving Faculty Promotion and Tenure, the PTFIFPT. You may want your first task to be to change the name of the task force, but the chair insists that you get straight to business. Here are the first two questions that you are asked to consider:

Activity: The Familiarity of Assessment What standards do we/could we use to assess faculty work? What data do we/could we use to measure those standards?

Program Assessment and Strategic Planning

Center Size and Assessment Scope Large centers to “centers of one” Self-assessment to evidence of best practices Message: Collect what you can and stop….

Strategic Planning via Center Missions and Values Know or find out what is important to YOUR institution Move beyond the “sum of parts” approach What is the story of your center’s part in what is important? Using your Strategic Plan to Guide Everyday Work and Drive CTL Evaluation, Angela Linse Tomorrow at 10:45 in Alexander Message: Support YOUR campus - find out what is important to your institution.

Strategic Planning Directed at Your Constituency For whom will the programming matter? For whom will the assessment matter? Who might be “hidden” constituencies? How can you use your data to tell a story? How can your assessment help to promote center awareness? Message: Who can help you and who can you help with your data….

SMART Goals Support Assessment As with helping faculty, emphasis on backward design for assessment: Clearly stated outcomes Relevant data methods: quantitative, qualitative, or mixed methods Possibility of direct and indirect measures Message: Be mindful when setting goals on which you will collect data. You can’t fix with data what you bungle by design.

Activity: Assessing Effectiveness
Handout: a grid (3 rows, cells 1-4) of "Ways to Assess" and "Things to Assess" for what your center is doing.
Pair and share: feedback on one cell.
Share an interesting example from a partner's grid.

The Kirkpatrick Model for Assessing Impact

The Common Metrics Reaction-level assessment: tracking attendance, satisfaction surveys. Message: Slides 15-23, Kirkpatrick.

Improving the Common Metrics
Attendance: intentional demographic data (dept, rank, # courses taught); context for count and types of programming.
Satisfaction: context for reporting data; consistency across all types of service (events, consultations, resources, etc.); timelines and longitudinal data.
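As a hypothetical sketch of the "intentional demographic data" idea, an attendance roll can be tallied by department and rank rather than reported as a raw headcount. The records, field names, and counts below are all invented for illustration:

```python
# Hypothetical sketch: adding context to attendance counts.
# All sign-in records and field names here are invented for illustration.
from collections import Counter

signins = [
    {"name": "A", "dept": "Biology", "rank": "Assistant", "event": "Active Learning"},
    {"name": "B", "dept": "Biology", "rank": "Associate", "event": "Active Learning"},
    {"name": "C", "dept": "History", "rank": "Assistant", "event": "Rubric Design"},
]

# Tally attendance by department and by rank, giving context beyond a raw headcount.
by_dept = Counter(r["dept"] for r in signins)
by_rank = Counter(r["rank"] for r in signins)

print(by_dept)  # Counter({'Biology': 2, 'History': 1})
print(by_rank)  # Counter({'Assistant': 2, 'Associate': 1})
```

The same tallies, collected consistently each semester, are what make the slide's "timelines and longitudinal data" possible later.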

Beyond the Common Metrics

Beyond the Common Metrics: Programming Standards of Quality: alignment, research-based, recursive Cohorts and longitudinal studies Institutional teaching culture Examples of extended, mixed methods, longitudinal studies

Beyond the Common Metrics: Learning and Impact Formative assessments Artifacts of faculty development Classroom Observations

Beyond the Common Metrics: Results and Teaching Culture Center reach and social data network analysis Timelines and longitudinal changes in use Qualitative data of impacts on performance

Beyond the Common Metrics: Student Learning Beyond faculty perceptions of student learning Student course evaluation bugaboo Outside specialist direct analysis Comparative artifacts and metrics Standardized questionnaires

Reminder: Center Size and Scope Resource implications and the need for an institutional framework. Not every center can move beyond these metrics, and no one can measure everything. But have a clear sense of what you can do and how to make the most of that. Your assessment strategy needs to be something you can reliably do. Message: restate "collect what you can and stop."

Potential Pitfalls

Potential Pitfalls
Collection without a clear goal
Collecting too much data - abandoned and unloved
Survey fatigue
People don't read your reports
Lack of clarity about what data to collect for different stakeholders
Difficulty in measuring what we do… (Maybe ROI example…)

Working with Assessment

Assessment for your Contexts
Levels of Assessment grid: rows 1-3 for your center's programs/events; columns for Reaction, Learning, Behavior, and Results.

Assessment for your Contexts
Levels of Assessment grid (worked example):
1. Workshop on Active Learning: paper survey; follow-up email x 2
2. Newsletter: semester survey via Google Forms
3. New Faculty Orientation
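The grid above can also be kept as a simple data structure so that gaps are easy to spot. This is only a sketch; the placement of instruments across Kirkpatrick levels is illustrative, not prescribed by the slides:

```python
# Minimal sketch of the assessment grid: each center program mapped to the
# Kirkpatrick levels and the instruments used at each. Instrument placement
# across levels is illustrative only.
grid = {
    "Workshop on Active Learning": {
        "Reaction": ["Paper survey"],
        "Learning": ["Follow-up email x 2"],
        "Behavior": [],
        "Results": [],
    },
    "Newsletter": {
        "Reaction": ["Semester survey via Google Forms"],
        "Learning": [], "Behavior": [], "Results": [],
    },
    "New Faculty Orientation": {
        "Reaction": [], "Learning": [], "Behavior": [], "Results": [],
    },
}

# Flag programs with no instrument at any level: candidates for planning work.
unassessed = [p for p, levels in grid.items() if not any(levels.values())]
print(unassessed)  # ['New Faculty Orientation']
```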

Additional Resources

Miller’s Pyramid

Brinkerhoff’s Success Case Method Considering “outliers” at both ends for insights into program development What was the most successful case that resulted from your program? What was the least successful case that resulted from your program? Helps address two questions: “When the program works, how well does it work?” “What is working, and what is not?”

ACE/POD Matrix Rubric for different levels of center development Beginning/Developing Proficient/Functioning Accomplished/Exemplary Tool for goal-setting and assessment