"We’re lost, but we’re making good time." Yogi Berra
Assessing the Institutional Research Office A conversation at the summer 2008 AIRPO conference Questions rather than answers (sorry, Charlie)
"What gets measured gets improved." Anonymous
We are the shoemaker’s children: always helping others assess, but rarely assessing ourselves
What is our part in the Institutional Effectiveness play?
Purpose & Outcomes Purpose of this presentation: think (not do, alas) about assessing our units in the context of assessing non-instructional units Outcomes for the presentation: agreement to take a next step at a cooperative effort, or not
Should we assess ourselves? If we do not, how shall we improve? If we do not, how will we ever find out how it feels to be assessed? If we do, how do we assure we learn something from the assessment? If we do, how do we make it productive and not just activity?
Why assess? What reasons do you give your faculty?
How to assess? The usual suspects: –Hey kids, let’s put on a survey! –Ask your customers –Ask your college as a whole –Qualitative –Quantitative
What to assess? The validity/reliability of the data provided? The speed with which you provided the data? The IR office’s contribution to student learning? The support to the direct contributors to student learning, i.e., faculty & student life types?
What is our product? Who are our customers?
Who are you? Who are you and what do you do? What is your mission? What is inside your circle? To what do you say ‘no’? What are your current goals? How do they fit (or not) within your mission? Do you accomplish your goals? How do you know?
Are you any good? What information do you have that will tell you if you are any good? Do you improve as a result of this information? Do these improvements move you to where you want to go? Where do you want to go, anyway?
What is our role in the education of our students? Do we interact with students directly? Do we impact their learning in any way?
What methods should we use? Quantitative? Unobtrusive? Qualitative?
Rubric Does this have promise? Can we define the elements of our mission? Can we define what –Exceeds? –Meets? –Approaches? –Does not meet?
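The four-level rubric floated above is easy to prototype once the mission elements are named. A minimal sketch in Python; the mission elements and ratings below are hypothetical placeholders, not anything the deck proposes:

```python
# Sketch of the slide's four-level rubric. The level names come from the
# slide; the mission elements and example ratings are invented for illustration.
LEVELS = ["Does not meet", "Approaches", "Meets", "Exceeds"]

def score(ratings):
    """Map each element's level name to 0-3 and average across elements."""
    return sum(LEVELS.index(level) for level in ratings.values()) / len(ratings)

ratings = {
    "Data accuracy": "Meets",        # hypothetical element
    "Timeliness": "Approaches",      # hypothetical element
    "Decision support": "Exceeds",   # hypothetical element
}
print(score(ratings))  # → 2.0 (office averages at "Meets")
```

A numeric average is only one design choice; an office might instead report the lowest-rated element, since one "Does not meet" can matter more than a good mean.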
What information will cause you to take action? What kind of action do you have the power to take?
Modes of assessment (adapted from Harris & Bell, 1986) Formal vs. informal Formal assessment activities or informal judgments/observations or measuring something unobtrusive Formative vs. summative Along the way or at some ending, like the end of the school year
Modes of assessment (cont.) Process or product The way the report got requested/assigned/completed or the report itself Criterion-referenced or norm-referenced A pre-determined standard or a comparison with peers
Modes of assessment (cont.) Individual focused vs. group focused The office itself or the office in the context of the institution Learner-judged or teacher-judged The customer’s opinion or the office staff’s opinion
Modes of assessment (cont.) Internal vs. external Inside the office or outside the office/institution Maybe it’s not either/or but and/or?
Random Thoughts As if this wasn’t already random… Many of our customers aren’t surveyable Of our customers who are, what makes them happy? Survey fatigue Our sense of where we are…
More random thoughts… IR varies wildly in our schools – does that matter? Does it change fundamentally the need for assessment or the kind of assessment we do? What responsibility do we have for the data? Is our relationship with IT part of what needs to be assessed? What responsibility do we have for how our customers use our data?
More random thoughts… At what scale do we conduct assessment? Nomenclature: assessment, evaluation, institutional effectiveness, program review When do we do a self study?
Mike Middaugh’s metrics Average total compensation at/above median of peers within 5 years Total financial aid increase by 100% within 5 years Student satisfaction shows significant gains within 5 years Commit to set aside X% of resources to a goal
The IR analogies?
Metrics? # of external surveys submitted on time with accurate data # of internal data requests completed on time successfully # of times analyses contributed to institutional action (that contributed to an increase in retention and graduation, for example) # of times had to recall data due to error
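Counts like these fall out of a request log almost for free. A minimal sketch, assuming a hypothetical log of request records; the field names (kind, on_time, accurate, recalled) are invented for illustration:

```python
# Hypothetical IR request log; every field name here is illustrative only.
requests = [
    {"kind": "external", "on_time": True,  "accurate": True,  "recalled": False},
    {"kind": "internal", "on_time": True,  "accurate": True,  "recalled": False},
    {"kind": "internal", "on_time": False, "accurate": True,  "recalled": True},
]

# External surveys submitted on time with accurate data.
external_on_time = sum(r["kind"] == "external" and r["on_time"] and r["accurate"]
                       for r in requests)
# Internal data requests completed on time.
internal_on_time = sum(r["kind"] == "internal" and r["on_time"] for r in requests)
# Data recalls due to error.
recalls = sum(r["recalled"] for r in requests)

print(external_on_time, internal_on_time, recalls)  # → 1 1 1
```

The harder metric on the slide, analyses that contributed to institutional action, resists logging this way; it usually takes a human judgment recorded after the fact.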
Seeds of a modest proposal (thanks to Jennifer Gray and John Porter) Create a template We can use or create variants Responsible to AIRPO? Role of System Administration? Recruit volunteer evaluators Middle States model We all get to see each other in action Benefits accrue to both evaluated and evaluators
Logistics How much time should it take? How will evaluators be compensated? Do schools have to agree to participate and support? What force/strength does the evaluation have? What is the role of System Administration?
References AIR Professional File #80: http://www.airweb.org/page.asp?page=73&apppage=85&id=83
"If you don't know where you are going, you will wind up somewhere else." Yogi Berra
Notes from session as of 06/23/08
Possible steps toward an assessment of the IR office:
–Customer satisfaction feedback: survey after each project; survey at some ending point, like the end of a school year; focus group; personal interviews. Purpose both to get feedback and to educate customers as to good data, etc.
–Unobtrusive data: analyze your request database – who asks for what? Should what gets asked for be moved into some kind of routine report?
–Other possibilities?
Scale is an important concept. Maybe IR does not have discrete assessment activities, but instead folds into its unit. Does the unit to which you report conduct its own assessment? If so, could the IR office fit in? If not, could the IR office lead the way?
What is in our circle? To what do we say ‘no’? Where do requests from students fit in? All requests should be in writing. To whom do the data belong?
Re IR peer review: other professional organizations sponsor this – e.g., AACRAO. Could AIRPO/NEAIR lead the way for IR? Look for questions on the conference evaluation survey.
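The "analyze your request database" idea above is the most automatable of these steps. A minimal sketch of the who-asks-for-what tally; the requesters and topics below are invented examples, not data from any office:

```python
from collections import Counter

# Hypothetical request log as (requester, topic) pairs; all names invented.
log = [
    ("Provost", "retention"),
    ("Admissions", "yield"),
    ("Provost", "retention"),
    ("Faculty Senate", "grade distribution"),
]

by_topic = Counter(topic for _, topic in log)
by_requester = Counter(requester for requester, _ in log)

# Topics requested more than once are candidates for a routine report.
routine_candidates = [topic for topic, n in by_topic.items() if n > 1]
print(routine_candidates)  # → ['retention']
```

Even this crude frequency count answers two of the session’s questions at once: who the heaviest customers are, and which ad hoc requests deserve promotion to a standing report.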