"We’re lost, but we’re making good time." Yogi Berra

Assessing the Institutional Research Office: a conversation at the summer 2008 AIRPO conference. Questions rather than answers (sorry, Charlie).

"What gets measured gets improved." Anonymous

We are the shoemaker’s children: always helping others assess, but rarely assessing ourselves.

What is our part in the Institutional Effectiveness play?

Purpose & Outcomes Purpose of this presentation: to think (not do, alas) about assessing our units in the context of assessing non-instructional units. Outcomes for the presentation: agreement to take a next step toward a cooperative effort, or not.

Should we assess ourselves? If we do not, how shall we improve? If we do not, how will we ever find out how it feels to be assessed? If we do, how do we ensure we learn something from the assessment? If we do, how do we make it productive and not just activity?

Why assess? What reasons do you give your faculty?

How to assess? The usual suspects: Hey kids, let’s put on a survey! Ask your customers. Ask your college as a whole. Qualitative methods. Quantitative methods.

What to assess? The validity/reliability of the data provided? The speed with which you provided the data? The IR office’s contribution to student learning? The support provided to the direct contributors to student learning, i.e., faculty and student life types?

What is our product? Who are our customers?

Who are you? Who are you and what do you do? What is your mission? What is inside your circle? To what do you say ‘no’? What are your current goals? How do they fit (or not) within your mission? Do you accomplish your goals? How do you know?

Are you any good? What information do you have that will tell you if you are any good? Do you improve as a result of this information? Do these improvements move you to where you want to go? Where do you want to go, anyway?

What is our role in the education of our students? Do we interact with students directly? Do we impact their learning in any way?

What methods should we use? Quantitative? Unobtrusive? Qualitative?

Rubric Does this have promise? Can we define the elements of our mission? Can we define what exceeds, meets, approaches, or does not meet expectations?
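One way to make the rubric idea concrete is to sketch it as a simple data structure and score against it. This is a minimal sketch in Python; the mission elements, level descriptors, and scoring scheme below are hypothetical placeholders, not anything proposed in the presentation.

    # Minimal rubric sketch; the elements and descriptors are hypothetical.
    LEVELS = ["Does not meet", "Approaches", "Meets", "Exceeds"]

    RUBRIC = {
        "Data accuracy": {
            "Does not meet": "Errors in most deliverables",
            "Approaches": "Occasional errors requiring recalls",
            "Meets": "Deliverables accurate; rare, minor corrections",
            "Exceeds": "Accurate, with proactive data-quality checks",
        },
        "Timeliness": {
            "Does not meet": "Most requests miss agreed deadlines",
            "Approaches": "Some requests miss deadlines",
            "Meets": "Requests completed by agreed deadlines",
            "Exceeds": "Completed early, with time for follow-up questions",
        },
    }

    def score(ratings):
        """Average the per-element ratings on a 0 (does not meet) to 3 (exceeds) scale."""
        return sum(LEVELS.index(level) for level in ratings.values()) / len(ratings)

    print(score({"Data accuracy": "Meets", "Timeliness": "Exceeds"}))  # 2.5

The point is only that each mission element needs level descriptors concrete enough that two readers would score it the same way.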

What information will cause you to take action? What kind of action do you have the power to take?

Modes of assessment (adapted from Harris & Bell, 1986) Formal vs. informal: formal assessment activities, informal judgments/observations, or measuring something unobtrusively. Formative vs. summative: along the way, or at some ending, like the end of the school year.

Modes of assessment (cont.) Process or product: the way the report got requested/assigned/completed, or the report itself. Criterion-referenced or norm-referenced: a pre-determined standard or a comparison with peers.

Modes of assessment (cont.) Individual-focused vs. group-focused: the office itself, or the office in the context of the institution. Learner-judged or teacher-judged: the customer’s opinion or the office staff’s opinion.

Modes of assessment (cont.) Internal vs. external: inside the office, or outside the office/institution. Maybe it’s not either/or but and/or?

Random Thoughts As if this weren’t already random… Many of our customers aren’t surveyable. Of our customers who are, what makes them happy? Survey fatigue. Our sense of where we are…

More random thoughts… IR varies wildly across our schools – does that matter? Does it fundamentally change the need for assessment or the kind of assessment we do? What responsibility do we have for the data? Is our relationship with IT part of what needs to be assessed? What responsibility do we have for how our customers use our data?

More random thoughts… At what scale do we conduct assessment? Nomenclature: assessment, evaluation, institutional effectiveness, program review. When do we do a self-study?

Mike Middaugh’s metrics Average total compensation at or above the median of peers within 5 years. Total financial aid increases by 100% within 5 years. Student satisfaction shows significant gains within 5 years. Commit to setting aside X% of resources toward a goal.

The IR analogies?

Metrics? Number of external surveys submitted on time with accurate data. Number of internal data requests completed on time and successfully. Number of times analyses contributed to institutional action (action that contributed to an increase in retention and graduation, for example). Number of times data had to be recalled due to error.
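If the office keeps a request-tracking log, counts like these can be pulled from it mechanically. The sketch below is a minimal Python illustration assuming a hypothetical CSV (ir_request_log.csv) with columns request_type, due_date, completed_date, led_to_action, and data_recalled; none of those names come from the presentation.

    # Minimal sketch; file name and column names are assumptions.
    import pandas as pd

    log = pd.read_csv("ir_request_log.csv", parse_dates=["due_date", "completed_date"])
    on_time = log["completed_date"] <= log["due_date"]

    metrics = {
        "external surveys submitted on time":
            int((on_time & (log["request_type"] == "external_survey")).sum()),
        "internal requests completed on time":
            int((on_time & (log["request_type"] == "internal_request")).sum()),
        "analyses that contributed to institutional action":
            int(log["led_to_action"].sum()),
        "data recalls due to error":
            int(log["data_recalled"].sum()),
    }

    for name, count in metrics.items():
        print(f"{name}: {count}")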

Seeds of a modest proposal (thanks to Jennifer Gray and John Porter) Create a template we can use or create variants of. Responsible to AIRPO? Role of System Administration? Recruit volunteer evaluators on the Middle States model: we all get to see each other in action, and benefits accrue to both the evaluated and the evaluators.

Logistics How much time should it take? How will evaluators be compensated? Do schools have to agree to participate and provide support? What force/strength does the evaluation have? What is the role of System Administration?

References AIR Professional File #80.

"If you don't know where you are going, you will wind up somewhere else." Yogi Berra

Notes from session as of 06/23/08

Possible steps toward an assessment of the IR office:
Customer satisfaction feedback: survey after each project; survey at some ending point, like the end of a school year; focus group; personal interviews. The purpose is both to get feedback and to educate customers as to good data, etc.
Unobtrusive data: analyze your request database – who asks for what? Should what gets asked for be moved into some kind of routine report? (See the sketch after these notes.)
Other possibilities?

Scale is an important concept. Maybe IR does not have discrete assessment activities, but instead folds its assessment into that of its unit. Does the unit to which you report conduct its own assessment? If so, could the IR office fit in? If not, could the IR office lead the way?

What is in our circle? To what do we say ‘no’? Where do requests from students fit in? All requests should be in writing. To whom do the data belong?

Re IR peer review: other professional organizations sponsor this, e.g., AACRAO. Could AIRPO/NEAIR lead the way for IR? Look for questions on the conference evaluation survey.
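As a small illustration of the "analyze your request database" note above, the sketch below counts requests by requester and topic and flags topics frequent enough to consider turning into a routine report. The file name, column names, and the threshold of five requests are all assumptions for illustration, not part of the session notes.

    # Minimal sketch; file name, columns, and threshold are assumptions.
    import pandas as pd

    requests = pd.read_csv("ir_requests.csv")  # assumed columns: requester_unit, topic, request_date

    # Who asks for what?
    by_unit_topic = (
        requests.groupby(["requester_unit", "topic"])
        .size()
        .sort_values(ascending=False)
    )
    print(by_unit_topic.head(10))

    # Topics requested often enough to consider moving into a routine report.
    topic_counts = requests["topic"].value_counts()
    print(topic_counts[topic_counts >= 5])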