The AIR National Survey of Institutional Research Offices


The AIR National Survey of Institutional Research Offices (AIR National Survey of IR Offices)
Leah Ewing Ross, Senior Director for Research and Initiatives, AIR

Evolution of the National Survey: 2015-2016 and 2018-2019

Why Benchmark?

How much work is your office producing in comparison?
How does your staff size compare?
Is something wrong? Are you serving the same stakeholders?
How can we improve?
Is there sufficient data capacity and literacy at your institution?
Are staff adequately trained to produce quality work?

Leveraging Benchmarking Data
Put the right information into the hands of the right people at the right time for the right reason. Leverage that information to initiate and sustain improvement.

Data Collection and Reporting

Contracted with Dynamic Benchmarking
One response per office
Reporting will be activated once 300 responses are complete
To protect confidentiality, reports won't generate if fewer than 5 responses match
Launched October 23; the project will recur every two years
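The confidentiality rule above (no report for fewer than 5 matching responses) is a standard small-cell suppression check. A minimal sketch of how such a check works is below; the function name, threshold constant, and report fields are illustrative, not the actual Dynamic Benchmarking implementation.

```python
# Illustrative sketch of small-cell suppression: a benchmark report is
# generated only when at least MIN_RESPONSES offices match the filter.
MIN_RESPONSES = 5

def benchmark_report(matching_staff_fte):
    """Return summary stats for a peer group, or None if it is too small."""
    n = len(matching_staff_fte)
    if n < MIN_RESPONSES:
        return None  # suppressed: too few offices to protect confidentiality
    return {"n": n, "mean_staff_fte": sum(matching_staff_fte) / n}

print(benchmark_report([1.5, 2.0, 3.0]))             # None (only 3 responses)
print(benchmark_report([1.5, 2.0, 3.0, 4.0, 2.5]))   # report for 5 offices
```

The point of the threshold is that with very small peer groups, an individual office's figures could be inferred from the aggregate.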

Filters

Reports

Longitudinal Benchmarking
My Office had 1.50 staff FTE in 2015, which increased to 2.33 in 2018.

Using Benchmarking Data to Inform Your Office's Work

What staff roles should we have?
Compare Offices >> Staff Details >> Staff Positions

What does my Office do? (Office Structure and Work)
My Office: 45% reporting, 10% decision support
Peer Offices: 23% reporting, 24% decision support

Where do staff spend their time? (Office Structure and Work)
10 hours conducting basic analytics; 11 hours collecting and managing data
21 hours collecting and managing data; 7 hours conducting basic analytics

Participation to Date: All Participants & California Participants

Data Collection
Launched October 23, 2018
All Respondents: 800 have started; of those, 292 are complete (as of Thursday, November 15)
CA Respondents: 67 have started; of those, 17 are complete (as of Thursday, November 15)
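The counts above imply different completion rates for the two groups; a quick sketch of that arithmetic (using only the figures reported on this slide):

```python
# Completion rates from the data-collection counts above
# (as of November 15, 2018).
def completion_rate(complete, started):
    """Fraction of started survey responses that are complete."""
    return complete / started

all_rate = completion_rate(complete=292, started=800)
ca_rate = completion_rate(complete=17, started=67)
print(f"All respondents: {all_rate:.1%}")  # 36.5%
print(f"CA respondents: {ca_rate:.1%}")    # 25.4%
```

So roughly a third of offices that opened the survey had finished it at that point, with California participants lagging slightly behind the national rate.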

Summary

Power of Benchmarking
Evidence that higher levels of performance are possible
From "Is something wrong?" to "How can we improve?"
Challenges long-held beliefs
Informs decision making
Motivates staff

In summary, there is power in benchmarking your office against your peers. Benchmarking provides evidence that higher levels of performance are possible. It changes the never-ending debate of "Is something wrong?" into an action-driven conversation of "How can we improve?" It challenges long-held beliefs about how an office should operate, and it informs your decision making. Ultimately, it motivates staff, who will see the positive change in their work environment.

The Nuts and Bolts
www.airweb.org/NationalSurvey-AO
Learn how to participate
Ensure we have the correct contact info for your institution
Benchmarking options (special pricing for CAIR members)
nsiro@airweb.org

Questions and Discussion
Thank You!
nsiro@airweb.org
lross@airweb.org
