Interpreting, Editing, and Communicating MISO Survey Results


Fred Folmer, Research & Instruction Librarian / Special Projects Coordinator, Connecticut College
This presentation leaves copyright of the content to the presenter. Unless otherwise noted in the materials, uploaded content carries the Creative Commons Attribution-NonCommercial-ShareAlike license, which grants usage to the general public under the stipulated criteria.

What is MISO?
MISO stands for Measuring Information Service Outcomes.
A quantitative, Web-based survey that helps library and technology organizations in higher education evaluate their services.
A nonprofit survey provider, based at Bryn Mawr College.
In 2014, approximately 40 institutions participated.

What does MISO measure?
Satisfaction with, and importance of, numerous services provided by libraries and information technology organizations.
Whether service providers are responsive, reliable, knowledgeable and friendly.
How well informed respondents feel about various topics.
What skills respondents would be interested in learning.
For students: what devices they own, and whether they use them for academic or personal purposes.

Questions/issues to be explored
Lots of data! Different results for different populations; longitudinal and comparative results.
Sifting through it all.
Communication issues, especially within a merged organization.

Top 10, Satisfaction, Faculty, 2014

Top 10, Importance, Faculty, 2014

Conclusions?
Before drawing a conclusion or getting too far, it's wise to:
Look at the actual numbers. Are tech numbers really problematic, or are they just lower than library numbers? By how much?
Consider some of the differences between technical support services and library services; this is especially true in a merged organization.
Take all of this in prior to communicating conclusions.

Very high levels of satisfaction
Respondents could rate satisfaction with services as "Dissatisfied" (1), "Somewhat dissatisfied" (2), "Somewhat satisfied" (3) or "Satisfied" (4).
More than 98 percent of services received a mean satisfaction score of at least 3 ("somewhat satisfied") from all constituencies: faculty, staff and students.
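The mean score behind a claim like the one above is just the count-weighted average of the four scale points. A minimal sketch, using made-up response counts (the actual MISO distributions are not in this presentation):

```python
# Hypothetical response counts for one service on the MISO 4-point scale:
# 1 = Dissatisfied, 2 = Somewhat dissatisfied,
# 3 = Somewhat satisfied, 4 = Satisfied
counts = {1: 3, 2: 9, 3: 41, 4: 67}

total = sum(counts.values())
mean_satisfaction = sum(score * n for score, n in counts.items()) / total

print(round(mean_satisfaction, 2))  # 3.43 for these made-up counts
```

A service "passes" the threshold described above when this mean is 3.0 or higher for every constituency.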

Overall perceptions of staff were very positive
Respondents were asked whether they thought staff in various IS areas (archives, circulation, reference, instructional technology, computer support, phone support, IT Service Desk) were:
Responsive
Reliable
Knowledgeable
Friendly
Some differences among areas, but a majority agreed on each.

Student perceptions of reference staff

Student perceptions of IT Service Desk staff

Overall perceptions of staff (cont'd.)
What we communicated: every staff area received an average score of more than 3 out of 4 ("somewhat agree") that the criteria were met, across all respondent populations.

Comparing satisfaction with importance

Relatively high importance/relatively low satisfaction quadrant, faculty
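The quadrant chart referenced above plots each service's mean importance against its mean satisfaction; the high-importance/low-satisfaction quadrant flags candidates for attention. A sketch with invented scores, assuming the axes are split at the mean across services (MISO reports may use a different cutoff convention):

```python
# Made-up (importance, satisfaction) mean scores per service.
scores = {
    "Wireless network":  (3.8, 2.9),
    "Printing services": (3.5, 3.0),
    "Library web site":  (3.2, 3.6),
    "Course reserves":   (2.4, 3.5),
}

# Split each axis at its mean across services (an assumed convention).
imp_cut = sum(imp for imp, sat in scores.values()) / len(scores)
sat_cut = sum(sat for imp, sat in scores.values()) / len(scores)

# High importance, low satisfaction: the quadrant worth attention.
attention = [name for name, (imp, sat) in scores.items()
             if imp >= imp_cut and sat < sat_cut]
print(sorted(attention))  # ['Printing services', 'Wireless network']
```

With these invented numbers, printing and wireless land in the attention quadrant, which is consistent with the items this presentation later reports feeding into strategic goals.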

Other ways to discern patterns
Use past results for longitudinal data.
Do-it-yourself analysis of MISO's results (has advantages and disadvantages).
Combine results from different questions in the survey.
Use MISO's tool to compare results with other institutions.
Combinations of all the above!

Several areas of tech service improvement over time
Among faculty, mean satisfaction with wireless performance improved 3.38 percent over 2012.
Among students, mean satisfaction with wireless performance improved 6.84 percent over 2012.
Among staff, mean satisfaction with the IT Service Desk improved 11.67 percent.
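Year-over-year figures like these are relative percent changes between two cycles' mean scores. A minimal sketch; the 2012 and 2014 means below are illustrative only, since the slide reports the resulting percentages rather than the underlying values:

```python
def pct_change(old_mean: float, new_mean: float) -> float:
    """Relative percent change between two survey cycles' mean scores."""
    return (new_mean - old_mean) / old_mean * 100

# Illustrative means only (not from the presentation).
print(round(pct_change(2.92, 3.12), 2))  # 6.85
```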

Increased importance for digital image collections
OVER TIME: Faculty and students rated the importance of digital image collections higher than in 2012.
Faculty mean score increased 9.21 percent; student mean score increased 9.28 percent.
CROSS-INSTITUTIONAL: Faculty and student mean scores for the importance of digital images were higher than those of peer institutions.
Faculty means of 2.62 versus 2.3, respectively; student means of 2.59 versus 2.23.

Several indicators point toward information security awareness as an area for attention
Faculty, staff and students named information security as an area about which they felt least informed.
Respondents who said they felt either "not informed" or only "somewhat informed" included:
71.19 percent of faculty
70.39 percent of staff
74.83 percent of students

Information security awareness, cont'd.
Other related categories about which majorities of respondents said they were either "not informed" or only "somewhat informed":
Computer viruses and spyware
Technology-related privacy issues
Data backup

Data backup also possibly an area for attention
42 percent of students and 39 percent of staff said they never back up their data.
The most common answer among those who do back up their data was "once or twice" a semester.
Majorities were either "interested" or "very interested" in learning more about data backup:
57.69 percent of students
60.59 percent of staff
64.41 percent of faculty

Greater faculty interest in instructional technology
CURRENT QUESTIONS: Majorities interested in learning more about technology in meeting spaces/classrooms and about our learning platform (Moodle).
OVER TIME: 11.62 percent increase over 2012 in faculty interest in classroom technology.
ACROSS INSTITUTIONS: Higher mean scores than the peer group for interest in learning about Moodle, importance of instructional technology support, and interest in learning about technology in meeting spaces and classrooms.

Communication strategies, spring 2014
Issued a preliminary report to the IS Committee (made up of faculty and selected IS staff).
Issued a similar report to the leadership team of library and IT directors.
Some items (printing services, wireless) were incorporated into strategic goals for the following year.

Communication strategies, fall 2015
Issued a revised/final report to IS staff at an all-IS meeting, highlighting "good news" and trends.
Issued a final report to the IS Committee; posted detailed reports on the website.

VP wrote column for newsletter

Communication tips
Be thorough when interpreting numbers; don't just settle for a first impression.
Be prudent about what's shared and with whom.
Look for trends and aggregations.