1
An Introduction to the MISO Survey
Joshua Wilson

We are doing three segments on the MISO Survey that build upon one another. Please hold questions until the end, when we've allotted time for Dave Consiglio, the founder of the survey, to answer questions along with the presenters.
2
Introduction: What is the MISO Survey?
Web-based quantitative survey
Measures how faculty, students, and staff view library and technology services in academia
A valid, reliable, and cross-institutional assessment tool
Importance of technology and library services in the same survey

The MISO Survey (Measuring Information Service Outcomes) is a web-based quantitative survey designed to measure how faculty, students, and staff view library and technology services at colleges and universities. The survey is guided by a philosophy of providing a valid, reliable, and cross-institutional assessment tool to libraries and technology organizations in higher education. Including library and technology services and resources in the same instrument permits the comparison of results across a greater range of the core teaching, learning, and research service landscape in higher education, and it recognizes the inextricable link between library and technology services. By including these services in the same survey, more context is given to the results observed.
3
MISO Team
Dave Consiglio (Director), Bryn Mawr College
Katherine Furlong (Associate Director), Susquehanna University
Gentry Lankewicz Holbert (Associate Director), Spring Hill College
Craig Milberg (Associate Director), Davidson College
Kevin Reynolds (Associate Director), Wofford College
Josh Wilson (Associate Director), Brandeis University
Jean Lacovara (MISO Survey Specialist)

The MISO Survey is managed by a volunteer team of individuals working in IT and libraries in academia. Because the team is made up of practitioners, we are well connected with the services we are surveying and with trends in the field. We are also highly collaborative, both within the team and with those administering the survey: institutions don't sign up for the survey only to be left to their own devices to administer it. The team presently comprises six volunteers and one paid staff member, Jean Lacovara, the MISO Survey Specialist. The volunteer management team includes Dave Consiglio, Katherine Furlong, Gentry Lankewicz Holbert, Craig Milberg, Kevin Reynolds, and me (Josh Wilson). Neal Baker, who is presenting with us, is a former team member.
4
History of the Survey
First developed in 2005
Originally rooted in liberal arts colleges
Variety of institutions participating
112 participating institutions
157,000 faculty, student, and staff responses

The MISO Survey was originally developed in 2005 at Bryn Mawr College by a consortium of higher education institutions that wanted to assess library and technology services together in the same instrument. Five institutions participated in the initial pilot. Early participants were mostly liberal arts institutions in the United States: small, primarily undergraduate institutions. Since then, the MISO Survey has been employed by 112 higher education institutions of varying types and sizes, all located in the United States, ranging from large research institutions to community colleges to liberal arts colleges and universities. The data set includes approximately 157,000 faculty, student, and staff responses.
5
Measurements of Organizational Effectiveness
Use, Importance, Satisfaction
Communication
Academic Impact

With nearly 400 possible points of measurement, the MISO Survey allows institutions to gauge the effectiveness of library and technology services and resources from the perspective of the constituents using them. A significant portion of the survey is dedicated to measuring the frequency of use, the level of importance, and the level of satisfaction constituents have with those services and resources. Any one of these inputs can provide valuable information on its own, but when institutions use these data together, the picture that emerges is clear and can soundly support decisions such as the prioritization of resources. In addition, the survey gauges how well our constituents believe we communicate with them by asking how informed they feel about our decision making and our services. Today, all of our organizations are concerned not only with how well our constituents believe we are delivering the services and resources they need, but also with the impact our services and resources have on the academic and scholarly pursuits of our students and faculty. Because the MISO Survey evolves with our ever-changing landscape, and because we understand that many of our institutions are trying to gauge these things, this year we will be releasing a series of items that participating institutions can use to measure the academic impact of library and technology services and resources.
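To make the use/importance/satisfaction idea concrete, here is a minimal sketch of one way those ratings could be combined into a prioritization view. The column names, the 1-to-5 scale, and the "gap" calculation are illustrative assumptions, not the MISO Survey's actual export format or methodology.

```python
import pandas as pd

# Hypothetical per-respondent item ratings; the column names and 1-5 scale
# are illustrative only, not the actual MISO export format.
responses = pd.DataFrame({
    "item":         ["library_website"] * 3 + ["campus_wifi"] * 3,
    "importance":   [5, 4, 5, 5, 5, 4],   # 1 = not important, 5 = very important
    "satisfaction": [4, 4, 5, 2, 3, 2],   # 1 = dissatisfied,  5 = very satisfied
})

# Average each rating per item, then compute an importance-satisfaction "gap":
# items that are highly important but comparatively poorly rated rise to the
# top of the prioritization list.
summary = responses.groupby("item")[["importance", "satisfaction"]].mean()
summary["gap"] = summary["importance"] - summary["satisfaction"]
print(summary.sort_values("gap", ascending=False))
```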
6
Constituent Behavior, Interests, and Demographics
Which academic tools do our constituents use, and how?
How skilled are our constituents in the use of academic tools?
What additional skills do they wish to learn, and how?
Constituent demographics

In addition to gaining an understanding of how our constituents think we are doing in providing services and resources, the survey also provides a great deal of insight into our constituents, how they work, and their needs. Among other things, this includes the following: How skilled are our constituents in the use of software and library databases? What additional skills do they wish to learn, and how do they wish to learn them? Which software and hardware tools do our constituents use, and which of these tools do they own? What roles do our constituents play on campus? What demographic factors identify them? As you think about all the data we collect with the MISO Survey, from dozens of institutions, it naturally raises the question of how the data can be used for comparison both locally and nationally. Craig is going to pick up with that point.
7
Local Needs and National Benchmarking
Balance between the needs of local customization and benchmarking
Large body of tested items
Few required items
Local items
Benchmark over time and against peers

For some institutions there is a tension, when designing a survey, between needing data on specific local issues and benchmarking against peers. MISO offers a broad choice of items (almost 400), which means campuses will often find standardized, tested items that meet local needs and still allow benchmarking. There are relatively few required items (30?); they are required so that we have baseline items for comparison across institutions, and they cover core IT and library services. Institutions can add local items to the survey, either as entirely new questions or as items that fit into existing questions. Participants can benchmark over time and against peers, and they receive results workbooks on an ongoing basis.
8
Sampling and Response Rates
Campus Survey Administrator
Survey length
Sampling
Response Rates

Great questions don't mean anything if you can't get responses from appropriate subjects. The campus survey administrator (CSA) is critical to carrying out the survey: the CSA acts as the primary point of contact between the institution and the MISO Survey team, manages all aspects of survey preparation at your institution, and serves as the primary resource for members of your campus community regarding their participation in the survey. CSAs receive training at regional meetings on a yearly basis (first-time CSAs and institutions are strongly encouraged to attend), along with ongoing support from a liaison. Survey length is important: we want to find the sweet spot between the number of items that can be asked and response rates. Frustrated users drop out of surveys, and different populations have different frustration thresholds. MISO supplies a decision spreadsheet that keeps you honest about the length of the survey, using easy visual cues for each population while you pick the questions to include. For sampling, you provide the information and MISO does the sampling for you, for consistency and ease. For most institutions, faculty and staff are surveyed as full populations, while students are often drawn as stratified random samples; for small institutions it may be all students. Response rates are maximized in the survey through custom messages that appear to come from the CSA with personalized touches.
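Since the notes mention that students are often drawn as stratified random samples, here is a rough sketch of what such a draw can look like in pandas. The roster, the class-year stratum, and the 25% sampling fraction are invented for illustration and are not MISO's actual sampling procedure.

```python
import pandas as pd

# Hypothetical student roster; "class_year" is used as the stratification
# variable purely for illustration.
students = pd.DataFrame({
    "student_id": range(1, 2001),
    "class_year": ["first_year", "sophomore", "junior", "senior"] * 500,
})

# Draw the same fraction from each stratum so the sample mirrors the
# population's class-year distribution (25% here, chosen arbitrarily).
sample = (
    students.groupby("class_year", group_keys=False)
            .sample(frac=0.25, random_state=42)
)
print(sample["class_year"].value_counts())
```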
9
Response Rates
Here are the median response rates for the survey for the last five years. These rates are quite good for surveys of this type.
10
Instrument Development
Knowledge of management team/practitioners
Client feedback
Local items for inclusion
Questions and items are rigorously tested

The team consists of experienced senior IT and library practitioners, with 147 years of combined experience on the current team, and it works on the survey on a weekly basis. Each client has a liaison on the management team; relevant feedback is recorded, shared, and acted on as appropriate. We examine local items, and items used frequently are evaluated for inclusion as part of the core survey. Testing is rigorous: human testing through focus groups, and analytical testing through factor analysis.
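As a hint of what the analytical side of item testing can involve, the following is a bare-bones factor analysis sketch on made-up rating data using scikit-learn. It illustrates the general technique only; it is not the MISO team's actual testing pipeline.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Made-up 1-5 ratings: 200 respondents x 6 survey items.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(200, 6)).astype(float)

# Fit a two-factor model and inspect the loadings; items that load together
# suggest they measure a common underlying construct. (With random data the
# loadings will be near zero; real survey data would show structure.)
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(ratings)
print(np.round(fa.components_.T, 2))  # rows = items, columns = factors
```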
11
Delivering Results
Detailed Data for Institution
Results Workbooks
MISO Survey Results Website (coming soon)

Each institution receives detailed, identified data for each of its populations, delivered as SPSS and CSV files along with a PDF that includes summaries. Results workbooks (Excel) allow clients to pick peer groups to benchmark against; they are convenient for non-statisticians and for quick analyses, and they also support benchmarking across time. The workbooks display the number of responses, means, standard deviations, and statistical difference tests, and aggregate data can be viewed by year. A MISO Survey results website is coming soon: the data set is getting too big to handle in workbooks, so a web-based system will deliver the same functionality as the results workbooks (though not the detailed identified data). You can do all kinds of interesting local analysis and national research with the results; the next two papers are going to give examples of both.
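To give a feel for the kind of comparison the results workbooks package up, here is a small sketch that computes means, standard deviations, and a Welch's t-test for one item against a hypothetical peer group. The scores are invented; the actual workbooks are Excel-based and perform these comparisons for you.

```python
import numpy as np
from scipy import stats

# Invented satisfaction ratings for one survey item (1-5 scale).
local_scores = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4], dtype=float)
peer_scores  = np.array([3, 4, 3, 4, 2, 3, 4, 3, 3, 4, 3, 2], dtype=float)

print(f"local: mean={local_scores.mean():.2f}, sd={local_scores.std(ddof=1):.2f}, n={local_scores.size}")
print(f"peers: mean={peer_scores.mean():.2f}, sd={peer_scores.std(ddof=1):.2f}, n={peer_scores.size}")

# Welch's t-test (unequal variances) for whether the local mean differs
# from the peer-group mean on this item.
t_stat, p_value = stats.ttest_ind(local_scores, peer_scores, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
```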