An Introduction to the MISO Survey Joshua Wilson


An Introduction to the MISO Survey
Joshua Wilson

We are presenting three segments on the MISO Survey that build on one another. Please hold questions until the end, when we have allotted time for Dave Consiglio, the founder of the survey, to answer questions along with the presenters.

Introduction: What is the MISO Survey?
- Web-based quantitative survey
- Measures how faculty, students, and staff view library and technology services in academia
- A valid, reliable, and cross-institutional assessment tool
- Technology and library services measured in the same survey

The MISO Survey (Measuring Information Service Outcomes) is a web-based quantitative survey designed to measure how faculty, students, and staff view library and technology services at colleges and universities. The survey is guided by a philosophy of providing a valid, reliable, and cross-institutional assessment tool to libraries and technology organizations in higher education. Including library and technology services and resources in the same instrument permits the comparison of results across a greater range of the core teaching, learning, and research service landscape in higher education, and it recognizes the inextricable link between library and technology services. Including both in one survey therefore gives more context to the results.

MISO Team
- Dave Consiglio (Director), Bryn Mawr College
- Katherine Furlong (Associate Director), Susquehanna University
- Gentry Lankewicz Holbert (Associate Director), Spring Hill College
- Craig Milberg (Associate Director), Davidson College
- Kevin Reynolds (Associate Director), Wofford College
- Josh Wilson (Associate Director), Brandeis University
- Jean Lacovara (MISO Survey Specialist)

The MISO Survey is managed by a volunteer team of people working in IT and libraries in academia. Because the team is made up of practitioners, we are well connected with the services we survey and with trends in the field. We are also highly collaborative, both among ourselves and with those administering the survey: institutions don't sign up for the survey only to be left to their own devices to administer it. The team currently consists of six volunteers and one paid staff member, Jean Lacovara, the MISO Survey Specialist. The volunteer management team includes Dave Consiglio, Katherine Furlong, Gentry Lankewicz Holbert, Craig Milberg, Kevin Reynolds, and me, Josh Wilson. Neal Baker, who is presenting with us, is a former team member.

History of the Survey
- First developed in 2005
- Originally rooted in liberal arts colleges
- Variety of institutions now participating
- 112 participating institutions
- 157,000 faculty, student, and staff responses

The MISO Survey was first developed in 2005 at Bryn Mawr College by a consortium of higher education institutions that wanted to assess library and technology services together in the same instrument. Five institutions participated in the initial pilot. Early participants were mostly small, primarily undergraduate liberal arts institutions in the United States. Since then, the MISO Survey has been employed by 112 higher education institutions of varying types and sizes, ranging from large research institutions to community colleges to liberal arts colleges and universities; all participating institutions are located in the United States. The data set includes approximately 157,000 faculty, student, and staff responses.

Measurements of Organizational Effectiveness
- Use, importance, satisfaction
- Communication
- Academic impact

With nearly 400 possible points of measurement, the MISO Survey allows institutions to gauge the effectiveness of library and technology services and resources from the perspective of the constituents using them. A significant portion of the survey is dedicated to measuring the frequency of use, the level of importance, and the level of satisfaction constituents have with those services and resources. Any one of these inputs is valuable on its own, but used together they produce a clear picture on which decisions, such as the prioritization of resources, can be soundly based. In addition, the survey gauges how well we communicate with our constituents by delving into how informed they feel about our decision making and our services. Today, all of our organizations are concerned not only with how well our constituents believe we deliver the services and resources they need, but also with the impact those services and resources have on the academic and scholarly pursuits of our students and faculty. Because the MISO Survey evolves with our ever-changing landscape, this year we will be releasing a series of items that participating institutions can use to measure the academic impact of library and technology services and resources.

Constituent Behavior, Interests, and Demographics
- Which academic tools do our constituents use, and how?
- How skilled are our constituents in the use of academic tools?
- What additional skills do they wish to learn, and how?
- Constituent demographics

In addition to showing how our constituents think we are doing at providing services and resources, the survey provides a great deal of insight into the constituents themselves: how they work and what they need. Among other things: How skilled are they in the use of software and library databases? What additional skills do they wish to learn, and how do they wish to learn them? Which software and hardware tools do they use, and which do they own? What roles do they play on campus, and what demographic factors identify them? All of this data, collected from dozens of institutions, naturally raises the question of how the data can be used for comparison both locally and nationally. Craig is going to pick up with that point.

Local Needs and National Benchmarking
- Balance between the needs of local customization and benchmarking
- Large body of tested items
- Few required items
- Local items
- Benchmark over time and against peers

When designing a survey, some institutions feel a tension between needing data on specific local issues and benchmarking against peers. MISO offers a broad choice of items (almost 400), so campuses will often find standardized, tested items that meet local needs while still allowing benchmarking. There are relatively few required items (roughly 30); they cover core IT and library services and are required so that we have baseline items for comparison across institutions. Institutions can also add local items, either as entirely new questions or as items that fit into existing questions. Participants can benchmark over time and against peers, and receive results workbooks on an ongoing basis.
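Picking from a pool of nearly 400 items forces a trade-off between coverage and survey length. As a minimal sketch of that trade-off, the snippet below flags a population's survey when the selected items push estimated completion time past a threshold. The per-item timing, the thresholds, and the function name are invented for illustration; they are not MISO's actual figures or tooling.

```python
# Assumed average time to answer one survey item, in seconds.
SECONDS_PER_ITEM = 12
# Assumed tolerable survey length per population, in minutes.
THRESHOLDS = {"faculty": 15, "students": 12, "staff": 15}

def length_flag(selected_items, population):
    """Estimate completion time for a population's item selection
    and flag it if it exceeds that population's threshold."""
    minutes = len(selected_items) * SECONDS_PER_ITEM / 60
    limit = THRESHOLDS[population]
    status = "ok" if minutes <= limit else "too long"
    return minutes, status

# Hypothetical selection of 90 items for the student survey:
items = [f"item_{i}" for i in range(90)]
minutes, status = length_flag(items, "students")
# 90 items at ~12 s each is 18 minutes, over the assumed 12-minute limit.
```

A spreadsheet version of the same arithmetic, with a visual cue per population, is what the MISO decision spreadsheet provides.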

Sampling and Response Rates
- Campus survey administrator
- Survey length
- Sampling
- Response rates

Great questions don't mean anything if you can't get responses from appropriate subjects. The campus survey administrator (CSA) is critical to carrying out the survey: the CSA acts as the primary point of contact between the institution and the MISO Survey team, manages all aspects of survey preparation at the institution, and serves as the primary resource for members of the campus community regarding their participation in the survey. Training is offered at regional meetings on a yearly basis (first-time CSAs and institutions are strongly encouraged to attend), with ongoing support from a liaison. Survey length is important: we want to find the sweet spot between the number of items asked and response rates, since frustrated users drop out of surveys and different populations have different frustration thresholds. MISO supplies a decision spreadsheet that keeps you honest about survey length, using easy visual cues for each population while you pick the questions to include. For sampling, you provide the information and MISO does the sampling for you, for consistency and ease. At most institutions, faculty and staff are surveyed as full populations; students are drawn as stratified random samples in many cases, though small institutions may survey all students. Response rates are maximized through custom messages that appear to come from the CSA, with personalized touches.
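The stratified random sampling of students described above can be sketched as follows. The stratum choice (class year), sample sizes, and roster data are illustrative assumptions, not MISO's actual procedure; MISO performs the sampling centrally from information the institution provides.

```python
import random

def stratified_sample(students, stratum_key, per_stratum, seed=42):
    """Draw a simple random sample within each stratum (e.g., class year).

    If a stratum is smaller than the requested size, take everyone --
    mirroring the note that small institutions may survey all students.
    """
    rng = random.Random(seed)
    strata = {}
    for s in students:
        strata.setdefault(stratum_key(s), []).append(s)
    sample = []
    for members in strata.values():
        k = min(per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical roster of (student_id, class_year), 50 students per year:
roster = [(year * 100 + i, year)
          for year in (2025, 2026, 2027, 2028)
          for i in range(50)]
picked = stratified_sample(roster, stratum_key=lambda s: s[1], per_stratum=20)
# Each class year contributes exactly 20 students to the sample.
```

Sampling within strata guarantees that small subpopulations (say, seniors) are represented even when a plain random draw might miss them.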

Response Rates

Here are the median response rates for the survey for the last five years. These rates are quite good for surveys of this type.

Instrument Development
- Knowledge of management team/practitioners
- Client feedback
- Local items for inclusion
- Questions and items are rigorously tested

The management team consists of experienced senior IT and library practitioners, with 147 years of combined experience on the current team, and we work on the survey on a weekly basis. Each client has a liaison on the management team; relevant feedback is recorded, shared, and acted on as appropriate. We also examine local items: items used frequently are evaluated for inclusion in the core survey. Testing is rigorous, combining human testing (focus groups) with analytical testing (factor analysis).
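The analytical testing named above is factor analysis; as a simpler illustration of the same underlying idea (checking whether items "hang together" on a scale), here is a corrected item-total correlation screen. This is a stand-in technique, not MISO's actual analysis, and the item scores are made up.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

def item_total_correlations(responses):
    """responses: one list of item scores per respondent.

    For each item, correlate it with the total of the *other* items
    (the "corrected" item-total correlation). A low or negative value
    flags an item that may not belong on the scale.
    """
    n_items = len(responses[0])
    out = []
    for j in range(n_items):
        item = [r[j] for r in responses]
        rest = [sum(r) - r[j] for r in responses]
        out.append(pearson(item, rest))
    return out

# Hypothetical 1-5 satisfaction ratings on three items:
data = [[5, 4, 3], [4, 4, 2], [2, 3, 3], [5, 5, 2], [3, 3, 3]]
corrs = item_total_correlations(data)
# Items 0 and 1 track the scale total; item 2 does not and would be flagged.
```

Factor analysis generalizes this by extracting latent factors from the full item correlation matrix, but the screening goal is the same: keep items that measure the construct, drop or revise the ones that don't.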

Delivering Results
- Detailed data for your institution
- Results workbooks
- MISO Survey results website (coming soon)

Each institution receives detailed, identified data for each population, in SPSS and CSV formats, along with a PDF containing summaries. The results workbooks (Excel) let clients pick peer groups to benchmark against, which is convenient for non-statisticians and for quick analyses; they also support benchmarking across time, display the number of responses, means, standard deviations, and statistical difference tests, and show aggregate data by year. A MISO Survey results website is coming soon: the data set is getting too big to handle in workbooks, so a web-based system will deliver the same functionality as the results workbooks (though not the detailed identified data). You can do all kinds of interesting local analysis and national research with the results; the next two presentations will give examples of both.
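The workbook figures described above (number of responses, means, standard deviations, and a difference test against a peer group) can be sketched as follows. The satisfaction scores are invented, and Welch's t is one plausible choice of difference test for unequal-variance groups, not necessarily the test MISO uses.

```python
import math

def summarize(scores):
    """Return n, mean, and sample standard deviation -- the figures
    a results workbook displays for each item."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((x - mean) ** 2 for x in scores) / (n - 1)
    return n, mean, math.sqrt(var)

def welch_t(a, b):
    """Welch's t statistic for the difference between two item means
    (e.g., your institution vs. a chosen peer group)."""
    na, ma, sa = summarize(a)
    nb, mb, sb = summarize(b)
    se = math.sqrt(sa ** 2 / na + sb ** 2 / nb)
    return (ma - mb) / se

# Hypothetical 1-5 satisfaction ratings for one survey item:
local = [4, 5, 3, 4, 4, 5, 4, 3, 5, 4]   # your institution
peers = [3, 3, 4, 2, 3, 4, 3, 3, 2, 4]   # assumed peer-group responses
n, mean, sd = summarize(local)
t = welch_t(local, peers)
# A positive t indicates the local mean exceeds the peer mean.
```

Comparing the t statistic against a critical value (given the Welch-adjusted degrees of freedom) is what turns a raw mean difference into the "statistically different from peers" flag a workbook reader actually cares about.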