Assessing the Assessment Tool

Similar presentations
COUNTER: improving usage statistics Peter Shepherd Director COUNTER December 2006.

HR SERVICE REQUEST SYSTEM Department Demonstrations February 2012.
Software Engineering Institute Carnegie Mellon University Pittsburgh, PA Sponsored by the U.S. Department of Defense © 1998 by Carnegie Mellon.
Oct 31, 2000Database Management -- Fall R. Larson Database Management: Introduction to Terms and Concepts University of California, Berkeley School.
Manager Desktop & Supervisor ID UL Meeting December 15, 2006.
Creating a User-Centered Culture of Assessment Stella Bentley and Bill Myers University of Kansas EDUCAUSE Southwest Regional Conference 2005.
Title I Needs Assessment and Program Evaluation
Using Millennium Statistics and Web Management Reports Jennifer Parsons Systems Librarian.
LibQUAL + Surveying the Library’s Users Supervisor’s Meeting March 17, 2004.
Cataloging in digital age Li Sun Asian Languages Cataloger Metadata Librarian Cataloging and Metadata Services Rutgers University Libraries CEAL Annual.
Data Explorer - A data profiling tool. Agenda: Introduction, Existing System, Limitations of Existing System, Proposed Solution.
The Assessment Environment in North American Research Libraries Stephanie Wright, University of Washington Lynda S. White, University of Virginia 7th Northumbria.
Date: 07/20/2011 HCSIS, Independent Monitoring for Quality (IM4Q) - Considerations.
Research Data Management Services Katherine McNeill Social Sciences Librarians Boot Camp June 1, 2012.
Moodle (Course Management Systems). Assignments 1 Assignments are a refreshingly simple method for collecting student work. They are a simple and flexible.
ZLOT Prototype Assessment John Carlo Bertot Associate Professor School of Information Studies Florida State University.
The Literature Search and Background of the Problem.
Creating Interactive Course Assignment Pages The OSU Libraries ICAP Project Coalition for Networked Information Fall 2007 Task Force Meeting Kim Griggs,
Digital Commons & Open Access Repositories Johanna Bristow, Strategic Marketing Manager APBSLG Libraries: September 2006.
Assessment: Research in Context Allison Sivak University of Alberta Libraries June 13, 2008.
PACSCL Consortial Survey Initiative Group Training Session February 12, 2008 at The Historical Society of Pennsylvania.
ERead and Report. What is... Independent eBook Reading with a Vocabulary and Comprehension Assessment Focuses mainly on Reading Informational Texts Aligns.
1 The United Nations Demographic Yearbook and the Work Programme for Social Statistics Expert Group Meeting to Review the United Nations Demographic Yearbook.
3rd Knowledge Bank Workshop, 31 January 2008, by the Library, Sripatum University.
Your LibQUAL+ ® Community: A Results Meeting American Library Association (ALA) Annual Conference Washington, DC June 25, 2007 Martha Kyrillidou, Director.
LibQUAL+ ® Survey Results Presented by: Martha Kyrillidou Senior Director, Statistics and Service Quality Programs Association of Research.
Wasted Words? Current Trends in CD Policies Matt Torrence / Audrey Powers / Megan Sheffield University of South Florida Charleston Conference 2012.
Systems/Web Services Digital Libraries & Technical Services
Academic Library Streaming Video Revisited deg farrelly – Arizona State University Jane Hutchison Surdi – William Paterson University American Library.
YOU CAN DIY (DO IT YOURSELF). LEARNING FROM BEST PRACTICES IN LIBRARY ASSESSMENT Nancy Turner, Syracuse University Library UNYOC 2010 Annual Conference.
Management Information Systems by Prof. Park Kyung-Hye Chapter 7 (8th Week) Databases and Data Warehouses 07.
Coastal Carolina University
Project 1 Introduction to HTML.
FAST at the British Library
ER&L 2010, Austin, TX February, 2, 2010
IS 130 Information systems 1
DLI Website.
Data Collection and Beyond Assessment at the U.Va. Library
Library Assessment Tools & Technology
Evolving Academic Computing Offerings: A Successful Strategy
Database Systems Chapter 3 1.
Database Management:.
The Literature Search and Background of the Problem
The revised Periodic Reporting Questionnaires: general features Alessandra Borchi Policy and Statutory Meetings Section UNESCO World Heritage Centre.
CCC Library Strategic Plan
Monitoring and Evaluation Systems for NARS Organisations in Papua New Guinea Day 3. Session 8. Routine monitoring.
USING CARLI DIGITAL COLLECTIONS
Overview and Introduction to Alma Analytics
Carolyn DeLuca Electronic Resources Librarian
Chapter 5 Data Resource Management.
ReCAP Data Part 2: Requests
Reporting Based on Data in Archivists’ Toolkit
Standard Scripts Project 2
Enhancing ICPSR metadata with DDI-Lifecycle
SWITZERLAND International Marketing and Output Database Conference
Sam Houston State University
Learning organization activity 1
Mendeley Overview VISHAL GUPTA Customer Consultant South Asia
Database Design Hacettepe University
District Test Coordinators Training
Metadata The metadata contains
Qualtrics for data collection
LibQUAL+® Survey Results
Your LibQUAL+® Community: A Results Meeting
LibQual+ Survey Results 2002
Presentation transcript:

Assessing the Assessment Tool ENY/ACRL Annual Spring Conference, May 21, 2012. Assessing the Assessment Tool: Development and User Testing of the Library’s Data Repository. Thank you. I’m going to be describing a work in progress: at Syracuse University we are developing a repository for assessment data, to improve the accessibility and visibility of library metrics for library staff.

Build it and they will come? The goal for the Library Statistics Document Center was to make the documents, data, and assessment reports generated by the Program Management Center available in a central location. The PMC is the unit in the Library that manages projects, conducts assessment, and coordinates data collection and presentation for the Library. We believe that an easy-to-use repository for metrics and assessment reports supports a culture of data-driven decision-making throughout the Library.

Centralization Prior to the Center, data were stored on local departmental drives throughout the Library, inaccessible to most staff and used primarily for operational purposes. Today we have a central, web-based document center that improves our capacity to access data and assessment reports across the functional service areas, and to connect our library data with student and faculty demographics from the University data warehouse. Data processing is centralized as well: the Program Management Center receives data requests, collects the data, and is responsible for regular reporting as well as special assessment projects.
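To make that linkage concrete, here is a minimal sketch, assuming hypothetical extracts and column names rather than our actual schema, of joining circulation counts with demographic data from the University data warehouse:

    import pandas as pd

    # Hypothetical extracts: circulation counts from the ILS and patron
    # demographics from the University data warehouse.
    circulation = pd.DataFrame({
        "patron_id": [101, 102, 103],
        "checkouts": [12, 3, 27],
    })
    demographics = pd.DataFrame({
        "patron_id": [101, 102, 103],
        "status": ["undergraduate", "faculty", "graduate"],
        "department": ["Biology", "History", "Information Studies"],
    })

    # Link library activity to patron status for aggregate reporting.
    merged = circulation.merge(demographics, on="patron_id", how="left")
    print(merged.groupby("status")["checkouts"].mean())

The point is not the tool but the join: once both sides sit in one place, activity can be summarized by patron group rather than reported in isolation.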

Define Scope ARL Statistics, Circulation Reports, Usability Studies, LibQUAL, Collection Profiles, Gate Counts, Reference Statistics. We generate a lot of reports, and the repository is populated with both quantitative and qualitative data. There are numbers-based reports, like circulation or instruction statistics, but also textual data such as comments from student surveys, usability test results, and interview summaries.

Metadata The scheme captures, for each file:
Data elements in file (key fields): Title, Author, BIB ID, Comments, Month, Year, Count
Dates of coverage: e.g. 2011, 2007
Date type: Fiscal Year, Calendar Year, One Time
Description/Title: use natural language!
Data source: Voyager, ExLibris, Vendor
Report type: Summary, Charts, Raw Data
Method: Survey, System Query
This broad scope requires a metadata scheme that can adequately describe the contents and make them findable. After drafting a preliminary metadata scheme, we started populating the center to determine how well the metadata worked and how intuitive it is to apply. Even in our small unit we have one librarian, two engineers, and one statistics PhD; as you can imagine, we sometimes speak different languages and use different words to describe what we’re looking for.
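To make the scheme concrete, a single record might look something like the sketch below; the field names follow the scheme above, but the values and the dictionary representation are illustrative only, not our production format.

    # Illustrative metadata record for one uploaded report.
    record = {
        "key_fields": ["Title", "Author", "BIB ID", "Month", "Year", "Count"],
        "dates_of_coverage": "2011",
        "date_type": "Fiscal Year",
        "description": "Checkouts per LC class, summarized by month",
        "data_source": "Voyager",
        "report_type": "Summary",
        "method": "System Query",
    }

A plain natural-language description is the field most users actually search, which is why the scheme calls it out explicitly.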

Workflow We established a workflow for uploading to the repository.

Versioning The workflow also includes naming conventions and a versioning system.
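As one way to picture how a naming convention and versioning might work together, here is a small sketch; the pattern (source_subject_coverage_vNN) is hypothetical, not the Library's actual convention.

    def repository_filename(source, subject, coverage, version, ext="xlsx"):
        # Build a name like 'voyager_circulation_fy2011_v02.xlsx'.
        return f"{source}_{subject}_{coverage}_v{version:02d}.{ext}".lower()

    print(repository_filename("Voyager", "circulation", "FY2011", 2))
    # voyager_circulation_fy2011_v02.xlsx

Encoding the source, coverage period, and version in the name lets staff spot the latest copy of a report without opening it.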

Users Usability covers both technical requirements and ease of use. How do we expect our users to interact with the document center? Will they access it from home? Will it be open to administration, all library staff, or the public? Will users be contributing documents, or just reading and downloading files? What are the users’ skill sets? Will we expect them to use the data raw or in summary tables, or do they need PowerPoint presentations? We cannot assess usability without a clear understanding of who the users are, how they will access the system, and how they will be looking for information.

The Assessment Cycle Identify content, understand users, determine access process, develop metadata scheme, gather feedback, make changes. So this is the cycle that I’ve described. We’ve done some traditional usability testing, asking users to locate particular files or content. A key aspect of the cycle is continuing to be open to feedback and making changes based on that feedback.

Do They Use It? Quantitative measures: usage and the number of unique users; a decrease in custom report requests. Qualitative measures: Is the workflow operating as expected? Is the system easy to use? Meeting goals: Are users discovering the data they need to make decisions? Are users asking more questions? Our system isn’t live yet, but we are hoping to see an uptake in usage and fewer requests for items or content that already exist in the center, indicating that staff are finding information independently. Ultimately, we want staff to use the data they find to ask more questions: to get excited about the patterns they see in the data and the trends it helps us discover and influence.
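One low-effort way to get the quantitative measures, assuming the document center sits behind a standard web server, is to count downloads and distinct users from its access log. A minimal sketch, with a made-up log format:

    from collections import Counter

    def usage_from_log(lines):
        # Assumes space-delimited 'user path status' records; a real log
        # format would need its own parser.
        downloads = Counter()
        users = set()
        for line in lines:
            user, path, status = line.split()
            if status == "200" and path.startswith("/documents/"):
                downloads[path] += 1
                users.add(user)
        return downloads, len(users)

    sample = [
        "jsmith /documents/libqual_2011_summary.pdf 200",
        "akhan /documents/libqual_2011_summary.pdf 200",
        "jsmith /documents/gate_counts_fy2011.xlsx 200",
    ]
    by_doc, unique_users = usage_from_log(sample)
    print(by_doc.most_common(1), unique_users)

Tracking the same counts before and after launch, alongside the number of custom report requests the PMC receives, gives us the comparison we need.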

Picture Credits Center http://www.flickr.com/photos/nest_jar/3454629674/ Light http://www.flickr.com/photos/seier/3122721913/ Baby User http://www.flickr.com/photos/courosa/5536535796/ Baseball Field http://www.flickr.com/photos/philsnyder/3776582751/