WOREP: What We Learned at CSUF


WOREP: What We Learned at CSUF
Will Breitbach, California State University, Fullerton

Why WOREP?
- Reference assessment
- Paired survey
- Standardized
- Low cost

WOREP is designed specifically for in-person reference service, unlike some other popular tools such as LibQUAL+. It is a reference assessment that tries to gather information about both service and success, going beyond a simple "satisfaction" rating, and knowing the quality of your reference service can give you ideas about areas of potential improvement. As a paired survey, it collects success and satisfaction information as well as descriptive data from the patron and a great deal of descriptive data from the librarian, including a detailed description of the question. This was one aspect of the survey that was really appealing to us: we thought the paired survey would let us see whether the librarian and the patron viewed the reference transaction in the same way. In essence, did we understand our users? The questionnaire is expert designed and standardized, is processed by the WOREP administrators, and provides data by which you can compare your institution to others; 136 academic libraries have used the instrument, with a total of 13,448 transactions assessed. It is also low cost, at $1.25 per survey.
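To make the paired-survey idea concrete, here is a minimal sketch (not the actual WOREP tooling; the field names `txn` and `success` are hypothetical) of matching the patron and librarian forms for the same transaction and measuring how often the two parties agree:

```python
# Hypothetical paired-survey records: one form from the patron and one
# from the librarian for each reference transaction, keyed by "txn".
patron_forms = [
    {"txn": 1, "success": "yes"},
    {"txn": 2, "success": "yes"},
    {"txn": 3, "success": "no"},
]
librarian_forms = [
    {"txn": 1, "success": "yes"},
    {"txn": 2, "success": "no"},
    {"txn": 3, "success": "no"},
]

def agreement_rate(patron, librarian):
    """Fraction of paired transactions where both parties give the same rating."""
    lib_by_txn = {f["txn"]: f["success"] for f in librarian}
    pairs = [(f["success"], lib_by_txn[f["txn"]])
             for f in patron if f["txn"] in lib_by_txn]
    agree = sum(1 for p, l in pairs if p == l)
    return agree / len(pairs)

print(agreement_rate(patron_forms, librarian_forms))  # 2 of the 3 pairs agree
```

A high agreement rate suggests librarians and patrons see the transaction the same way; a low one is itself a finding about whether we understand our users.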

Preparation
- Sampling strategy
- Survey procedure
- Training (website)

We considered different sampling strategies: surveying every patron versus the first two patrons per sampling hour. Most importantly, you want a representative sample and you want to avoid sampling bias; that is, the sample should represent the real world (both busy and slow times). Ask for participation at the conclusion of the transaction, because you don't want the survey to affect the quality of service or the survey responses. Training is required, and everyone on the reference team went through the training session. Most of the training effort focused on the subject coding; we selected a small group of librarians to do an initial "norming" of the coding procedure.
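The "first two patrons per sampling hour" rule can be sketched as follows; the transaction log and its fields are illustrative assumptions, not part of the WOREP materials:

```python
from collections import defaultdict

# Hypothetical chronological log of reference transactions, each tagged
# with the sampling hour in which it occurred.
transactions = [
    {"id": 1, "hour": 10}, {"id": 2, "hour": 10}, {"id": 3, "hour": 10},
    {"id": 4, "hour": 11}, {"id": 5, "hour": 11},
    {"id": 6, "hour": 14},
]

def first_n_per_hour(log, n=2):
    """Keep only the first n transactions in each sampling hour."""
    seen = defaultdict(int)   # how many we have already sampled per hour
    sample = []
    for t in log:             # assumes the log is in chronological order
        if seen[t["hour"]] < n:
            sample.append(t["id"])
            seen[t["hour"]] += 1
    return sample

print(first_n_per_hour(transactions))  # [1, 2, 4, 5, 6]
```

Capping each hour keeps busy hours from dominating the sample, which is one way to keep the sample representative of both busy and slow times.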

We are not horrible! Actually, we learned that we did well in comparison to the top-scoring institutions, which in and of itself is good to know. We had hoped to get detailed subject-level information so we could identify subject areas where training was needed; that did not happen.

The first number here represents the number of successful transactions per subject, the second is the total number of questions per subject, and the third is the percentage. The sample size for individual subjects is in most cases too small to give us data we can use to develop training, which was a bit of a disappointment: we thought the subject-level data would be among the most valuable information in the survey. The studies I read don't focus on this data, likely because they got the same results.
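The per-subject arithmetic described above (successes, totals, percentage) can be sketched as follows, with a flag for samples too small to act on; the counts and the cutoff of 30 are invented for illustration:

```python
def subject_success(counts, min_n=30):
    """Per-subject success percentage, plus a flag for whether the sample
    is large enough (min_n is an arbitrary illustrative cutoff)."""
    report = {}
    for subject, (successes, total) in counts.items():
        pct = 100 * successes / total
        report[subject] = (round(pct, 1), total >= min_n)
    return report

# Hypothetical counts: (successful transactions, total questions) per subject.
counts = {"History": (4, 5), "Biology": (25, 40)}
print(subject_success(counts))
# History shows 80.0% success but rests on only 5 questions, so it is
# flagged as too small to justify training decisions.
```

This is exactly the problem the slide describes: a subject can show a high percentage while the underlying sample is far too small to support conclusions.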

Success & Satisfaction

Success & Satisfaction

Learning

Learning sources

Learned about library

Potential concerns

Satisfaction v. Time

Satisfaction v. # of Sources

Lessons learned
- Sample size
- Sampling method
- Subject coding