The READ Scale. G. Lynn Berard and Bella Karr Gerlich.

Similar presentations
WHO Antenatal Course Preparing the new WHO eProfessors.

Aug 26, By the end of this presentation parents will be able to understand and explain to others in the WIS community: -the complexities of the.
Lindsey Main 1, 2 Lindsey Main 1, 2 Kathleen McGraw 2 Kathleen McGraw 2 User Services Department at UNC Chapel Hill Health Sciences Library  supports.
1 Your Quality Enhancement Plan (QEP) Associate Professor Sarah Thomason & Karen Brunner, Asst. VP for Institutional Effectiveness & Research August 29,
NextGen Reference: Single Service Points and Tiered Reference in Academic Libraries Jeff Lacy Lamar University
Student Success Initiatives Supplemental Instruction Leaders Learning Communities Success in College and Life Course Fall 2007.
Purpose of Evaluation  Make decisions concerning continuing employment, assignment and advancement  Improve services for students  Appraise the educator’s.
NWACC Library Instruction Program Teaching information literacy skills for academic success and lifelong learning.
Open Library Environment Designing technology for the way libraries really work November 19, 2008 ~ ASERL, Atlanta Lynne O’Brien Director, Academic Technology.
EPIC Online Publishing Use and Costs Evaluation Program.
Performance Development Plan (PDP) Training
How Assessment Will Inform Our Future 1. Administration of on-going user surveys and focus groups to enhance reference services 2. Analysis of LibStats.
Two Decades of User Surveys The Experience of Two Research Libraries from 1992 to 2011 Jim Self, University of Virginia Steve Hiller, University of Washington.
Improving Students’ understanding of Feedback
Promoting Student Engagement: Involving Students with NSSE Planning and Results William Woods University NSSE Users’ Workshop October 6-7, 2005.
Application of Ethical Principles During the Informed Consent Process for Clinical Trials Barbara E. Barnes, MD, MS Joanne Russell, MPPM Maurice Clifton,
UNDERSTANDING, PLANNING AND PREPARING FOR THE SCHOOL-WIDE EVALUATION TOOL (SET)
Data on Student Learning Office of Assessment University of Kentucky.
May 18, Two Goals 1. Become knowledgeable about the QEP. 2. Consider your role in making the QEP a success.
Blended Courses: How to have the best of both worlds in higher education By Susan C. Slowey.
The Current Status of States' Early Childhood Outcome Measurement Systems Kathy Hebbeler, SRI International Lynne Kahn, FPG Child Dev Inst October 17,
Cumberland County: May 28 Oak Ridge: June 2 Roane County: June 4 Scott: June 4 Campbell: June 9 Knox: June 10 Loudon: June 11.
Quality Improvement Prepeared By Dr: Manal Moussa.
Information Competency: an overview Prepared by: Erlinda Estrada Judie Smith Mission College Library Santa Clara, CA.
1 Evaluation. 2 Evaluating The Organization Effective evaluation begins at the organizational level. It starts with a strategic plan that has been carefully.
Connecting Work and Academics: How Students and Employers Benefit.
METHODS Study Population Study Population: 224 students enrolled in a 3-credit hour, undergraduate, clinical pharmacology course in Fall 2005 and Spring.
Assessment Surveys July 22, 2004 Chancellor’s Meeting.
Principles of Assessment
Business and Management Research
How to Make a Survey.
TEACHING EVALUATION CAN BE A ONE DISH MEAL Heather Campbell Brescia University College London, Ontario, Canada
BACK TO THE BASICS: Library Instruction Redux. BRENT HUSHER MELISSA MUTH FU ZHU0 University of Missouri–Kansas.
The Integration of Embedded Librarians at Tuskegee University Juanita M. Roberts Director Library Services Ford Motor Company Library/Learning Resources.
Striving for Quality Using continuous improvement strategies to increase program quality, implementation fidelity and durability Steve Goodman Director.
Partnerships for student success: Integrated development of academic and information literacies across disciplines Bev Kokkinn & Cathy Mahar Learning &
Library Assessment in North America Stephanie Wright, University of Washington Lynda S. White, University of Virginia American Library Association Mid-Winter.
Updated Performance Management for Exempt Staff Fall 2009.
Preceptor Orientation
APS Teacher Evaluation Module 9 Part B: Summative Ratings.
Created by: Christopher J. Messier Learning Commons Supervisor.
WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment Institutional Research & Effectiveness Neil M. Patel, Ph.D.
McGraw-Hill/Irwin Teaching Excellence Project funded by CELT Teaching Economics through Innovative Content and Effective Teaching Methods Necati Aydin,
1 Software Process Models-ii Presented By; Mehwish Shafiq.
Usability, the User Experience & Interface Design: The Role of Reference July 30, 2013.
Learning from Pharmacy Distance Learners by By Rae Jesano, MSLS, AHIP, Linda Butson, MLn, MPH, AHIP, Mary Edwards, MSLIS, Health Science Center Libraries,
Module 5: Data Collection. This training session contains information regarding: Audit Cycle Begins Audit Cycle Begins Questionnaire Administration Questionnaire.
College Library Statistics: Under Review Teresa A. Fishel Macalester College Iowa Private Academic Libraries March 22, 2007 Mount Mercy College, Iowa.
Campus Quality Survey 1998, 1999, & 2001 Comparison Office of Institutional Research & Planning July 5, 2001.
SHORTER COLLEGE Assessment Week Sponsored by the Office of Institutional Effectiveness and Assessment & the Division of Academic Affairs.
Service Learning Dr. Albrecht. Presenting Results 0 The following power point slides contain examples of how information from evaluation research can.
2008 FAEIS Annual Longitudinal Assessment With a Comparison to the 2007 Survey Results The purpose of the FAEIS annual evaluation is to develop longitudinal.
Barbara F. Schloman, Ph.D., TRAILS Project Director Julie A. Gedeon, Ph.D., TRAILS Assessment Coordinator Kent State University Libraries and Media Services.
Student Preferences For Learning College Algebra in a Web Enhanced Environment Dr. Laura J. Pyzdrowski, Pre-Collegiate Mathematics Coordinator Institute.
Lawrence University and the Seeley G. Mudd Library Private undergraduate college of the liberal arts and sciences with a conservatory of music 1450 students,
Charting Library Service Quality Sheri Downer Auburn University Libraries.
Assessing current print periodical usage for collection development Gracemary Smulewitz Distributed Technical Services Rutgers University Libraries.
Daniel G. Tracy and Susan E. Searing University Library, University of Illinois at Urbana-Champaign Perception and Use of Academic Library Services by.
Tracking Reference Transactions at Rider University 13 th Annual VALE/NJ ACRL/NJLA CUS Users Conference January 5, 2012 Pat Dawson Science Librarian Moore.
Instructional Leadership: Applying Concern & Use Name Workshop Facilitator.
1 Chapter 2 SW Process Models. 2 Objectives  Understand various process models  Understand the pros and cons of each model  Evaluate the applicability.
Taeho Yu, Ph.D. Ana R. Abad-Jorge, Ed.D., M.S., RDN Kevin Lucey, M.M. Examining the Relationships Between Level of Students’ Perceived Presence and Academic.
Using the READ Scale (Reference Effort Assessment Data) Capturing Qualitative Statistics for Meaningful Reference Assessment.
CREATING A SURVEY. What is a survey questionnaire? Survey questionnaires present a set of questions to a subject who with his/her responses will provide.
Research Methods for Business Students
Individualized research consultations in academic libraries: Useful or useless? Let the evidence speak for itself Karine Fournier Lindsey Sikora Health.
Sarah Lucchesi Learning Services Librarian
Benchmarking Reference Data Collection
Evaluation Measures, Ongoing Improvements and Enhancement
Presentation transcript:

The READ Scale. G. Lynn Berard and Bella Karr Gerlich

The READ Scale (Reference Effort Assessment Data): a six-point (1-6) sliding scale that asks librarians to assign a number after a reference transaction, based on effort, knowledge, skill, and the teachable moment, instead of a hash mark.
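To make the contrast with hash-mark counting concrete, here is a minimal sketch of what recording a transaction against the Scale might look like. The class and field names are hypothetical; only the 1-6 range comes from the READ Scale itself.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ReferenceTransaction:
    """One reference transaction, logged with a READ category
    rather than an undifferentiated tick mark."""
    question: str
    read_level: int  # 1 = least effort/knowledge ... 6 = most
    timestamp: datetime = field(default_factory=datetime.now)

    def __post_init__(self) -> None:
        # Reject values outside the Scale's six categories.
        if not 1 <= self.read_level <= 6:
            raise ValueError("READ level must be between 1 and 6")
```

A log of such records preserves the effort dimension that a simple hash mark discards.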

READ Scale Study Objective: test the validity of the READ Scale as an additional tool for gathering reference statistics, to record and recognize the effort, knowledge, skills, and value-added service required during a reference transaction.

READ Scale Study: recruit academic libraries that are public and private, geographically diverse, and of various enrollment sizes.

14 Institutions, 12 States

The Institutions, Enrollment under 5,000:
Eastern Virginia Medical School, Edward E. Brickell Medical Sciences Library, Norfolk, VA
Lawrence University, Seeley G. Mudd Library, Appleton, WI
Lewis & Clark College, Aubrey R. Watzek Library, Portland, OR
Clarke College, Clarke College Library, Dubuque, IA
Our Lady of the Lake University San Antonio (OLLUSA), Sueltenfuss Library, San Antonio, TX

The Institutions, Enrollment over 5,000:
Georgia College & State University, GCSU Library & Instructional Technology Center, Reference & Special Collections (2 service points), Milledgeville, GA
Carnegie Mellon University, Hunt Library, Arts & Special Collections, Science Library, Mellon Institute, Music Listening, Software Engineering Institute (6 service points), Pittsburgh, PA
Robert Morris University, Patrick Henry Center, Pittsburgh Center (2 service points), Moon Township, PA
Washburn University, Mabee Library, Topeka, KS

The Institutions, Enrollment over 15,000:
West Virginia University (3 libraries): Health Sciences Library, Downtown Campus Library, Evansdale Library, Morgantown, WV
University of California, San Diego, Science & Engineering Library, La Jolla, CA
New York University, Business & Documents Center, Bobst Library, New York, NY
Georgia Institute of Technology, Georgia Tech Library, Atlanta, GA
University of Nebraska, Love Library (chat service only), Lincoln, NE

Participant Data: 14 institutions; 24 service points; 179 participants (full-time, part-time, faculty, and staff) with varying experience levels.

Study Components: IRB / consent forms; timeline (3-week and/or semester-long); pre-test / local calibration; blog; online survey.

Study Components - IRB: IRB (Institutional Review Board) approval was obtained at GCSU and Carnegie Mellon, and at other institutions where required. Consent forms were delivered electronically, then signed and returned with the data at the end of the 3-week study period.

Study Components - Timeline: 3 weeks and/or a full semester. Feb. 2 - Feb. 24: all institutions; full semester: 7 institutions. The February dates were selected to give institutions time to test, as well as to minimize conflicts with spring break, holidays, etc.

Study Components - Pre-test, Local Calibration: Sample questions were created to choose from, and institutions were encouraged to include questions typical of their home institution (i.e., collection-specific). The on-site coordinator distributed the pre-test locally and calibrated it, creating an 'example key' for participants, who were asked to record the time for each question during the test phase.

The pre-test allowed for study-wide calibration: test questions, responses, times, and READ Scale category assignments were, for the most part, the same across institutions. Institutions used their own recording sheets.
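One simple way to check that calibration held is a percent-agreement figure between raters on the shared pre-test questions. The ratings in the sketch below are invented examples, not study data.

```python
# Percent agreement between two raters' READ category assignments
# on the same set of pre-test questions (invented ratings).
rater_a = [1, 3, 4, 2, 5, 3]
rater_b = [1, 3, 3, 2, 5, 3]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
agreement = matches / len(rater_a)  # fraction rated identically
```

A low agreement score on the pre-test would signal that the 'example key' needs further local discussion before live data collection begins.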

Study Components - Blog: A blog was set up, but only one question was received during the study. Online survey responses suggest that the READ Scale was easy to apply, which could explain why the blog was not utilized.

Study Components - Online Survey: The online survey was sent to all participants at the conclusion of the three-week study period. The response rate was high: 56% of the 179 participants responded.

Study Components - Online Survey - Results
Question 1: Please rank your degree of difficulty using the READ Scale.
Not difficult: 52 (51.0%); somewhat difficult: 38 (37.3%); moderately difficult: 10 (9.8%); difficult: 2 (2.0%); very difficult: 0. Skipped question: 0. Number responded: 102.
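The percentage columns in these results follow directly from the raw counts; a quick sketch using the Question 1 counts (n = 102):

```python
from collections import Counter

# Reproduce the Question 1 percentages from the raw response counts.
counts = Counter({
    "not difficult": 52,
    "somewhat difficult": 38,
    "moderately difficult": 10,
    "difficult": 2,
    "very difficult": 0,
})

n = sum(counts.values())  # 102 respondents
percentages = {k: round(100 * v / n, 1) for k, v in counts.items()}
```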

Question 2: Was the READ Scale easy to apply?
Very easy to apply: 16 (15.7%); easy to apply: 39 (38.2%); moderately easy: 38 (37.3%); somewhat easy: 8 (7.8%); not easy: 1 (1.0%). Skipped question: 0. Number responded: 102.

Question 3: Please rank the level of perceived "added value" the READ Scale placed on statistics gathering for reference transactions.
Extreme value added: 7 (6.9%); high value added: 46 (45.5%); moderate value added: 35 (34.7%); minimal value added: 9 (8.9%); no value added: 4 (4.0%). Skipped question: 1 (0.99%). Number responded: 101.

Question 4: Did you have difficulty in deciding between ratings? If so, check all that apply.
Between 1 and 2: 12 (7.6%); between 2 and 3: 32 (20.4%); between 3 and 4: 46 (29.3%); between 4 and 5: 31 (19.7%); between 5 and 6: 15 (9.6%); no difficulty: 21 (13.4%).

Question 5: How did you feel about evaluating your own efforts?
Extremely comfortable: 12 (11.9%); very: 50 (49.5%); moderately: 35 (34.7%); minimally: 4 (4.0%); not comfortable: 0 (0%). Skipped question: 1 (0.99%). Number responded: 101.

Question 6: Would you recommend the READ Scale to another reference librarian?
Yes: 68 (67.3%); no: 12 (11.9%); yes, but with some modifications: 21 (20.8%). Skipped question: 1. Number responded: 101.

Question 7: Would you like to see this scale adopted for use in your library?
Yes: 50 (50.5%); no: 18 (18.2%); yes, but with some modifications: 31 (31.3%). Skipped question: 3 (2.97%). Number responded: 99.

Question 8 asked participants what they liked about the Scale. The likes were coded into the six most common recurrences: Effort / Value (17); Approach to Evaluation (13); Types / Levels (9); Time (5); Staffing Levels (6); Reporting to Administration (5).

Likes: "It gave me a quick visible check of my recent efforts. This made my desk work more rewarding, since I sometimes feel like I do so many 1s and 2s, but I could see that I was actually doing a higher level of reference than I realized. It added value to the statistics - literally."

Question 9: Participants were also asked to list their dislikes. These were coded into the most common recurrences: Difficult to Apply / Subjectivity (19); Types / Levels (16); Approach to Evaluating (9); Knowledge of Staff (6); Effort / Value (4).

Dislikes: "At times it was difficult to rate effort."

Question 10: Do you have any modifications you would like to suggest to improve the READ Scale?
No: 67 (72.8%); yes (please describe): 25 (27.2%). Total respondents: 92 (90.1%). Skipped question: 10 (9.9%).

Suggested modifications were put into the following categories: Delivery Method / READ Scale Appearance (9); Time Element (5); Skill Level Element (4); Clarity of Categories (4); Discussion Component (2); Comments / Observations (2).

Modifications: "Perhaps, clarify who is answering the questions" (skill level). "Collect length of time as another way to gauge level of difficulty"

Question 11: Were there any changes in your personal approach to reference service while you were using the READ Scale?
No: 88 (89.8%); yes (please describe): 10 (10.2%). Total respondents: 98 (96.0%). Skipped question: 4 (3.9%).

Personal approaches: "More likely to think about the level of service being provided." "I gave more conscious thought to the processes or steps involved in order to rate each interaction."

Survey Results - Overall:
No difficulty using the Scale; easy to apply.
Perceived added value to reference statistics ranked 'high'.
Staff were comfortable rating their own efforts.
68% would recommend the Scale as is; another 20% with modifications.
50% would adopt it as is; an additional 31% with modifications.
The low percentage of changes in approach (10%) reinforces ease of use and local adaptability.

READ Scale - What Works:
A local approach to using the READ Scale.
Pre-testing with common questions.
Easy to use.
Adds value to data gathering.
Adds value to work and satisfaction.
Records previously unrecorded effort, knowledge, and skills (service point and off-desk).

READ Scale - Preliminary Data:
Comparisons per service point (READ Scale).
Comparisons off-desk (READ Scale).
Approach type, service points.
Approach type, off-desk.

Three Week Data, Service Points

Three Week Data, Off-Desk

Full Semester Participants Data, Service Points

Full Semester Participants Data, Off-Desk


READ Scale Practical Applications: Training / Continuing Education; Renewed Personal & Professional Interest; Outreach; Reporting / Statistics.

READ Scale Practical Applications - Training / Continuing Education: "I felt it was very useful because it challenged me to come up higher in those areas where I need improvement in certain concentrations like ____ which is not my specialty. I need to learn so much more."

READ Scale Practical Applications - Training / Continuing Education:
New staff: develop a training regimen with outcomes, using a similar series of questions to be given at a later date to ensure that staff are developing the necessary skills and knowledge.
Continuous learning: write down any questions that elicit a category of 4 or higher at the service point, then share them with colleagues, thereby sharing strategies and learning from others.
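The continuous-learning idea above amounts to filtering the transaction log for high-effort questions. A sketch, with an invented log format:

```python
# Pull out logged questions assigned READ category 4 or higher so
# they can be shared and discussed with colleagues. The entries
# below are invented examples.
transactions = [
    ("Where is the restroom?", 1),
    ("Help citing a source in APA style", 3),
    ("Locate historical census data for a thesis", 5),
    ("Set up a cited-reference search across databases", 4),
]

to_share = [(q, level) for q, level in transactions if level >= 4]
```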

READ Scale Practical Applications - Renewed Personal & Professional Interest: "Using the READ Scale added to my sense of accomplishment!"

Renewed Personal & Professional Interest - Self-Assessment / Reference as Activity: Reference staff can rate their effort, knowledge, and skills as appropriate. This gives recognition to their primary function and can be compared with other libraries and librarians using the Scale. It acknowledges the two activities most important to reference staff in terms of job satisfaction (Gerlich): helping users and detective work.

READ Scale Practical Applications - Outreach: "It gives ME a tangible scale on which to rate my efforts, ultimately spurring me to strive for better service."

Outreach - Recording Liaison Activity: Off-desk statistics are often not recorded or, if they are, given the same hash mark as a directional question. Using the READ Scale in these cases would showcase the subject-specialization knowledge and needs of the campus. In cases where off-desk statistics are low, or READ Scale assignments fall in the low-end range, outreach activity in that particular area could be re-examined, surveys taken, etc., and services redesigned as needed.

READ Scale Practical Applications - Research / Statistics: "An assessment tool that does a better job of reflecting how reference librarians spend their time. It gives more value than tick marks on a page. It's a tool we can use with administrators to show what we really do."

Research / Statistics:
Staffing strategies: who staffs the desk, and when.
Develop narrative statistics: record hidden work.
Time: estimated or actual real-time statistics for effort spent working with patrons.
Comparisons with like institutions that use the Scale.
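As a sketch of the staffing-strategy idea, averaging READ level per hour of the day can suggest when subject specialists are most needed at the desk. The (hour, level) log entries here are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Average READ category per hour of the day from a transaction log.
log = [(10, 2), (10, 1), (11, 4), (11, 5), (14, 3), (14, 4), (16, 1)]

by_hour = defaultdict(list)
for hour, level in log:
    by_hour[hour].append(level)

# Hours with higher averages are candidates for librarian coverage;
# lower averages may be handled by student workers.
avg_effort = {hour: mean(levels) for hour, levels in by_hour.items()}
```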

For Further Exploration:
Discussion must be part of implementation and continued use; integrate it with the Scale.
Categories should include clearer examples that emphasize effort and the teachable moment.
Consider an alternate, easier-to-read format (bulleted or rubric form).
Make the time element more prominent and/or record it for sample periods.
Draw clearer distinctions between categories, or consider fewer categories.
Consider ways to identify levels of skill between staff at service points.

Next Steps:
Consider modifications based on feedback.
Publish articles.
License under Creative Commons.
Invite other libraries to use the Scale.
Continue data gathering.
Create a communication tool for users and for continued development of the Scale.

Questions? Thank you! If you are interested in trying the READ Scale, please contact us: Lynn Berard, Bella Gerlich.