RACER Evaluation
Toni Olshen
OCUL VDX Project Meeting, February 25, 2004
Evaluating Success
- Evaluating two projects, Scholars Portal and RACER, from both user and staff points of view
- Using a mix of quantitative and qualitative tools for a richer assessment
- Are OII projects improving research services and helping to better utilize OCUL collections?
- Do Scholars Portal and RACER meet OCUL user and staff expectations?
What do we need to learn?
- Do researchers use the service?
- Who are they?
- Why do they use the service?
- What are their views on the quality of the service provided?
- How do Scholars Portal and RACER impact research, teaching and learning?
Key Questions
- What are the factors that have made the implementation of these projects more successful in some institutions than in others?
- What are the areas that can be identified for improvement?
Evaluation Report – January 2005
The report of the assessments will provide evidence to support continued funding from individual institutions as well as indicate where funds should be directed to meet ongoing OII objectives.
Usage Statistics and Surveys – Quantitative Data
For the RACER project, see scholarsportal/vdx/support/project-info/surveys/
Qualitative Data
We need to add qualitative information gained from:
- Internal customers (staff)
- External customers (faculty, graduate and undergraduate students)
utilizing high-touch techniques such as focus groups and point-of-use tools.
Data Gathering Techniques – RACER Evaluation
Focus groups – staff, graduate students, undergraduates, faculty
- Complex questions can be explored
- More in-depth responses are obtained
- Group interactions generate information that is not otherwise obtainable
Data Gathering Techniques – RACER Evaluation
Spot comment cards for delivered materials
- Feedback on individual transactions
- Measure the customer's reaction immediately following the transaction
- Identify problem areas and plan for improvement
Data Gathering Techniques – RACER Evaluation
Pop-up web surveys – External Customers
- Delivered at point of service
- Data collected with little effort and low cost
- Large samples do not cost more than smaller ones
- Data can be collected at specific times
Data Gathering Techniques – RACER Evaluation
Surveys – Internal Customers
- Staff are used to using them
- Reach a geographically diverse group
- Fast and cheap
- Likelihood of a prompt response is high
Next Steps?
We need to hear your ideas.