Cross-institutional Repository Assessment: A Standardized Model for Institutional Research Assessment
Robert H. McDonald, Indiana University
Charles Thomas, Institute for Museum and Library Services
Outline
I. Introduction
II. The Need to Measure & Compare
III. New Candidate Frameworks
IV. Representing New Metrics
V. Future Evolution of Institutional Repositories and Evaluation
Institutional Assessment Needs
"We have found a remarkable shortage of clear, accessible information about crucial aspects of American colleges and universities...this lack of useful data and accountability hinders policymakers and the public...and prevents higher education from demonstrating its contribution to the public good."
A Test of Leadership: Charting the Future of U.S. Higher Education (2006) – U.S. Department of Education
Institutional Repositories: A Silver Bullet?
"An institutional repository concentrates the intellectual product created by a university's researchers, making it easier to demonstrate its scientific, social and financial value. Thus, institutional repositories complement existing metrics for gauging productivity and prestige...this demonstration of value can translate into tangible benefits, including the funding...that derives in part from an institution's status and reputation."
Raym Crow (2002). The Case for Institutional Repositories.
Repositories Vary
– In what they contain
– Who funds and administers each
– The underlying legal, social, and policy infrastructure for each repository
– Who contributes to the repository
– Motivations for contributing, whether mandates, disciplinary cultural norms, or other incentives
Current Repository Categories
– Institutional
– Disciplinary
– Other (preservation, publishers, etc.)
How do you tell the difference? How do you know who contributes what to which?
Need to Evaluate IRs
How do we evaluate IRs? Institutional, disciplinary, and other repositories exist for different purposes, and probably need different evaluative frameworks.
Need to Evaluate IRs
We can't just measure our IR as a stand-alone phenomenon; we need to be able to compare IRs. We also need to evaluate IRs for their utility in overall institutional assessment.
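Comparing IRs presupposes at least one shared quantitative baseline. As a hedged illustration (not part of the original slides), the sketch below counts records over OAI-PMH, the harvesting protocol most repository platforms already expose; the endpoint URLs are hypothetical.

```python
# Minimal sketch: get a comparable record count from any repository
# that exposes a standard OAI-PMH endpoint. The example URLs below
# are hypothetical stand-ins for real IR endpoints.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"

def record_count(base_url: str) -> int:
    """Count records via ListIdentifiers, preferring the optional
    completeListSize attribute when the server reports it."""
    url = base_url + "?verb=ListIdentifiers&metadataPrefix=oai_dc"
    count = 0
    while url:
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        count += sum(1 for _ in root.iter(OAI + "header"))
        token = root.find(f"{OAI}ListIdentifiers/{OAI}resumptionToken")
        if token is None:
            break
        if token.get("completeListSize"):
            return int(token.get("completeListSize"))
        if token.text:
            url = (base_url + "?verb=ListIdentifiers&resumptionToken="
                   + urllib.parse.quote(token.text, safe=""))
        else:
            break
    return count

# Hypothetical endpoints for two IRs being compared:
# for ir in ("https://ir1.example.edu/oai", "https://ir2.example.edu/oai"):
#     print(ir, record_count(ir))
```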
Library Assessment Needs
20th Century vs. 21st Century:
– Moving Beyond Silos of Knowledge
– Facilities are not an adequate measuring stick
– Qualitative and Quantitative Measurement Principles are Required
Frameworks for IR Evaluation
Proudman, V. (2008). The population of repositories.
– Policies
– Organization
– Mechanisms and influences for populating repositories
– Services
– Advocacy & communication
– Legal issues
Frameworks for IR Evaluation
Westell, M. (2006). Institutional repositories: Proposed indicators of success.
– Repository mandate
– Integration with institutional planning
– Funding model
– Relationship with digitization centers
– Interoperation
– Content measurement
– Promotion
– Preservation strategy
Frameworks for IR Evaluation
Kim, H. H. and Kim, Y. H. (2007). An evaluation model for the national consortium of institutional repositories of Korean universities.
– Content (Diversity, Currency, Size, Metadata)
– System and network (Interoperability, Use of help services such as FAQ and Q&A)
– Use, users and submitters (Use ratio, User satisfaction, Submitter satisfaction, User/Submitter support)
– Management and policy (Budget, Staffing, Library awareness of Open Access and related issues, Copyright management, IR Marketing, Institutional support, Policies and procedures in place, Diversity of archiving methods)
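A multi-category model like this lends itself to a weighted composite score. Below is a minimal sketch under assumed equal category weights and illustrative 0–1 sub-scores; neither the weights nor the numbers come from the Kim and Kim paper.

```python
# Minimal sketch of a composite IR score over the four categories
# above. The category weights and 0-1 sub-scores are hypothetical
# illustrations, not values from the paper.

CATEGORY_WEIGHTS = {              # assumed equal weighting
    "content": 0.25,
    "system_and_network": 0.25,
    "use_users_submitters": 0.25,
    "management_and_policy": 0.25,
}

def composite_score(scores: dict[str, list[float]]) -> float:
    """Average each category's 0-1 sub-criterion scores, then
    combine the category averages with the weights above."""
    total = 0.0
    for category, weight in CATEGORY_WEIGHTS.items():
        subs = scores[category]
        total += weight * (sum(subs) / len(subs))
    return total

example = {
    "content": [0.8, 0.6, 0.7, 0.9],       # diversity, currency, size, metadata
    "system_and_network": [0.9, 0.5],      # interoperability, help services
    "use_users_submitters": [0.4, 0.7, 0.6, 0.8],
    "management_and_policy": [0.5, 0.6, 0.9, 0.7, 0.3, 0.8, 0.6, 0.5],
}
print(f"Composite score: {composite_score(example):.2f}")
```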
What Are We Seeing?
– Lots of Case Studies
– Many Qualitative Evaluative Criteria
– Tips and Best Practices for Good Repositories
– Not Much Quantitative Data (Warning: Administrators Love Numbers!)
Library Assessment Needs
"Key aspects of collaborative relations may be described only in qualitative terms in the future."*
– Cross-Institutional Shared Digital Collections
– Intra-Institutional IR Collection Building
  - IR Assessment
  - Institutional Research Assessment
*From Reshaping ARL Statistics to Capture the New Environment (2008) – Kyrillidou
So How Do We Mix Qualitative/Quantitative?
The Color Palette Metaphor
Absence of color = absence of the foundations for success: an indication of an early-forming or orphan IR.
The Color Palette Metaphor
White = the maximum combination of the entire spectrum = an ideal IR with a full suite of necessary support.
The Color Palette Metaphor
Shades of gray or other color attributes indicate a rising IR.
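Read additively, the metaphor maps directly onto RGB color mixing: all channels at zero give black (absence of color), all at full give white. The sketch below assumes three illustrative score groups (content, services, support) as the channels; that grouping is hypothetical, not from the slides.

```python
# Minimal sketch of the color-palette metaphor: map three assumed
# groups of normalized assessment scores onto RGB channels. All
# scores at zero yield black (an early-forming or orphan IR); all
# at 1.0 yield white (the ideal, fully supported IR); partial
# scores yield grays or tints showing which foundations are weak.

def palette_color(content: float, services: float, support: float) -> str:
    """Convert three 0-1 scores into a hex RGB color string."""
    channels = (content, services, support)
    return "#" + "".join(
        f"{round(255 * max(0.0, min(1.0, c))):02x}" for c in channels
    )

print(palette_color(0.0, 0.0, 0.0))  # #000000 -> orphan IR
print(palette_color(1.0, 1.0, 1.0))  # #ffffff -> ideal IR
print(palette_color(0.5, 0.5, 0.5))  # #808080 -> a rising IR
print(palette_color(0.9, 0.3, 0.2))  # strong content, weak support
```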
Future Evolution
– Institutional Measurement
– Institutional Research: a role for Libraries
– Libraries as Publisher
[Figure: from EDUCAUSE Review 43(1)]
Administrative ERP Stack
Fusion or Data Mining
Where does the .EDU stack come together for analysis? Can the library play a role in this analysis? This is needed for both owned and leased assets.
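One concrete form such fusion could take is a join across two campus data silos. The sketch below is a hypothetical illustration: repository statistics merged with faculty-activity records to compute a deposit rate. All column names and figures are invented stand-ins for exports from an IR statistics tool and a faculty-reporting system.

```python
# Minimal sketch of "fusing" two .EDU data silos: repository
# download statistics joined to faculty-activity records. Both
# DataFrames and their columns are hypothetical.
import pandas as pd

ir_stats = pd.DataFrame({
    "author_id": ["a1", "a2", "a3"],
    "deposits": [12, 3, 7],
    "downloads": [4500, 800, 2100],
})

activity = pd.DataFrame({
    "author_id": ["a1", "a2", "a3"],
    "department": ["Physics", "History", "Biology"],
    "publications_reported": [15, 4, 9],
})

merged = ir_stats.merge(activity, on="author_id")
# A simple fused metric: what share of reported publications
# actually made it into the repository?
merged["deposit_rate"] = merged["deposits"] / merged["publications_reported"]
print(merged[["department", "deposit_rate", "downloads"]])
```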
Intra-Institutional Assessment
IRStats
Digital Measures – Activity Insight
U Penn Data Farm
Layers of Assessment/Comparison
– International Comparison
– National Comparison
– National Accreditation
– Regional Accreditation
– State and Regional Collaboration/Funding
– Internal Collaboration/Funding
Missing Link: IR Assessment
– Quantitative
– Qualitative
– Viable or Useful Mixed Visualizations
CONTACT INFORMATION
Robert H. McDonald
– mcdonald@sdsc.edu
– AIM: mcdonald@sdsc.edu
– Skype: rhmcdonald
Chuck Thomas
– chas.thomas@gmail.com