
1 Perpetual Beta: Early Literature About Institutional Repositories and What Assessment Can Tell Us Now
ALCTS Institutional Repository Series Seminar
Allison Sivak, Assessment Librarian
Leah Vanderjagt, Digital Repository Services Librarian
University of Alberta Libraries
May 19, 2010

2 The Promise of the IR Movement
“While institutional repositories centralize, preserve, and make accessible an institution’s intellectual capital, at the same time they will form part of a global system of distributed, interoperable repositories that provides the foundation for a new disaggregated model of scholarly publishing.” (Crow 2002)
We are not there yet, and we ask why.

3 Nature and Scope of Investigation
–How we approached this
–What we hope to offer you
–The limitations we acknowledge in this work

4 Critique of Formative Visions
“...many of the earlier conversations regarding IRs tended to focus on benefits to the host institutions, rather than the scholars who would use IRs.” (Choudhury 2008)
Markey et al. (2008) found that user needs assessment has held only low significance in decisions by college and university libraries to initiate IRs
“As Smith emphasizes […] ‘there is no consensus on what institutional repositories are for.’” (qtd. in Lercher 2008)

5 Why Assessment?
Value of user-centred service
Gap between library staff perceptions and users’ perceptions
–e.g., untested assumptions about what users need or want
Need for evidence-based decision making
–Resource allocation
–Justification to funders / parent institutions
Tracking trends for planning purposes

6 Where are we at, really, with IR assessment?
“one of the greatest problems that institutional repositories face, which is their general inability to provide sufficient statistical and assessment data to justify their existence” (McDonald and Thomas 2008)

7 One Measure to Rule Them All
The ubiquitous “how many”
Both outside and within IR assessment, there is broad questioning of how meaningful this count really is

8 Measures in the Literature
Content profile
–Volume of articles published (Björk 2009)
–Amount of IR content as a proportion of faculty / subject publishing
Content usage
–Individual article impact (Gedye 2009)
–Internal system measures (e.g., download rates for preprints vs. postprints)
–Usage statistics
–Time from publication to citation / impact (Joint 2009)
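
Several of these measures reduce to simple aggregations over deposit and download records. A minimal Python sketch of two of them, the content-profile coverage ratio and a preprint/postprint download comparison; the record fields ("version", "downloads") and the faculty publication count are hypothetical illustrations, not drawn from the presentation or any particular repository’s schema:

```python
from statistics import mean

# Hypothetical deposit records exported from an IR; field names are
# illustrative assumptions, not a real repository schema.
deposits = [
    {"id": "d1", "version": "preprint", "downloads": 120},
    {"id": "d2", "version": "postprint", "downloads": 340},
    {"id": "d3", "version": "postprint", "downloads": 95},
]

# External figure, e.g. from a faculty publication audit (assumed).
faculty_publication_count = 50

# Content profile: IR deposits as a share of known faculty output.
coverage = len(deposits) / faculty_publication_count
print(f"IR coverage of faculty publishing: {coverage:.0%}")

# Content usage: mean download rates for preprints vs. postprints.
for version in ("preprint", "postprint"):
    rates = [d["downloads"] for d in deposits if d["version"] == version]
    print(version, "mean downloads:", mean(rates) if rates else "n/a")
```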

9 User Engagement Measures
–Faculty attitudes
–Sustained engagement (Carr 2007)
–Program sustainability indicators (Choudhury 2008)
–User satisfaction / usability
–Business case (if the service were withdrawn, what would happen?)
Bonilla-Carrero’s (2008) “activity indicators”: co-authorship index, inter-centre collaborations, positioning of author name in the paper, time from deposit to first access, and number of distinct countries that cite or download each document
Together these provide a “demographic profile” of IR content properties and usage
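
Two of these activity indicators — time from deposit to first access, and the number of distinct countries downloading each document — are straightforward to derive from an access log. A minimal sketch, assuming a hypothetical log format of (document id, access date, country code); nothing here reflects an actual IR’s logging:

```python
from datetime import date

# Hypothetical access log entries: (document id, access date, country).
access_log = [
    ("doc1", date(2009, 3, 2), "CA"),
    ("doc1", date(2009, 3, 9), "DE"),
    ("doc2", date(2009, 5, 1), "CA"),
]
# Hypothetical deposit dates for the same documents.
deposit_dates = {"doc1": date(2009, 2, 20), "doc2": date(2009, 4, 28)}

for doc, deposited in deposit_dates.items():
    accesses = [(when, country)
                for d, when, country in access_log if d == doc]
    first_access = min(when for when, _ in accesses)
    countries = {country for _, country in accesses}
    print(doc,
          "| days to first access:", (first_access - deposited).days,
          "| distinct countries:", len(countries))
```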

10 One Metadata Schema to Rule Them All?
Metadata / interoperability
–Metadata helps IRs realize their collation, dissemination, and preservation promise
–To date: poor controlled vocabulary and authority control
–What are the metadata units of analysis?
Bruce and Hillmann’s (2004) quality indicators: completeness, accuracy, provenance, conformance to expectations, logical consistency and coherence, timeliness, and accessibility
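
Of these indicators, completeness is the most mechanical to operationalize: the share of expected metadata elements actually populated in a record. A minimal sketch; the Dublin Core element list and the sample record are illustrative assumptions, not Bruce and Hillmann’s own operationalization:

```python
# Expected Dublin Core elements for this (assumed) collection profile.
EXPECTED_ELEMENTS = ["title", "creator", "subject", "description",
                     "date", "type", "identifier", "rights"]

# A sample record with some elements missing.
record = {
    "title": "Perpetual Beta",
    "creator": "Sivak, A.; Vanderjagt, L.",
    "date": "2010-05-19",
    "identifier": "hdl:example/123",  # hypothetical handle
}

# Completeness: populated elements as a share of those expected.
populated = {e for e in EXPECTED_ELEMENTS if record.get(e)}
completeness = len(populated) / len(EXPECTED_ELEMENTS)
missing = sorted(set(EXPECTED_ELEMENTS) - populated)
print(f"completeness: {completeness:.0%}; missing elements: {missing}")
```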

11 What is the Unit of Analysis?
A major question that determines what we can assess and how we can assess it
–It limits what we can conclude: e.g., a count of articles in the IR cannot tell us how findable those articles are, improve usability, or reveal what users want from the IR

12 Simple Measures or Complex Systems?
“Rather than promote IRs as a cause and effect relationship, it may be preferable to think of them in a correlational manner.” (Choudhury 2008)
Kim & Kim’s (2008) diagram of interrelationships between factors in IRs
–Procedural assessment: the relationship between outputs and inputs

13 Procedural Assessment Factors (Kim & Kim 2008)
Content: metadata / metacontent, size, diversity, currency
Management & policy: budgeting, awareness of the IR, formal agreements, copyright, staffing, policies, marketing, archiving methods
System and network: interoperability, integration, dCollection homepage
Use, user, submitter: use rate, submitter satisfaction, support for users and submitters, user satisfaction

14 [Figure: Kim & Kim’s (2008) diagram of interrelationships between IR factors]

15 15 Kim & Kim, 2008

16 Tacit Knowledge
The real roles of IR managers (Zuccala et al. 2008)
Starting from a vision (high-level goals) and software (technological capacity), do IR managers find the practical literature most useful?
An analogy from education theory: practicing teachers vs. the ‘best practices’ stated in scholarship

17 Given All of This
If we are going to move forward with assessment for IRs, let us therefore be clear about two things regarding assessment in general:
–Assessment is not clearly objective
–Assessment is not value-neutral

18 Who’s asking?
Meaningful evaluation depends on who is asking
–Faculty, administrator, chief librarian, IR manager, end user?

19 Why are we asking?
–Institutional politics
–Funding
–Resource allocation
–Rankings

20 “These are early days”
This statement is made in various ways in reviews of assessment and in reports of results
It is important and valid, and should be considered as assessment plans are made
We may need to assert the fact of ‘early days’ to others as we come to an understanding of what assessment truly means for IRs

21 [Figure repeated: Kim & Kim’s (2008) diagram of interrelationships between IR factors]

22 “Developmentally Appropriate Measures”
From all these choices, which one? All may be valid; none are necessarily ‘wrong’
We propose:
A) Agreement on a common research agenda that acknowledges that IRs should be evaluated according to their developmental stage
B) Standards for assessment methods so that we can ready ourselves for future meta-analysis

23 Questions and Comments

