Slide 1: Institutional Repositories
The invisible & unwritten rules of Project Management in ETD Collection Building
Slide 2: PM Life Cycle according to the textbook
- Feasibility study
- Business analysis
- Planning, designing & resourcing
- Execution
- Implementation
- Close out
Slide 3: PM: the real-world perspective
Core elements:
- Scope
- Time
- Cost
- Quality
Means to achieve:
- Integration
- People
- Communication
- Risk
Slide 4: Project Risk Management: risk areas
- Human Resources management
- QA
- Collections
- Equipment & IT
- Internet access
- Scheduling & rate of production
Slide 5: Project Management and people
Textbook:
- Staff acquisition
- Team development
What happens:
- Urgent vs important
- University calendar
- Training & retraining
Slide 6: Quality Assurance
Metadata:
- Developing guidelines
- Developing subject thesauri
- Training
- Peer review
Digital files:
- Agreeing an accepted standard
- Random sampling of pages, graphs, appendices
- Integrity of digital file with print version
- Front & content pages
Slide 7: Collections
- Agree scope of collection & timelines
- Integrate workflow across:
  - student & supervisor
  - Dean & Faculty
  - Registrar & bureaucracy
  - Library & internal processes
Slide 8: Internet access
- Webpage development
- Skills & training
- Dependence on NRF
- Protection of content
- Managed access
- Copyright issues
Slide 9: Rate of production
- Project host (NRF/CHELSA) expectations
- Unrealistic target setting
- Misjudged scheduling
- Metadata production dependent on digital file production (Greenstone; DSpace?)
- Impact on staff morale
- Impact on project progress
Slide 10: Monitoring production
- Spreadsheets to monitor (a sketch of the tally logic follows below):
  - set targets (realistic ones)
  - individuals' weekly totals
- Weekly production meetings
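A minimal sketch, in Python, of the weekly-tally logic behind such a monitoring spreadsheet. The staff names, the target figure and the sample log are illustrative assumptions, not data from the project.

```python
from collections import defaultdict

WEEKLY_TARGET = 25  # hypothetical "realistic" target per person per week

# Each record: (staff member, ISO week number, items completed that day).
# Sample data invented for illustration.
production_log = [
    ("Thandi", 14, 6), ("Thandi", 14, 5), ("Thandi", 14, 7),
    ("Sipho", 14, 4), ("Sipho", 14, 9), ("Sipho", 14, 3),
]

# Roll daily entries up into per-person weekly totals.
weekly_totals = defaultdict(int)
for person, week, items in production_log:
    weekly_totals[(person, week)] += items

# Flag shortfalls ahead of the weekly production meeting.
for (person, week), total in sorted(weekly_totals.items()):
    status = "on target" if total >= WEEKLY_TARGET else "below target"
    print(f"week {week}: {person} produced {total} items ({status})")
```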
Slide 11: Lessons learned: the 6 phases of a project
1. Enthusiasm
2. Disillusionment
3. Panic
4. Search for the guilty
5. Punishment of the innocent
6. Praise & honours for the non-participants
Slide 12: Lessons learned on the ground
- Allocate time for initial team training
- Digitisation projects are about managing production lines
- Pressured environment driven by budgets & timelines
- Direct correlation between high output and team size
- Anticipate varying capacity levels across the team
- The project should not be bolted onto the existing workload of staff members
Slide 13: In conclusion
Challenges:
- Team development is time-consuming & demanding
- Role boundaries must be clear
- Anticipate problems
- Manage change
- Don't become dependent on 2-3 key people
Slide 14: Institutional Repositories
Evaluating the impact of the ETD collection in the institution
Slide 15: Current dilemma
- IRs are innovative but often marginalised technologies
- Difficult to demonstrate impact on the research enterprise of the university
- University Administrators doubt the "institutional good" without demonstrable evidence
- No consensus on an agreed set of Performance Indicators (PIs) or metrics
Slide 16: ReRR launched for the right reasons in 2006
- Enhanced visibility of research outputs
- Increased dissemination of institutional scholarship
- Preservation & long-term access to institutional scholarship
- Opportunity to educate faculty & researchers about copyright and open access publishing
Slide 17: ReRR – innovation caught between a rock…
- Within RU viewed as a library activity & resource
- Reliance on quantitative PIs to demonstrate benefit
- Open access publishing is not without costs
- Few libraries have dedicated budgets for IR operating costs
Slide 18: …and a hard place
- High usage evidence suggests IRs are part of the "research good", yet this is not recognised by University Administrators
- Research policy & decision makers unconvinced that the IR is a strategic research tool
- Low awareness among Faculty & few incentives to use open access publishing
- Reliance on non-strategic PIs to evaluate
Slide 19: Quantitative PIs used to evaluate IR services
- Gross number of items, as well as retrieval measured by hits & downloads (see the sketch below)
- Levels of active community engagement (gappy vs continuous)
- Time-increment measures vs one-time-only counts
- Content material types (proportion of pre-print or post-print items)
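The sketch below computes these quantitative PIs over a hypothetical list of repository items; the field names, identifiers and figures are assumptions made for illustration, not ReRR data.

```python
from collections import Counter

# Hypothetical repository items; "downloads" is a per-year tally so that
# time-increment measures can be contrasted with one-time-only counts.
items = [
    {"id": "etd-001", "type": "post-print", "downloads": {"2006": 40, "2007": 75}},
    {"id": "etd-002", "type": "pre-print", "downloads": {"2007": 12}},
    {"id": "etd-003", "type": "post-print", "downloads": {"2006": 5, "2007": 8}},
]

# Gross number of items (a one-time-only count).
print("gross items:", len(items))

# Time-increment measure: downloads per year rather than a single total.
per_year = Counter()
for item in items:
    per_year.update(item["downloads"])
for year in sorted(per_year):
    print(f"downloads in {year}: {per_year[year]}")

# Content material types: proportion of pre-print vs post-print items.
type_counts = Counter(item["type"] for item in items)
for material, count in type_counts.items():
    print(f"{material}: {count / len(items):.0%}")
```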
Slide 20: Qualitative PIs used to evaluate impact
- Fit between the IR, organisational infrastructure (policy, culture, goals) & technical infrastructure
- Levels of flexibility & interoperability (end-user, system & services)
- Non-use by the research community, i.e. failure to deposit content
- Quality & extent of participation levels, i.e. content building & usage
Slide 21: Assessing the "fit" between innovation & institution
- The continuum consists of inputs, outputs & outcomes (content, services & systems)
- Gather evidence in respect of workflow efficiencies associated with the innovation (quantitative)
- Gather evidence that demonstrates the extent to which the IR has an effect on the individual/collective research community (qualitative)
Slide 22: IR evaluation: distinguish between your goals
- Significant PIs: mainly qualitative measures to gather evidence of impact on the research enterprise at end-user, institutional & national level
- Secondary-level PIs: mainly quantitative measures that demonstrate efficiencies & effectiveness
Slide 23: ReRR – Performance Indicator framework

|                                   | Inputs (Content) | Outputs (Services & Systems) | Impact (Benefit) |
|-----------------------------------|------------------|------------------------------|------------------|
| End-user                          | quantitative     | More quantitative            | qualitative      |
| Research community: institutional | quantitative     | More quantitative            | qualitative      |
| Research community: national      | quantitative     | More quantitative            | qualitative      |
Slide 24: Significant PIs should show University Administrators…
- Whether the IR is working according to plan
- Whether the IR can work better, and for what purpose
- Whether there are lessons from current initiatives that need to be heeded
- What has emerged as important
- Whether there is a significant impact on the individual or collective research community
- Its potential to strengthen, improve & raise the visibility of the individual or collective research community