Institutional Repositories
The invisible & unwritten rules of Project Management in ETD Collection Building
PM Life Cycle according to the textbook
- Feasibility study
- Business analysis
- Planning, designing & resourcing
- Execution
- Implementation
- Close-out
PM: the real-world perspective
- Core elements
  - Scope
  - Time
  - Cost
  - Quality
- Means to achieve
  - Integration
  - People
  - Communication
  - Risk
Project Risk Management: risk areas
- Human resources management
- QA
- Collections
- Equipment & IT
- Internet access
- Scheduling & rate of production
Project Management and people
- The textbook
  - Staff acquisition
  - Team development
- What happens
  - Urgent vs important
  - University calendar
  - Training & retraining
Quality Assurance
- Metadata
  - Developing guidelines
  - Developing subject thesauri
  - Training
  - Peer review
- Digital files
  - Agreeing an accepted standard
  - Random sampling of pages, graphs & appendices (see the sampling sketch below)
  - Integrity of the digital file against the print version
  - Front & content pages
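The random-sampling check can be made reproducible with a trivial script. A minimal sketch, assuming the QA reviewer only needs a repeatable list of page numbers to compare against the print copy; the page count, sample size and seed below are illustrative, not part of the ReRR workflow.

```python
import random

def qa_sample(total_pages, sample_size=10, seed=None):
    """Return a sorted, reproducible random sample of page numbers to check."""
    rng = random.Random(seed)  # a fixed seed lets a peer reviewer re-draw the same pages
    sample_size = min(sample_size, total_pages)
    return sorted(rng.sample(range(1, total_pages + 1), sample_size))

# Illustrative 214-page thesis: always check the front & contents pages, plus 10 random ones
pages_to_check = sorted(set([1, 2] + qa_sample(214, sample_size=10, seed=42)))
print(pages_to_check)
```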
Collections
- Agree the scope of the collection & timelines
- Integrate workflow across:
  - Student & supervisor
  - Dean & Faculty
  - Registrar & bureaucracy
  - Library & internal processes
Internet access
- Webpage development
- Skills & training
- Dependence on NRF
- Protection of content
  - Managed access
  - Copyright issues
Rate of production
- Project host (NRF/CHELSA) expectations
- Unrealistic target setting
- Misjudged scheduling
- Metadata production dependent on digital file production (Greenstone; DSpace?)
- Impact on staff morale
- Impact on project progress
Monitoring production
- Spreadsheets to monitor
- Set targets (realistic ones)
- Individuals' weekly totals (a minimal tracking sketch follows below)
- Weekly production meetings
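The spreadsheet approach can be mirrored in a few lines of code. A minimal sketch, assuming a simple weekly log of (team member, items completed); the names and targets are hypothetical, not ReRR figures.

```python
from collections import defaultdict

# Hypothetical weekly production log: (team member, items completed)
weekly_log = [
    ("cataloguer_1", 18),
    ("cataloguer_2", 12),
    ("scanner_1", 25),
    ("cataloguer_1", 4),   # a second batch logged later in the week
]

# Hypothetical realistic weekly targets agreed at the production meeting
targets = {"cataloguer_1": 20, "cataloguer_2": 15, "scanner_1": 25}

totals = defaultdict(int)
for person, count in weekly_log:
    totals[person] += count

for person, target in targets.items():
    done = totals[person]
    status = "on target" if done >= target else f"{target - done} behind"
    print(f"{person}: {done}/{target} ({status})")
```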
Lessons learned: the 6 phases of a project
- Enthusiasm
- Disillusionment
- Panic
- Search for the guilty
- Punishment of the innocent
- Praise & honours for the non-participants
Lessons learned on the ground
- Allocate time for initial team training
- Digitisation projects are about managing production lines
- Pressured environment driven by budgets & timelines
- Direct correlation between high output and team size
- Anticipate varying capacity levels across the team
- Not to be bolted onto the existing workload of staff members
In conclusion: challenges
- Team development is time-consuming & demanding
- Role boundaries must be clear
- Anticipate problems
- Manage change
- Don’t become dependent on 2–3 key people
Institutional Repositories
Evaluating the impact of the ETD collection in the institution
Current dilemma
- IRs are innovative but often marginalised technologies
- Difficult to demonstrate impact on the research enterprise of the university
- University Administrators doubt the “institutional good” without demonstrable evidence
- No consensus on an agreed set of Performance Indicators (PIs) or metrics
ReRR: launched for the right reasons in 2006
- Enhanced visibility of research outputs
- Increased dissemination of institutional scholarship
- Preservation & long-term access to institutional scholarship
- Opportunity to educate faculty & researchers about copyright and open access publishing
ReRR – an innovation caught between a rock…
- Within RU, viewed as a library activity & resource
- Reliance on quantitative PIs to demonstrate benefit
- Open access publishing is not without costs
- Few libraries have dedicated budgets for IR operating costs
…and a hard place
- High-usage evidence suggests IRs are part of the “research good” – not recognised by University Administrators
- Research policy & decision makers unconvinced that the IR is a strategic research tool
- Low awareness among Faculty & few incentives to use open access publishing
- Reliance on non-strategic PIs to evaluate
Quantitative PIs used to evaluate IR services
- Gross number of items, as well as retrieval measured by hits & downloads
- Levels of active community engagement (gappy vs continuous)
- Time-increment measures vs one-time-only counts
- Content material types (proportion of pre-print or post-print items)
(a minimal counting sketch follows below)
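Most of these quantitative PIs reduce to simple counting over a repository export. A minimal sketch, assuming a hypothetical list of records with a deposit date, material type and download count; none of the figures are real ReRR data.

```python
from collections import Counter
from datetime import date

# Hypothetical repository export: (deposit date, material type, downloads)
records = [
    (date(2006, 3, 1), "pre-print", 140),
    (date(2006, 9, 15), "post-print", 310),
    (date(2007, 2, 20), "post-print", 95),
]

gross_items = len(records)                       # one-time-only count
total_downloads = sum(d for _, _, d in records)  # retrieval measured by downloads

# Time-increment measure: deposits per year rather than a single cumulative figure
per_year = Counter(deposited.year for deposited, _, _ in records)

# Content material types: proportion of pre-print vs post-print items
types = Counter(material for _, material, _ in records)
preprint_share = types["pre-print"] / gross_items

print(gross_items, total_downloads, dict(per_year), f"{preprint_share:.0%} pre-print")
```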
Qualitative PIs used to evaluate impact
- Fit between the IR, organisational infrastructure (policy, culture, goals) & technical infrastructure
- Levels of flexibility & interoperability (end-user, system & services)
- Non-use by the research community to deposit content
- Quality & extent of participation levels, i.e. content building & usage
Assessing the “fit” between innovation & institution
- The continuum consists of inputs, outputs & outcomes (content, services & systems)
- Gather evidence in respect of workflow efficiencies associated with the innovation (quantitative)
- Gather evidence demonstrating the extent to which the IR has an effect on the individual/collective research community (qualitative)
IR evaluation: distinguish between your goals
- Significant PIs – mainly qualitative measures that gather evidence of impact on the research enterprise at end-user, institutional & national level
- Secondary-level PIs – mainly quantitative measures that demonstrate efficiencies & effectiveness
ReRR – Performance Indicator framework

                                   | Inputs (Content) | Outputs (Services & Systems) | Impact (Benefit)
End-user                           | quantitative     | More quantitative            | qualitative
Research community: institutional  | quantitative     | More quantitative            | qualitative
Research community: national       | quantitative     | More quantitative            | qualitative
Significant PIs should show University Administrators…
- Whether the IR is working according to plan
- Can the IR work better, and for what purpose?
- Are there lessons from current initiatives that need to be heeded?
- What has emerged as important?
- Is there a significant impact on the individual or collective research community?
- What is its potential to strengthen, improve & raise the visibility of the individual or collective research community?