Institutional Repositories: The invisible & unwritten rules of Project Management in ETD Collection Building

Similar presentations
Planning Collaborative Spaces in Libraries

Capacity Building for Repositories Dr. Helena Asamoah-Hassan University Librarian, KNUST, Kumasi, Ghana at BioMed Open Access Africa Conference held at.
Partnering with Faculty / researchers to Enhance Scholarly Communication Caroline Mutwiri.
DARE: Digital Academic Repositories A new age in academic information provision in the Netherlands Henk Ellermann, DARE, 4/5 September 2003.
Creating Institutional Repositories Stephen Pinfield.
Opening access and closing the risk: delivering the mandate for e-theses deposit 10 th International Symposium on Electronic Theses and Dissertations Uppsala.
CURRENT ISSUES Current contents Over 3,000 items open access, 42% reports and working papers, 21% journal articles, 21% conference items, 7% book chapters,
Linking Repositories Scoping Study Key Perspectives Ltd University of Hull SHERPA University of Southampton.
Intelligence Step 5 - Capacity Analysis Capacity Analysis Without capacity, the most innovative and brilliant interventions will not be implemented, wont.
Near East Plant Protection Network for Regional Cooperation & Knowledge Sharing Food and Agriculture Organization of the United Nations An Overview on.
Open Access in Summary Amos Kujenga EIFL-FOSS National Coordinator, Zimbabwe Lupane State University, October 2013 Lesotho College.
Learning without Borders: Internationalizing the Gator Nation M. David Miller Director, Quality Enhancement Plan Timothy S. Brophy Director, Institutional.
IFAD Reform towards a better development effectiveness How can we all do better? Mohamed Béavogui Director, West and Central Africa January 2009.
SEM Planning Model.
Introduction to Implementing an Institutional Repository Delivered to Technical Services Staff Dr. John Archer Library University of Regina September 21,
Quality evaluation and improvement for Internal Audit
Challenge Questions How good is our operational management?
10.5 Report Performance The process of collecting and distributing performance information, including status reports, progress measurements and forecasts.
Standards and Guidelines for Quality Assurance in the European
Company LOGO Leading, Connecting, Transforming UNC… …Through Its People Human Capital Management.
The Middle States Commission on Higher Education is a voluntary, non-governmental, membership association that is dedicated to quality assurance and.
Project Human Resource Management
Managing a Training Program Why train? Who will attend the training? What are the learning objectives? Strategies? Coverage? How will the training program.
Human Resources Management Course Objectives The purpose of this course is to learn the Project Management Institute (PMI) processes required to make.
ISTEP: Technology Field Research in Developing Communities Instructor: M. Bernardine Dias CAs: Sarah Belousov and Ermine Teves Spring 2009.
From Evidence to Action: Addressing Challenges to Knowledge Translation in RHAs The Need to Know Team Meeting May 30, 2005.
Capacity Building Experiences of the UNESCO-IHE Institute for Water Education Maarten Blokland.
Impact assessment framework
5-7 November 2014 ADLSN - ADLC Practical Digital Content Management from Digital Libraries & Archives Perspective.
Commonwealth of Massachusetts Statewide Strategic IT Consolidation (ITC) Initiative ANF IT Consolidation Website Publishing / IA Working Group Kickoff.
Electronic Theses at Rhodes University presented by Irene Vermaak Rhodes University Library National ETD Project CHELSA Stakeholder Workshop 5 November.
Preserving Digital Collections for Future Scholarship Oya Y. Rieger Cornell University
CSI - Introduction General Understanding. What is ITSM and what is its Value? ITSM is a set of specialized organizational capabilities for providing value.
THE ROAD TO OPEN ACCESS A guide to the implementation of the Berlin Declaration Frederick J. Friend OSI Open Access Advocate JISC Consultant Honorary Director.
Towards a European network for digital preservation Ideas for a proposal Mariella Guercio, University of Urbino.
Assessment and Learning in Practice Settings (ALPS) © Planning a mobile learning project ALPS Conference March 2010 Robert Campbell.
Integrating Knowledge Translation and Exchange into a grant Maureen Dobbins, RN, PhD SON, January 14, 2013.
Evaluation and Impact of Entrepreneurial Education and Training Malcolm Maguire Transnational High Level Seminar on National Policies and Impact of Entrepreneurship.
HBCU LIBRARY LEADERSHIP INSTITUTE II PROJECT REPORT IMPLEMENTATION OF AN INSTITUTIONAL REPOSITORY AT FAYETTEVILLE STATE UNIVERSITY.
The Major Steps of a Public Health Evaluation 1. Engage Stakeholders 2. Describe the program 3. Focus on the evaluation design 4. Gather credible evidence.
Monitoring and Evaluation
Chapter 3 Strategic Information Systems Planning.
Caribbean Community Secretariat 2nd meeting of the Advisory Group on Statistics San Ignacio – Belize 25 June 2008 NSDS: FROM DESIGN TO IMPLEMENTATION.
Teaching at the University of Luxembourg: Organization, quality assurance and evaluation of student achievements
Project Management Learning Program 23 Mar – 3 Aprl 2009, Mekong Institute, Khon Kaen, Thailand Managing for Development Results Results-Oriented Monitoring.
Using OMB Section 508 reporting in addressing your agency's program maturity. How to Measure Your Agency's 508 Program.
Transforming Patient Experience: The essential guide
Consultant Advance Research Team. Outline UNDERSTANDING M&E DATA NEEDS PEOPLE, PARTNERSHIP AND PLANNING 1.Organizational structures with HIV M&E functions.
Developing a Framework In Support of a Community of Practice in ABI Jason Newberry, Research Director Tanya Darisi, Senior Researcher
Kathy Corbiere Service Delivery and Performance Commission
Catholic Charities Performance and Quality Improvement (PQI)
1 The project is financed from the European Union funds within the framework of Erasmus+, Key Action 2: Cooperation for innovation and the exchange of.
The CSO’s IT Strategy and the GSBPM IT Directors Group October 2010 Joe Treacy Central Statistics Office Ireland.
Launching the Dean digitally : the Jonathan Jansen Collection in UPSpace eIFL.net in co-operation with the Research Library Consortium Institutional repositories.
Capacity Development Results Framework A strategic and results-oriented approach to learning for capacity development.
Continual Service Improvement Methods & Techniques.
Being The Best We Can A self-evaluation & improvement process for libraries Key results for Victoria’s public library services.
Introduction to the quality system in MOHE Prof. Hala Salah Consultant in NQAAP.
Session 2: Developing a Comprehensive M&E Work Plan.
Presentation By L. M. Baird And Scottish Health Council Research & Public Involvement Knowledge Exchange Event 12 th March 2015.
IFLA: International Advocacy Programme. Address the information gap of library workers at community, national and regional levels Build capacity among.
IMPLEMENTING LEAPS IN CENTRAL AND EASTERN EUROPE: TRAINERS’ HANDBOOK Monitoring and Evaluating Results.
Evaluation What is evaluation?
Process and customizations
Gathering a credible evidence base
Accountability: an EU perspective
Quantifying the value of our libraries. Are our systems ready?
How to Design and Implement Research Outputs Repositories
A Focus on Outcomes and Impact
OPEN ACCESS POLICY Larshan Naicker Rhodes University Library
Presentation transcript:

Institutional Repositories: The invisible & unwritten rules of Project Management in ETD Collection Building

PM Life Cycle according to the textbook
- Feasibility study
- Business analysis
- Planning, designing & resourcing
- Execution
- Implementation
- Close out

PM – the real-world perspective
Core elements:
- Scope
- Time
- Cost
- Quality
Means to achieve:
- Integration
- People
- Communication
- Risk

Project Risk Management – risk areas
- Human Resources management
- QA
- Collections
- Equipment & IT
- Internet access
- Scheduling & rate of production

Project Management and people
Textbook:
- Staff acquisitions
- Team development
What happens:
- Urgent vs important
- University calendar
- Training & retraining

Quality Assurance
Metadata:
- Developing guidelines
- Developing subject thesauri
- Training
- Peer review
Digital files:
- Agreeing an accepted standard
- Random sampling of pages, graphs, appendices (see the sketch below)
- Integrity of digital file with print version
- Front & content pages
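
A minimal sketch of how the random-sampling check could be scripted, assuming each digitised thesis is a single PDF readable with the pypdf library; the file path and sample size are illustrative placeholders, not project settings.

```python
# Pick a random set of pages from a digitised thesis to compare against the
# print copy (text, graphs, appendices, front & content pages).
# Assumptions: one PDF per thesis, pypdf installed; the path and sample size
# below are placeholders for illustration only.
import random
from pypdf import PdfReader

def sample_pages_for_qa(pdf_path: str, sample_size: int = 5) -> list[int]:
    reader = PdfReader(pdf_path)
    total_pages = len(reader.pages)
    sample_size = min(sample_size, total_pages)
    # Return 1-based page numbers so they match the bound print volume.
    return sorted(random.sample(range(1, total_pages + 1), sample_size))

if __name__ == "__main__":
    pages = sample_pages_for_qa("etd/2007/example_thesis.pdf")  # hypothetical path
    print("Pages to check against the print copy:", pages)
```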

Collections
- Agree scope of collection & timelines
- Integrate workflow across:
  - student & supervisor
  - Dean & Faculty
  - Registrar & bureaucracy
  - Library & internal processes

Internet access
- Webpage development
- Skills & training
- Dependence on NRF
- Protection of content
- Managed access
- Copyright issues

Rate of production
- Project host (NRF/CHELSA) expectations
- Unrealistic target setting
- Misjudged scheduling
- Metadata production dependent on digital file production (Greenstone; DSpace?) – see the sketch below
- Impact on staff morale
- Impact on project progress
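
A back-of-the-envelope way to test whether a target schedule is realistic before committing to it; every figure below (collection size, pages per thesis, daily throughput) is a hypothetical placeholder, not data from the ReRR project.

```python
# Rough scheduling check: metadata work cannot start on a thesis until its
# digital file exists, so the slower of the two streams drives the finish
# date once the pipeline is full. All numbers are illustrative assumptions.
theses_to_digitise = 400
avg_pages_per_thesis = 150
scan_pages_per_day = 600        # one operator, one scanner
metadata_records_per_day = 8    # records a cataloguer can complete and QA

scan_days = theses_to_digitise * avg_pages_per_thesis / scan_pages_per_day
metadata_days = theses_to_digitise / metadata_records_per_day
bottleneck_days = max(scan_days, metadata_days)

print(f"Scanning stream:  {scan_days:.0f} working days")
print(f"Metadata stream:  {metadata_days:.0f} working days")
print(f"Earliest realistic finish: about {bottleneck_days:.0f} working days")
```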

Monitoring production
- Spreadsheets to monitor:
  - set targets (realistic ones)
  - individuals' weekly totals (see the sketch below)
- Weekly production meetings
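
A minimal sketch of the monitoring spreadsheet as a script, assuming the team logs completed items to a CSV with week, name and items columns; the file name, column names and target figure are illustrative assumptions.

```python
# Summarise individuals' weekly totals against a target, ready for the
# weekly production meeting. The CSV layout and target are assumed for
# illustration, not taken from the presentation.
import csv
from collections import defaultdict

WEEKLY_TARGET = 10  # hypothetical per-person target

def weekly_totals(csv_path: str) -> dict[tuple[str, str], int]:
    totals: dict[tuple[str, str], int] = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[(row["week"], row["name"])] += int(row["items"])
    return totals

if __name__ == "__main__":
    for (week, name), items in sorted(weekly_totals("production_log.csv").items()):
        status = "on target" if items >= WEEKLY_TARGET else "below target"
        print(f"{week}  {name:<15} {items:>3}  ({status})")
```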

Lessons learned – the 6 phases of a project
1. Enthusiasm
2. Disillusionment
3. Panic
4. Search for the guilty
5. Punishment of the innocent
6. Praise & honours for the non-participants

Lessons learned on the ground
- Allocate time for initial team training
- Digitisation projects are about managing production lines
- Pressured environment driven by budgets & timelines
- Direct correlation between high output and team size
- Anticipate varying capacity levels across the team
- Not to be bolted onto the existing workload of staff members

In conclusion – challenges
- Team development is time-consuming & demanding
- Role boundaries must be clear
- Anticipate problems
- Manage change
- Don't become dependent on 2-3 key people

Institutional Repositories: Evaluating the impact of the ETD collection in the institution

Current dilemma
- IRs are innovative but often marginalised technologies
- Difficult to demonstrate impact on the research enterprise of the university
- University Administrators doubt the "institutional good" without demonstrable evidence
- No consensus on an agreed set of Performance Indicators (PIs) or metrics

ReRR launched for the right reasons in 2006
- Enhanced visibility of research outputs
- Increased dissemination of institutional scholarship
- Preservation & long-term access to institutional scholarship
- Opportunity to educate faculty & researchers about copyright & open access publishing

ReRR – innovation caught between a rock…
- Within RU, viewed as a library activity & resource
- Reliance on quantitative PIs to demonstrate benefit
- Open access publishing is not without costs
- Few libraries have dedicated budgets for IR operating costs

…and a hard place
- High usage evidence suggests IRs are part of the "research good" – not recognised by University Administrators
- Research policy & decision makers unconvinced that the IR is a strategic research tool
- Low awareness among Faculty & few incentives to use open access publishing
- Reliance on non-strategic PIs to evaluate

Quantitative PIs used to evaluate IR services
- Gross number of items, as well as retrieval measured by hits & downloads (see the sketch below)
- Levels of active community engagement (gappy vs continuous)
- Time-increment measures vs one-time-only counts
- Content material types (proportion of pre-print or post-print items)
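
A minimal sketch of how these quantitative PIs could be computed, assuming an export of repository items as a CSV with item_id, material_type and downloads columns; the file name and column names are illustrative assumptions rather than the ReRR's actual reporting format.

```python
# Compute gross item count, total downloads and material-type proportions
# from an assumed CSV export of the repository; all field names are
# placeholders for illustration.
import csv
from collections import Counter

def quantitative_pis(csv_path: str) -> None:
    with open(csv_path, newline="") as f:
        items = list(csv.DictReader(f))
    total_items = len(items)
    total_downloads = sum(int(row["downloads"]) for row in items)
    types = Counter(row["material_type"] for row in items)

    print(f"Gross number of items: {total_items}")
    print(f"Total downloads:       {total_downloads}")
    for material, count in types.most_common():
        print(f"  {material:<12} {count / total_items:.0%} of collection")

if __name__ == "__main__":
    quantitative_pis("repository_items.csv")  # hypothetical export file
```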

Qualitative PIs used to evaluate impact
- Fit between the IR, organisational infrastructure (policy, culture, goals) & technical infrastructure
- Levels of flexibility & interoperability (end-user, system & services)
- Non-use by the research community to deposit content
- Quality & extent of participation levels, i.e. content building & usage

Assessing the "fit" between innovation & institution
- The continuum consists of inputs, outputs & outcomes (content, services & systems)
- Gather evidence in respect of workflow efficiencies associated with the innovation (quantitative)
- Gather evidence that demonstrates the extent to which the IR has an effect on the individual or collective research community (qualitative)

IR evaluation: distinguish between your goals
- Significant PIs – mainly qualitative measures to gather evidence of impact on the research enterprise at end-user, institutional & national level
- Secondary-level PIs – mainly quantitative measures that demonstrate efficiencies & effectiveness

ReRR – Performance Indicator framework

                                    Inputs (Content)    Outputs (Services & Systems)    Impact (Benefit)
End-user                            quantitative        more quantitative               qualitative
Research community: institutional   quantitative        more quantitative               qualitative
Research community: national        quantitative        more quantitative               qualitative

Significant PIs should show University Administrators…
- Whether the IR is working according to plan
- Can the IR work better, and for what purpose?
- Are there lessons from current initiatives that need to be heeded?
- What has emerged as important?
- Is there a significant impact on the individual or collective research community?
- What is its potential to strengthen, improve & raise the visibility of the individual or collective research community?