Repository Audit and Certification: DSA–WDS Partnership WG
RDA Working Groups Meeting at NIST, November 13–14, 2014
Working Group Members
– Lesley Rickards (UK, PSMSL, WDS-SC) [Co-chair]
– Mary Vardigan (USA, ICPSR, DSA Board) [Co-chair]
– Kevin Ashley (UK, Digital Curation Centre)
– Michael Diepenbroek (Germany, Pangaea, WDS-SC)
– Ingrid Dillo (The Netherlands, DANS, DSA Board)
– Françoise Genova (France, CDS, WDS-SC)
– Hervé L’Hours (UK, UK Data Archive, DSA Board)
– Guoqing Li (China, CEODE, WDS-SC)
– Jean-Bernard Minster (USA, UCSD, Chair of WDS Scientific Committee)
– Paul Trilsbeek (The Netherlands, MPI for Psycholinguistics, DSA Board)
– Eleni Panagou, Ph.D. Candidate in Web Engineering, Democritus University of Thrace, Greece [RDA Early Career Researcher]
Context and Background
– The Data Seal of Approval (DSA) and the World Data System (WDS) are both lightweight mechanisms for repository assessment
– DSA began in the social sciences and humanities, WDS in the natural and physical sciences, but both are expanding in scope
– Over the past two years, both groups began to see commonalities and synergies
– When the RDA Audit and Certification Interest Group was established, exploring a partnership seemed natural
Working Group Goals
– Develop a common catalog of criteria for basic repository assessment and certification
– Develop common procedures for assessment
– Implement a shared testbed for assessment
– Ultimately, create a shared framework for certification that includes other standards as well
Our Work So Far
– Began virtual meetings early in 2014 to map DSA and WDS criteria to each other
– Officially recognized as an RDA working group in May 2014
– Considered an “example for a non-technical group” – RDA is about building bridges
– In August, created a summary mapping with draft common requirements
Procedures for Mapping
– Created a comprehensive Google spreadsheet to have all information in one place
– Mapped the DSA criteria to the WDS criteria, and the WDS to the DSA
– Held lengthy discussions on each guideline
– Group members noted areas of agreement and gaps and documented them
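The bidirectional mapping described above can be sketched as a pair of lookup tables: mapping each catalog's criteria to the other and then checking both directions surfaces the agreements and gaps the group documented. The criterion IDs below are hypothetical; the real DSA and WDS catalogs use their own numbering.

```python
# Hypothetical criterion IDs; None marks a criterion with no counterpart (a gap).
dsa_to_wds = {"DSA-3": "WDS-2", "DSA-7": "WDS-5", "DSA-9": None}
wds_to_dsa = {"WDS-2": "DSA-3", "WDS-5": "DSA-7", "WDS-8": None}

def gaps(mapping):
    """Criteria in one catalog with no counterpart in the other."""
    return sorted(k for k, v in mapping.items() if v is None)

def agreements(a_to_b, b_to_a):
    """Pairs of criteria mapped consistently in both directions."""
    return sorted((a, b) for a, b in a_to_b.items()
                  if b is not None and b_to_a.get(b) == a)
```

With the sample tables above, `gaps(dsa_to_wds)` flags `DSA-9` and `gaps(wds_to_dsa)` flags `WDS-8`, while `agreements` returns the pairs that match in both directions.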
General Findings
– The two catalogs have similarities and differences
– DSA guidelines are more concise; WDS has multi-part criteria
– DSA focuses on data management rather than organizational stability
– WDS certification includes membership in the WDS and certification of services, which are not in scope for the DSA
– Overall, working together has been great
Mapping Summary
– Shows mappings along with notes on the level of the match (good match, partial, gap, etc.)
– Reconciles the two standards with suggested common language for requirements
– Assigns a concept to each common requirement, e.g., Discovery, Appraisal, Continuity of Access
– Assigns ISO/TRAC label(s): Organizational Infrastructure, Digital Object Management, Technology
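The structure of one mapping-summary row, as described above, can be sketched as a small record type. The field names and the example values are illustrative, not taken from the actual spreadsheet.

```python
from dataclasses import dataclass, field
from typing import List

# Vocabularies named on the slide (match levels and ISO/TRAC labels).
MATCH_LEVELS = {"good match", "partial", "gap", "poor match"}
ISO_TRAC_LABELS = {"Organizational Infrastructure",
                   "Digital Object Management", "Technology"}

@dataclass
class CommonRequirement:
    """One row of the harmonized mapping summary (field names are illustrative)."""
    concept: str          # e.g. "Discovery", "Appraisal", "Continuity of Access"
    common_language: str  # suggested common wording for the requirement
    match_level: str      # e.g. "good match", "partial", "gap"
    iso_trac_labels: List[str] = field(default_factory=list)

    def __post_init__(self):
        # Keep each row within the slide's controlled vocabularies.
        assert self.match_level in MATCH_LEVELS
        assert set(self.iso_trac_labels) <= ISO_TRAC_LABELS

# A hypothetical row, loosely modeled on the data-discovery requirement.
row = CommonRequirement(
    concept="Discovery",
    common_language="The repository enables users to discover the data "
                    "and to refer to them in a persistent way.",
    match_level="partial",
    iso_trac_labels=["Digital Object Management"],
)
```

Encoding the vocabularies once and validating each row against them mirrors how a shared spreadsheet keeps reviewers' annotations consistent.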
Mapping Summary Walk-through – Summary Document
Context
Please provide context for your repository:
(1) Repository type (select from a typology, e.g., domain repository).
(2) Brief description of the repository’s Designated Community, “an identified group of potential Consumers who should be able to understand a particular set of information” (from OAIS).
(3) Level of curation performed (select from a list).
Partial Match
Appraisal
The repository accepts data based on defined criteria to ensure relevance and understandability for data users.
Partial Match
Mission/Scope
The repository has an explicit mission to provide access to and preserve data in its domain.
Partial Match
Documented storage procedures
The repository applies documented processes and procedures in managing archival storage of the data.
Good Match
Preservation plan
The repository assumes responsibility for long-term preservation and manages this function in a planned and documented way.
Partial Match
Workflows
Archiving takes place according to defined workflows from ingest to dissemination.
Partial Match
Data discovery and identification
The repository enables users to discover the data and to refer to them in a persistent way through proper citation.
Partial Match
Data integrity and authenticity
The repository guarantees the integrity and authenticity of the data.
Good Match
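One common way repositories operationalize the integrity half of this requirement is fixity checking: recording a cryptographic checksum at ingest and recomputing it later to detect corruption. The sketch below is a minimal illustration of that practice, not a procedure mandated by DSA or WDS.

```python
import hashlib

def fixity(path, algorithm="sha256", chunk_size=1 << 20):
    """Compute a checksum for a stored file, reading in chunks."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, recorded_digest, algorithm="sha256"):
    """True if the file still matches the digest recorded at ingest."""
    return fixity(path, algorithm) == recorded_digest
```

In practice the recorded digest is stored in preservation metadata and rechecked on a schedule; a mismatch triggers recovery from a replica.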
Technical infrastructure
The technical infrastructure of the repository supports the tasks and functions necessary to effectively perform the mission.
Partial Match
Security
The repository maintains a careful plan to protect the safety of its holdings, the security of its facility, and the privacy of its users.
OR
The repository addresses security needs across its data, systems, personnel, and physical plant.
Licenses
The repository maintains all applicable licenses covering data access and use and monitors compliance.
Partial Match
Continuity of access
The repository has a continuity plan to ensure ongoing access to and preservation of its holdings.
Poor Match
Data quality
Please provide a description of the mechanism used to ensure (to the largest extent possible) data quality, recognizing that there is a difference between scientific and technical quality.
Alternative “wordy” version: The repository has appropriate internal expertise to address data and metadata quality through assessment of acquisitions, setting quality-related deposit criteria, and enriching data and metadata quality when appropriate to the mission, and it ensures sufficient information is available for end users to make quality-related evaluations.
Poor Match
Confidentiality/Ethics
When appropriate, the repository protects the subjects of research to the extent possible, taking into account disciplinary norms.
Gap
Open access
See statements approved by the 2014 ICSU General Assembly.
Gap
Organizational infrastructure
The organization has adequate funding and sufficient numbers of qualified staff to effectively carry out the mission.
Gap
Scientific guidance
The repository adopts mechanism(s) to secure ongoing scientific guidance and feedback from recognized experts, and maintains publicly accessible documentation of such guidance.
Gap
Next Steps
– Map to nestor and ISO
– Finalize the harmonized requirements and put them out to the community as Version 1
– Begin to work on aligning procedures
– Determine the relationship of DSA and WDS to each other
– Create a testbed for certification
– Investigate a shared pool of reviewers
Links to Other RDA Groups
– Practical Policy – There may be a way to share policies across repositories and to integrate them into the assessment process (e.g., checks for integrity).
– Domain Repositories IG – This is a natural fit. We can work with the IG to get basic certification on the agenda of repositories and to test our new criteria.
Questions?
Mary Vardigan – vardigan@umich.edu
Lesley Rickards – ljr@bodc.ac.uk