WP14 Common Testing Environments
Dr Ashley Hunter (Tessella plc) and David Giaretta (APA and STFC), APARSEN Year 1 Review, Luxembourg, February 2012
WP14: Overview
Title: Common Testing Environments
Duration: April 2011 – February 2012
WP Leader: Tessella
No. of Tasks: 2
Total PMs: 28
No. of Participants: 13
WP14: Objectives
Collect together a set of environments to test the efficacy of tools and techniques for digital preservation against changes in:
- Hardware
- Software
- Environment
- Knowledge base of the designated communities
Develop a common 'shared' approach to testing.
"It is probably reasonable to expect that each proposed preservation technique works well against certain types of digital objects or certain challenges; it is unlikely that there is a universal test."
WP14: Partners
Partner           Effort (PMs)
1 – STFC          1
2 – APA
6 – CSC           3
7 – DNB
12 – KNAW-DANS
13 – KB
17 – FORTH
25 – UNITN        2
26 – TESSELLA     8
28 – SBA
29 – IKI-RAS
33 – UESSEX
34 – CINES
WP14: Tasks
1410 – Identification of testbed techniques and tools: this task collects together the various testbeds which are available.
1420 – Testbed suite: this task produces a testbed suite with associated testbed procedures. To facilitate this, partners will make their testbeds, procedures, test data and software available to other partners.
1410: Test Environments
Review previous test environments, classified as:
- Significant Properties based Testbed (SPT)
- Transformational Information Properties based Testbed (TIPT)
- Designated Community based Testbed (DCT)
- Multi-Valent Testbed (MVT)
1420: Test Suite
In-house digital preservation test environments:
- Safety Deposit Box (SDB), Tessella
- KoLibRI, DNB
- CASPAR, STFC / APA
- SHAMAN*, DNB / InConTec
External test environments: seek a simple process so as to be able to 'rate' additional environments through their own statements of capability.
1420: Classifications
Use classification schemes to define the known Test Environment landscape:
- Helps us to move out of our comfort zones
- Helps us find the boundaries of effectiveness of various preservation tools and techniques
Classify test cases by:
- Object type (information or digital?)
- Threat to access and re-use
- Preservation technique (strategy)
Identify the gaps in our coverage (see the sketch after this list).
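As a rough illustration only (not part of the WP14 material), the three classification axes can be treated as a cross-product, with any combination not covered by an existing test case flagged as a gap. The axis values and covered test cases below are hypothetical examples, not the project's actual classifications.

```python
# Hypothetical sketch: test cases as points in an (object type, threat, technique)
# space; gaps are combinations with no coverage.
from dataclasses import dataclass
from itertools import product

OBJECT_TYPES = ["rendered", "non-rendered", "static", "dynamic"]        # illustrative subset
THREATS = ["obsolete environment", "loss of provenance", "loss of semantics"]
TECHNIQUES = ["transformation", "emulation", "repinfo network"]

@dataclass(frozen=True)
class TestCase:
    object_type: str
    threat: str
    technique: str

# Test cases extracted from user scenarios (example data only).
covered = {
    TestCase("rendered", "obsolete environment", "emulation"),
    TestCase("static", "loss of semantics", "repinfo network"),
}

# Every uncovered combination is a gap in the test landscape.
gaps = [
    TestCase(o, th, te)
    for o, th, te in product(OBJECT_TYPES, THREATS, TECHNIQUES)
    if TestCase(o, th, te) not in covered
]
print(f"{len(gaps)} uncovered combinations out of "
      f"{len(OBJECT_TYPES) * len(THREATS) * len(TECHNIQUES)}")
```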
1420: Object Types
Digitally encoded information objects.
Proposal: initially divide up objects as:
- Rendered vs non-rendered
- Static vs dynamic
- Simple vs composite
- Passive vs active
Note that these are not clear-cut divisions and there may be other cuts; at the least, we have to try to think about different types of digital objects (a rough sketch of these divisions follows).
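Purely as a hypothetical sketch, the four proposed binary divisions could be recorded as flags on an object profile; the class name and the example objects below are assumptions for illustration, not part of the WP14 proposal.

```python
# Hypothetical sketch: the four proposed binary divisions of a digitally encoded
# information object, combined into a single descriptor.
from dataclasses import dataclass

@dataclass(frozen=True)
class ObjectProfile:
    rendered: bool    # rendered vs non-rendered
    dynamic: bool     # dynamic vs static
    composite: bool   # composite vs simple
    active: bool      # active vs passive

# Example (illustrative): a scripted web page vs a plain CSV data file.
web_page = ObjectProfile(rendered=True, dynamic=True, composite=True, active=True)
csv_file = ObjectProfile(rendered=False, dynamic=False, composite=False, active=False)
```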
1420: Threats (PARSE.insight)
- Users may be unable to understand or use the data, e.g. the semantics, format, processes or algorithms involved.
- Non-maintainability of essential hardware, software or support environment may make the information inaccessible.
- The chain of evidence may be lost and there may be lack of certainty of provenance or authenticity.
- Access and use restrictions may make it difficult to reuse data, or alternatively may not be respected in future.
- Loss of ability to identify the location of data.
- The current custodian of the data, whether an organisation or project, may cease to exist at some point in the future.
- The ones we trust to look after the digital holdings may let us down.
1420: Techniques
- Transformation: changes the bit-sequences of the Content Data Object.
- Emulation and RepInfo Network: no changes to the bit-sequences of the Content Data Object (emulators are RepInfo).
- Extended to Preservation Networks.
- Other approaches?
1420: User Scenarios
These define the digital preservation test cases used to evaluate each of the test environments.
Initially based on contributions from APARSEN partners.
Seek contributions from other active projects, organisations or individuals to expand our horizons!
1420: User Scenario Template
1420: Example User Scenario
1420: Analysis of User Scenarios
Develop a Test Environment capability matrix:
- Extract unique combinations of information object, threat and strategy (test cases)
- Rate the ability of each Test Environment against each test case
- Plot the matrix (a rough sketch follows)
Analysis also reveals interesting behaviours:
- Images for machine vision (non-human) interpretation
- Audio files containing data other than sound (container)
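A minimal sketch of such a capability matrix, assuming pandas is available: the environment names come from the earlier slide, while the test cases and ratings are made-up illustrative values, not APARSEN results.

```python
# Hypothetical sketch of a Test Environment capability matrix: rows are test cases
# (unique object/threat/technique combinations), columns are test environments, and
# each cell rates how well the environment exercises that test case.
import pandas as pd  # assumed available; any tabular library would do

test_cases = [
    ("rendered", "obsolete environment", "emulation"),
    ("static", "loss of semantics", "repinfo network"),
    ("dynamic", "loss of provenance", "transformation"),
]
environments = ["SDB", "KoLibRI", "CASPAR", "SHAMAN"]

# Ratings on a 0-2 scale (0 = not supported, 1 = partial, 2 = full) -- example values only.
ratings = [
    [2, 1, 1, 0],
    [1, 0, 2, 1],
    [0, 0, 0, 0],
]

matrix = pd.DataFrame(
    ratings,
    index=pd.MultiIndex.from_tuples(test_cases, names=["object", "threat", "technique"]),
    columns=environments,
)

# Test cases with no environment rated above 0 are gaps in current coverage.
gaps = matrix[(matrix == 0).all(axis=1)]
print(matrix)
print("Gaps in coverage:", len(gaps))
```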
Preliminary view
1420: Test Environment Landscape
- Define the current capability space
- See how the landscape changes with time
- Identifies areas for further analysis and research by the DP community
WP14: Next Steps
- Conclude the gathering of user scenarios
- Analyse classifications of objects, threats and techniques into unique test cases
- Evaluate test environments against the test cases
- Create a current snapshot of the digital preservation landscape
- Identify gaps in the DP landscape
A 3–6 month extension in time (not cost) to allow partners to complete these activities.
Network of Excellence