Alliance Permanent Access to the Records of Science in Europe Network
Co-funded by the European Union under FP7-ICT-2009-6 | aparsen.eu | #APARSEN

WP14: Common Testing Environments
Dr Ashley Hunter, ashley.hunter@tessella.com, Tessella plc
David Giaretta (APA and STFC)
APARSEN Year 1 Review, Luxembourg, Feb 2012
WP14: Overview
Title: Common Testing Environments
Duration: April 2011 – February 2012
WP Leader: Tessella
No. of Tasks: 2
Total PMs: 28
No. of Participants: 13
WP14: Objectives
Collect together a set of environments to test the efficacy of tools and techniques for digital preservation against changes in:
– Hardware
– Software
– Environment
– Knowledge base of the designated communities
Develop a common ‘shared’ approach to testing
“It is probably reasonable to expect that each proposed preservation technique works well against certain types of digital objects or certain challenges; it is unlikely that there is a universal test.”
WP14: Partners

Partner          Effort (PMs)
1 – STFC         1
2 – APA          1
6 – CSC          3
7 – DNB          3
12 – KNAW-DANS   1
13 – KB          1
17 – FORTH       1
25 – UNITN       2
26 – TESSELLA    8
28 – SBA         2
29 – IKI-RAS     3
33 – UESSEX      1
34 – CINES       1
Total            28
WP14: Tasks
1410 – Identification of testbed techniques and tools
– This task collects together the various testbeds which are available
1420 – Testbed suite
– This task produces a testbed suite with associated testbed procedures. To facilitate this, partners will make their testbeds, procedures, test data and software available to other partners.
1410: Test Environments Review
Review previous Test Environments, classified as:
– Significant Properties based Testbed (SPT)
– Transformational Information Properties based Testbed (TIPT)
– Designated Community based Testbed (DCT)
– Multi-Valent Testbed (MVT)
1420: Test Suite
In-house Digital Preservation Test Environments:
– Safety Deposit Box (SDB), Tessella
– KoLibRI, DNB
– CASPAR, STFC / APA
– SHAMAN*, DNB / InConTec
External Test Environments:
– Seek a simple process so as to be able to ‘rate’ additional environments through their own statements of capability
1420: Classifications
Use classification schemes to define the known Test Environment landscape:
– Helps us to move out of our comfort zones
– Helps us find the boundaries of effectiveness of various preservation tools/techniques
Classify Test Cases by:
– Object Type (Information or Digital?)
– Threat to Access and Re-Use
– Preservation Technique (Strategy)
Identify the gaps in our coverage (one way to encode this three-axis classification is sketched below)
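As a minimal sketch only: the three classification axes could be encoded and exhaustively combined as below. The axis values are taken from the surrounding slides, but the structure and names are assumptions, not the actual WP14 schema.

```python
# Hypothetical encoding of the three-axis Test Case classification.
# Axis values come from the slides; names/structure are illustrative.
from dataclasses import dataclass
from itertools import product

OBJECT_TYPES = ["rendered", "non-rendered"]  # one of several proposed cuts
THREATS = [
    "understandability", "non-maintainability", "provenance/authenticity",
    "access restrictions", "identification", "custodian loss", "trust failure",
]
TECHNIQUES = ["transformation", "emulation", "repinfo network"]

@dataclass(frozen=True)
class TestCase:
    object_type: str
    threat: str
    technique: str

# Every unique (object, threat, technique) combination is a candidate
# Test Case for the capability matrix discussed later in the deck.
all_cases = [TestCase(o, t, s)
             for o, t, s in product(OBJECT_TYPES, THREATS, TECHNIQUES)]
print(len(all_cases))  # 2 * 7 * 3 = 42 candidate cases
```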
1420: Object Types
Digitally encoded information objects
Proposal: initially divide up objects as:
– Rendered vs non-rendered
– Static vs dynamic
– Simple vs composite
– Passive vs active
Note:
– these are not clear-cut divisions
– there may be other cuts
– at least we have to try to think about different types of digital objects
(one possible encoding of these cuts is sketched below)
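Purely as an illustration, the four proposed cuts could be modelled as independent flags; the class and example below are assumptions, and as the slide notes, real objects will not always fit cleanly.

```python
# Illustrative only: the four binary cuts modelled as independent flags.
# The slide itself warns these divisions are not clear-cut.
from enum import Flag, auto

class ObjectTraits(Flag):
    RENDERED = auto()   # vs non-rendered
    DYNAMIC = auto()    # vs static
    COMPOSITE = auto()  # vs simple
    ACTIVE = auto()     # vs passive

# E.g. a spreadsheet with embedded macros might plausibly be classified:
spreadsheet = ObjectTraits.RENDERED | ObjectTraits.DYNAMIC | ObjectTraits.ACTIVE
print(bool(spreadsheet & ObjectTraits.ACTIVE))  # True
```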
1420: Threats
From PARSE.insight:
– Users may be unable to understand or use the data, e.g. the semantics, format, processes or algorithms involved.
– Non-maintainability of essential hardware, software or support environment may make the information inaccessible.
– The chain of evidence may be lost and there may be a lack of certainty of provenance or authenticity.
– Access and use restrictions may make it difficult to reuse data, or alternatively may not be respected in future.
– Loss of ability to identify the location of data.
– The current custodian of the data, whether an organisation or project, may cease to exist at some point in the future.
– The ones we trust to look after the digital holdings may let us down.
1420: Techniques
Three strategies, grouped in the slide diagram by their effect on the bits:
– Transformation: changes the bit-sequences of the Content Data Object
– Emulation and RepInfo Network: NO changes to the bit-sequences of the Content Data Object (emulators are RepInfo; extended to Preservation Networks)
Other approaches?
(this grouping is sketched in code below)
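A trivial sketch of the slide's grouping, assuming nothing beyond what the diagram states; the function name is illustrative.

```python
# Grouping of the three techniques by whether they alter the
# bit-sequences of the Content Data Object, as per the slide diagram.
CHANGES_BITS = {
    "transformation": True,     # migrates content to new encodings
    "emulation": False,         # keeps bits, recreates the environment
    "repinfo network": False,   # keeps bits, adds Representation Information
}

def bit_preserving(technique: str) -> bool:
    """True if the technique leaves the original bit-sequences intact."""
    return not CHANGES_BITS[technique]

print([t for t in CHANGES_BITS if bit_preserving(t)])
# ['emulation', 'repinfo network']
```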
1420: User Scenarios
These define the Digital Preservation test cases used to evaluate each of the preservation tools/techniques:
– Initially based on contributions from APARSEN partners
– Seek contributions from other active projects, organisations or individuals to expand our horizons!
1420: User Scenario Template
[Template shown as a slide image; not reproduced in this transcript]
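The template image is lost here. Purely as a guess at its shape, grounded only in the classification axes elsewhere in the deck, a scenario record might carry fields like these; every field name is hypothetical, not the actual WP14 template.

```python
# Hypothetical reconstruction of the kind of fields a user scenario
# might record; the real template was an image and is not reproduced.
user_scenario = {
    "title": "Example scenario",                   # short description
    "contributor": "an APARSEN partner",           # who supplied it
    "information_object": "rendered, static",      # object-type cuts
    "threat": "non-maintainability of software",   # PARSE.insight threat
    "technique": "emulation",                      # preservation strategy
    "success_criteria": "object remains usable by the designated community",
}
```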
1420: Example User Scenario
[Example scenario shown as a slide image; not reproduced in this transcript]
1420: Analysis of User Scenarios
Develop a Test Environment capability matrix:
– Extract unique combinations of information object, threat and strategy (Test Cases)
– Rate the ability of each preservation tool/technique against each Test Case
– Plot the matrix (a minimal sketch of these steps follows)
Analysis also reveals interesting behaviours:
– Images for machine-vision (non-human) interpretation
– Audio files containing data other than sound (container)
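A minimal sketch of the matrix-building and gap-finding steps, in the spirit of the hypothetical TestCase enumeration above; the tools are the in-house environments named earlier, but the ratings are invented placeholders, not actual WP14 results.

```python
# Illustrative capability matrix: rate each tool against each Test
# Case, then scan for gaps. Ratings here are invented examples.
from itertools import product

cases = list(product(["rendered", "non-rendered"],
                     ["understandability", "non-maintainability"]))
tools = ["SDB", "KoLibRI", "CASPAR"]  # in-house environments named earlier

# Rating scale (assumed): 0 = no coverage, 1 = partial, 2 = good
ratings = {(tool, case): 0 for tool, case in product(tools, cases)}
ratings[("SDB", ("rendered", "non-maintainability"))] = 2
ratings[("CASPAR", ("non-rendered", "understandability"))] = 1

# A gap in the landscape is a Test Case that no tool covers at all.
gaps = [c for c in cases if all(ratings[(t, c)] == 0 for t in tools)]
print(f"{len(gaps)} uncovered Test Cases:", gaps)
```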
Preliminary view
[Chart shown as a slide image; not reproduced in this transcript]
1420: Test Environment Landscape
– Define the current capability space
– See the landscape change with time
– Identify areas for further analysis and research by the DP community
WP14: Next Steps
– Conclude the gathering of User Scenarios
– Analyse the classifications of Objects, Threats and Techniques into unique test cases
– Evaluate preservation tools/techniques against test cases with applicable test environments
– Create a current snapshot of the Digital Preservation Landscape
– Identify gaps in the DP Landscape
– 3–6 month extension in time (not cost) to allow partners to complete these activities
aparsen.eu #APARSEN Network of Excellence