Trusty Research Infrastructure?


Trusty Research Infrastructure? Digital Marketplaces Using Novel Infrastructure Models (I2GS2018 Panel Session)

Services for (a lot of) research data ... and that's just the European end! These services share a common infrastructure, mutual trust, and increasingly converging joint AAI services.

Built on 'balanced global systems': EGI biomedical, LIGO/Virgo gravitational waves, LHC ALICE, LHC ATLAS, LHCb, NL: WeNMR.
A full mesh with dynamic data placement, interconnected with sufficient bandwidth for 'opportunistic' movement of data to compute. A distributed object store (caching edges not shown) uses credential delegation through the caches.
Image credit (Data Lake): Andreas-Joachim Peters, CERN EOS team, http://indico4.twgrid.org/indico/event/4/session/15/contribution/50/material/slides/0.pdf
Image: LHCONE
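The "credential delegation through the caches" idea can be illustrated with a minimal sketch: an issuer mints a short-lived token on the user's behalf, and an edge cache verifies it before serving data. All names here (`issue_delegated_token`, `cache_accepts`, the shared key) are hypothetical; real deployments use X.509 proxies or OAuth-style tokens rather than a single shared HMAC key.

```python
import hashlib
import hmac
import json
import time

SHARED_KEY = b"issuer-secret"  # hypothetical key known to issuer and caches

def issue_delegated_token(user: str, dataset: str, ttl: int = 3600) -> dict:
    """Issue a short-lived credential a user can present to any edge cache."""
    claims = {"sub": user, "dataset": dataset, "exp": time.time() + ttl}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def cache_accepts(token: dict, dataset: str) -> bool:
    """An edge cache verifies the delegated credential before serving data."""
    payload = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["sig"])
            and token["claims"]["dataset"] == dataset
            and token["claims"]["exp"] > time.time())
```

The point of the sketch is that the cache needs no online call back to the issuer: the delegated right travels with the request, which is what makes opportunistic caching edges workable.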

But: use cases requiring more trust?
Life Sciences AAI: the need for a "Dataset Authorization Service" ... but with different requirements for, say, genome infrastructures and biobanks.
The same researcher, different data sets, and ethics does not allow them to be combined: do we need technical role separation for the same user? Can that be done in an ecosystem of shared services?
Can we retain the efficient balance that today allows us to use any resource anywhere?
Is certification of data centres the solution that will last? Or neutral places? Will it result in a contraction to only ISO 27k and ISM-audited data centres?
Are we about to lose the opportunistic resource usage that helped exciting science, from the Higgs to gravitational waves?
Or can secure overlays or other novel infrastructures help us remain efficient AND trustworthy?
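The "technical role separation for the same user" question can be made concrete with a sketch of a hypothetical Dataset Authorization Service policy check: the same researcher is entitled to two datasets, but the service refuses to grant both in a single session because ethics approval forbids combining them. The dataset names and the `CONFLICTS` policy are illustrative assumptions, not any real service's API.

```python
# Pairs of datasets that ethics approval forbids combining in one session
# (assumed policy; a real service would load this from approval records).
CONFLICTS = {frozenset({"genome-study-A", "biobank-B"})}

def grant_session(user: str, requested: set, entitlements: dict) -> set:
    """Return the datasets granted for one session, enforcing separation."""
    allowed = requested & entitlements.get(user, set())
    for pair in CONFLICTS:
        if pair <= allowed:
            raise PermissionError(
                f"{user}: datasets {sorted(pair)} may not be combined")
    return allowed
```

Note that the separation is per *session*, not per *user*: the researcher keeps both entitlements but must take on one role at a time, which is exactly what makes this awkward in an ecosystem of shared, stateless services.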

Let’s explore the possibilities!

Abstract Research and e-Infrastructures, and science in general, have long been generating large amounts of data, and have built a distributed infrastructure to cope with it. The premise has been that all components, compute, storage, and the network in between, are sufficiently balanced that technology limitations in general no longer play a role in data distribution. The prolific use of 'opportunistic compute' resources by some of the LHC and other experiments bears that out: moving the data is no longer a 'real' issue. But now, for the first time, policy considerations and the need for higher trust start to change that premise: there are real access control needs on research data relating to people; really large data sets are appearing that have embargo controls on publication; and even a single researcher may hold data sets that are illegal or unethical to combine, so role separation is needed even there. And the rise of 'distributed caching' models for data requires access rights to travel with the data, either physically or by cryptographic means. Dataset authorization systems are fast becoming necessary parts of a research AAI solution, and linking compute resources to such research data is very much an open question. Yet only through collaborative analysis does the data actually result in real research outcomes. Will pushing compute to neutral and extra-territorial facilities alleviate the trust issue? Is the ever-increasing push for ISO audits and ISM trust marks for service providers actually reducing risk? Should we abandon opportunistic resource use and have our analysis models driven by data placement? Do we have to abandon opportunistic compute and move everything to where the data is? Or can we find a data-compute model that allows data owners to manage access, while federated service providers can trust that the code or container they are about to execute is itself trustworthy?
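The closing question, whether a provider can trust "the code or container they are about to execute", is often approached through content digests: a site executes only payloads whose digest appears on an allow-list published (and, in practice, cryptographically signed) by the data owner. The sketch below is a deliberately minimal illustration of that idea; the function names and the bare allow-list are assumptions, standing in for real mechanisms such as signed container image digests.

```python
import hashlib

def digest(payload: bytes) -> str:
    """Content-address a payload (e.g. a container image) by its SHA-256."""
    return hashlib.sha256(payload).hexdigest()

def may_execute(payload: bytes, allow_list: set) -> bool:
    """A provider runs a payload only if its digest is on the owner's list."""
    return digest(payload) in allow_list
```

Because the decision depends only on the payload's content, not on who submitted it, the same check works at any opportunistic resource, which is what would let such a model preserve the "any resource anywhere" balance the abstract argues for.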