ESFRI & e-Infrastructure Collaborations, EGEE'09. Krzysztof Wrona, September 21st, 2009, European XFEL

Slide 2: The European XFEL

The European XFEL is an X-ray free-electron laser currently under construction in Germany. From 2014 on, the XFEL will generate ultrashort X-ray pulses to be used by various user communities across a broad range of scientific disciplines.

New research opportunities arise from the unique photon beam characteristics:
- High repetition rate (30,000 pulses/s)
- Ultrashort pulse duration (a few to 100 fs)
- Extreme intensities
- Coherent radiation

Slide 3: Science overview

The X-ray flashes will allow scientists:
- to decipher the structure of many more biomolecules and biological entities, such as cells or membranes, than is possible today;
- to study biochemical reactions in action; among other things, this will lead to a better understanding of viruses and provide an important basis for future medicines;
- to learn more about chemical processes such as catalysis, which plays an important role in nature and in the industrial manufacture of most chemical substances;
- to study new processes and materials that are required for harnessing solar energy;
- to analyse the properties of various materials in order to develop completely new materials with revolutionary characteristics;
- to gain new insights into the nanoworld, for instance about components with specific electronic, magnetic and optical properties, laying the foundation for tomorrow's technologies.

Slide 4: ICT vision

ICT technologies are deeply involved in many aspects of the facility's design, construction and operation. The most challenging areas are:
- Control of the facility devices: linear accelerator, photon beam systems, experimental stations, detectors, diagnostic devices
- Data acquisition: high instantaneous data rates, multiple data sources, online data reduction (see the sketch after this slide)
- Data management: data curation, archiving and distribution
- Data analysis: analysis techniques are very specific to each experiment
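The slides do not specify a reduction algorithm. As a hedged illustration of what online data reduction can mean in practice, here is a minimal Python sketch that vetoes near-empty detector frames before they reach storage; the frame shape, threshold and hit count are hypothetical stand-ins, not XFEL parameters.

```python
import numpy as np

# Hypothetical parameters for illustration only (not XFEL design values).
FRAME_SHAPE = (1024, 1024)   # detector pixels per frame
ADC_THRESHOLD = 50           # counts above which a pixel is considered hit
MIN_HIT_PIXELS = 100         # frames with fewer hit pixels are discarded

def reduce_online(frames):
    """Yield only frames that contain enough signal to be worth storing.

    Rejecting near-empty frames at the source lowers the sustained rate
    that the downstream storage system must absorb.
    """
    for frame in frames:
        if np.count_nonzero(frame > ADC_THRESHOLD) >= MIN_HIT_PIXELS:
            yield frame

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated stream: nine dark frames for every bright one.
    stream = (rng.poisson(lam=200 if i % 10 == 0 else 1, size=FRAME_SHAPE)
              for i in range(100))
    kept = sum(1 for _ in reduce_online(stream))
    print(f"kept {kept} of 100 simulated frames")
```

In a real DAQ chain a filter like this would run close to the detector, so that only frames carrying usable signal compete for downstream bandwidth and archive space.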

Slide 5: ICT requirements

Different experiment scenarios:
- User-driven data production
- Continuous data-taking mode vs. experiments requiring complex setup and slow feedback loops

High instantaneous data rates:
- Up to 10 GB/s from a single detector

Large accumulated data volume (a back-of-envelope check follows this slide):
- Estimated at 10 PB/year (2015)
- Data archive scalable to accommodate ~100 PB/year
- Raw data must be stored at the facility for at least one year

Remote access to data and computing services:
- Single sign-on service
- Data protection: a fine-grained authorization scheme is required
- Services for querying data catalogs and retrieving data, web portals, lightweight client tools
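The 10 GB/s peak and the 10 PB/year estimate are consistent only if data are written far below peak rate on average. A hedged back-of-envelope check, in which the duty cycle and reduction factor are assumptions rather than figures from the talk:

```python
# Back-of-envelope check of the slide's volume figures.
# Only the 10 GB/s peak and the 10 PB/year estimate come from the slide;
# the duty cycle and reduction factor below are illustrative assumptions.

PEAK_RATE_GB_S = 10                 # single-detector peak, from the slide
SECONDS_PER_YEAR = 365 * 24 * 3600

# Peak rate sustained all year would be enormous:
raw_pb = PEAK_RATE_GB_S * SECONDS_PER_YEAR / 1e6   # GB -> PB
print(f"peak rate sustained all year: {raw_pb:.0f} PB")   # ~315 PB

# With an assumed duty cycle and average online reduction, the slide's
# 10 PB/year becomes plausible:
DUTY_CYCLE = 0.2        # assumption: fraction of the year spent taking data
REDUCTION = 6.5         # assumption: average online reduction factor
stored_pb = raw_pb * DUTY_CYCLE / REDUCTION
print(f"stored per year under assumptions: {stored_pb:.1f} PB")  # ~9.7 PB
```

Under these assumptions the stored volume lands near the quoted 10 PB/year, which illustrates how strongly the annual figure depends on duty cycle and online reduction.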

Slide 6: ICT requirements (continued)

Support for diverse user communities from physics, material science, chemistry, biology, medicine, ...
- Strong groups using advanced data analysis techniques
- Small groups of scientists with minimal computing experience

Data analysis:
- Support for scientific software repositories
- Trivial parallelism (i.e. independent batch jobs) vs. complex MPI-type applications; a sketch of the trivially parallel case follows this slide
- Analysis techniques and algorithms are still in development, so there are no reliable CPU requirements yet

Network:
- Reduced (or full) data samples need to be transferred to users
- Many institutes have no connection to a high-bandwidth network, which makes access to national-level data centers necessary
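A minimal sketch of the trivially parallel case using only the Python standard library; the per-run analyse function and the run list are hypothetical placeholders for real analysis code and real data-taking runs:

```python
from concurrent.futures import ProcessPoolExecutor

def analyse(run_id: int) -> int:
    """Hypothetical per-run analysis. Each run is processed independently,
    so the workload maps onto independent batch jobs with no inter-process
    communication, in contrast to tightly coupled MPI applications."""
    return sum(i * i for i in range(run_id * 1000)) % 997

if __name__ == "__main__":
    runs = list(range(1, 9))   # stand-in for a list of data-taking runs
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(analyse, runs))
    print(dict(zip(runs, results)))
```

Because the runs share no state, the same pattern scales from a laptop to independent Grid batch jobs without any MPI-style communication.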

Slide 7: Proposed architecture

- Common infrastructure for all experiments
- Remote access to data and computing resources
- Build the data management services on top of the Grid infrastructure and VO services (an xfel.eu VO already exists)
- Use a high-bandwidth network for data distribution (can we apply a tiered model? See the estimate after this slide.)
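Whether a tiered, LHC-style distribution model fits depends largely on the export bandwidth it implies. A hedged estimate, assuming (purely for illustration) that half of the 10 PB/year quoted on slide 5 is replicated to external centers:

```python
# Rough bandwidth estimate for off-site replication in a tiered model.
# 10 PB/year is from slide 5; the replicated fraction and burst factor
# are assumptions made for this sketch.

ANNUAL_VOLUME_PB = 10
REPLICATED_FRACTION = 0.5          # assumption: share exported off-site
SECONDS_PER_YEAR = 365 * 24 * 3600

exported_bytes = ANNUAL_VOLUME_PB * 1e15 * REPLICATED_FRACTION
avg_gbit_s = exported_bytes * 8 / SECONDS_PER_YEAR / 1e9
print(f"average export rate: {avg_gbit_s:.2f} Gbit/s")   # ~1.3 Gbit/s

# Transfers are bursty, so provisioned links need headroom:
BURST_FACTOR = 5                   # assumption
print(f"provision roughly {avg_gbit_s * BURST_FACTOR:.0f} Gbit/s of WAN capacity")
```

An average of roughly 1.3 Gbit/s is within the reach of national research networks of that era, which suggests a tiered model is at least not ruled out by bandwidth alone.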

Slide 8: Current status & timeline

- 2009: public release of the European XFEL Computing Technical Design Report, followed by evaluation, development and testing
- 2013: deployment and integration of DAQ and data management services
- 2014: small-scale, production-quality system required for commissioning of the photon beam lines
- 2015: first full year of facility operation; full functionality at production level

Slide 9: Challenges & suggestions

- Perform bulk data processing on site, with remote access for users
- Build the data management and data distribution services on top of the Grid middleware and VO services
- Support many small user groups with different demands (data protection, various analysis techniques)
- Use e-Infrastructure to strengthen our user communities and to ensure access to computing and data storage resources at the national level
- Services must be reliable and easy for non-experts in computing to use, with a small initial overhead for new users