Internet and Science: LHC view. V.A. Ilyin, SINP MSU / e-ARENA. RIGF, 14 May 2010.

Presentation transcript:


Distributed analysis of the LHC data over the Internet: jobs running at any given moment, Pbytes of data, thousands of physicists.

The LHC project started in the 1990s. First production session: April 2010 to the end of 2011.

Online system, multi-level trigger: filter out background and reduce the data volume (online reduction ~10^7); trigger menus select the interesting events and filter out the less interesting ones.
- Level 1 (special hardware): 40 MHz (40 TB/sec) in, 75 kHz (75 GB/sec) out
- Level 2 (embedded processors): 5 kHz (5 GB/sec) out
- Level 3 (PCs): 100 Hz (100 MB/sec) out, to data recording & offline analysis
Data recording & offline analysis: 100 events per second, 1 Pbyte per year for offline analysis (CMS computing model).
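These numbers can be sanity-checked with simple arithmetic. The sketch below (Python, purely illustrative) reproduces the ~10^7 online reduction and the ~1 Pbyte/year figure; the ~25 pp interactions per bunch crossing and the ~10^7 seconds of effective running per year are common rules of thumb assumed here, not numbers from the slide.

```python
# Back-of-the-envelope check of the trigger-chain numbers on this slide.
# Rates and event size follow the slide; interactions per crossing and
# effective seconds of running per year are assumed rules of thumb.

rates_hz = {
    "bunch crossings": 40e6,   # 40 MHz into Level 1 (special hardware)
    "level 1 out":     75e3,   # 75 kHz into Level 2 (embedded processors)
    "level 2 out":     5e3,    # 5 kHz into Level 3 (PC farm)
    "level 3 out":     100.0,  # 100 Hz to data recording
}
event_size_bytes = 1e6          # 40 MHz ~ 40 TB/sec  =>  ~1 MB per event
interactions_per_crossing = 25  # assumption, not on the slide
seconds_per_year = 1e7          # assumption: effective data-taking time

# Per bunch crossing the chain reduces 40 MHz -> 100 Hz, a factor 4e5;
# counting individual pp interactions (~1e9/s) gives the slide's ~1e7.
crossing_reduction = rates_hz["bunch crossings"] / rates_hz["level 3 out"]
interaction_reduction = crossing_reduction * interactions_per_crossing

recorded_bytes_per_s = rates_hz["level 3 out"] * event_size_bytes  # 100 MB/s
yearly_pbytes = recorded_bytes_per_s * seconds_per_year / 1e15     # ~1 PB

print(f"reduction per bunch crossing: {crossing_reduction:.0e}")     # 4e+05
print(f"reduction per interaction:    {interaction_reduction:.0e}")  # 1e+07
print(f"recorded per year:            {yearly_pbytes:.1f} PB")       # 1.0 PB
```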

WLCG, the Worldwide LHC Computing Grid, has been under development since the early 2000s; now it is a production system. Distributed analysis of the LHC data over the Internet: jobs running at any given moment, Pbytes of data, thousands of physicists.
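To make "distributed analysis" concrete: in the WLCG model, jobs are sent to sites that already hold replicas of the data, rather than moving petabytes to each physicist. The toy sketch below is not WLCG software; the file names, site names, and replica catalogue are invented for illustration.

```python
# Toy sketch of the grid analysis model: send the job to the data.
# The catalogue below is hypothetical, for illustration only.

replicas = {  # file -> sites holding a replica
    "run1.root": ["CERN", "FZK", "SINP"],
    "run2.root": ["FNAL", "SINP"],
    "run3.root": ["CERN", "IN2P3"],
}

def schedule(files):
    """Assign each file's analysis job to the least-loaded site holding a replica."""
    load, plan = {}, {}
    for f in files:
        site = min(replicas[f], key=lambda s: load.get(s, 0))
        plan[f] = site
        load[site] = load.get(site, 0) + 1
    return plan

for f, site in schedule(sorted(replicas)).items():
    print(f"job for {f} -> runs at {site}")
```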

Challenge: production over IP...

WLCG networking:
- Tier0 at CERN (data preprocessing)
- 12 Tier1 centers (data processing); Tier0-Tier1 and Tier1-Tier1 links: OPN (Optical Private Network), Gbps-scale
- 150 Tier2 centers (data analysis, Monte Carlo); Tier1-Tier2 and Tier2-Tier2 links: GEANT, NRENs, ... commodity Internet
DIGITAL DIVIDE!
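The tier structure matters because raw bandwidth decides what is feasible. The hedged illustration below shows how the same 100 TB dataset moves in about a day over an OPN-class link but takes nearly two weeks over a commodity path, which is the digital divide in practice. The 10 Gbps and 1 Gbps figures, and the 70% efficiency, are assumptions for illustration (the slide's exact OPN speed did not survive extraction).

```python
# Illustrative only: transfer times over the network tiers on this slide.
# Link speeds and efficiency are assumptions, not figures from the slide.

def transfer_days(dataset_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Days to move `dataset_tb` terabytes over a `link_gbps` link.

    `efficiency` is the assumed fraction of nominal bandwidth actually
    achieved (protocol overhead, sharing, retransmits).
    """
    bits = dataset_tb * 1e12 * 8
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 86400

# Assumed 10 Gbps OPN-class Tier0->Tier1 link vs. an assumed 1 Gbps
# commodity path typical of a Tier2 on the wrong side of the divide.
for label, gbps in [("OPN-class link, 10 Gbps", 10), ("commodity path, 1 Gbps", 1)]:
    print(f"{label}: 100 TB in {transfer_days(100, gbps):.1f} days")
```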

Internet and Science. Two revolutionary events:
- NSF «presented» DNS to «commodity» governance
- the WWW left the science kindergarten for adult life
During the 1990s and 2000s the Internet acquired many faces: commercial, social, national, governmental, private, ...
Now science encounters, first of all, the COMMERCIAL face of the Internet, with administrative actions at governmental levels coming only afterwards.

V.A. Ilyin,, RIGF, 14 May 2010 This consecusion: commercial face at first, and then governmental interventions in many cases (especially in developed countries)‏ means lack of corresponding Internet Governance solutions for Science life in Internet... science should not be as VIP subcommunity in Internet science is (and will be, for sure) at the Internet technological top, showing realities for future innovations in commodity life science shows no direct commercial interests, but can play an accelerator role for trading Internet services the problem looks as local/specific etc, but... No solutions on the surface...