Ian Bird, LHCC Referees' meeting; CERN, 11th June 2013
March 6, 2013 Ian.Bird@cern.ch
Main points
- Several countries stated that the goal of aiming to fit within constant budgets would be appreciated
  - But there is concern that equipment renewal must fit within this as well as capacity expansion
  - Misunderstanding of the current situation?
- Some discussion of the desirability of seeking funding as soon as possible to retain key personnel in various countries as EC funds come to an end
  - The problem is that H2020 is unlikely to provide funds for at least a year (likely 18 months) from now
- The chair of the RSG will work with the MB to improve the requirement/scrutiny/pledge process before the October RRB, to reflect experience and adapt better to budget and procurement timescales
Situation following the April 2013 RRB – RSG report
[charts: CPU, Disk, Tape]
[charts: Tier 0 CPU, Disk, Tape; Tier 1 CPU, Disk, Tape; Tier 2 CPU, Disk]
[chart annotations: +34 PB/yr; +363 kHS06/yr]
MSS access
[charts: data written (TB/month); data read (TB/month)]
Data transfers
Resource use
CERN@Wigner
- CPU & disk servers installed: currently ~5000 cores and a few PB of disk, all fully tested
- Will be put into production as part of the Agile Infrastructure
  - Managed by OpenStack as part of lxbatch (the SLC6 deployment is being done this way)
  - With the new release of OpenStack
- WAN connections (2×100 Gb/s) in production since February
EMI and EGI-InSPIRE ended
- The CERN groups have been integrated into a new team
  - Brings together ES and GT
  - Involved in EUDAT, iMarine, CRISP and HelixNebula
  - Group size will decrease by 50% over one year
- Sections: Operations and Liaison; Monitoring Infrastructure; Information and Data
- Programme of work set (partly) in conjunction with the experiments
  - Try to keep some flexibility to address priorities
Long-term thinking
- Vision paper for discussion with the EC and EIROforum
- User Forum paper: let the science communities guide the direction of future infrastructures
- Data preservation (see Jamie's talk)
- Research Data Alliance
- Software investment: optimise how we use CPUs, storage and networks
- None of this is HEP-specific!
19th May 2013, ACAT 2013; Ian.Bird@cern.ch
An e-Infrastructure system
[diagram: networks and federated ID management as the foundation; built on top of these, community grids, citizen cyberscience (CCS) for the community, application software tools and services, cloud resources, data archives, HPC facilities, collaborative tools and services, and software investment]
- Managed services, operated for the research communities
- Individual services operated by each science community
Key principles:
- Governed and driven by the science/research communities
- Business model: operations should be self-sustaining
  - Managed services are paid by use (e.g. cloud services, data archive services, …)
  - Community services are operated by the community at their own cost using their own resources (e.g. grids, citizen cyberscience)
  - Software support: open source, funded by collaborating developer institutions
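The two cost models in the business model can be sketched side by side. This is a minimal illustration only; the prices and units are hypothetical placeholders, not figures from the talk:

```python
# Sketch of the two cost models on the slide: managed services are
# billed by use, community-operated services are free at the point of
# use because the community bears its own operating cost.
# All prices and units below are hypothetical placeholders.

def managed_service_charge(units_used: float, price_per_unit: float) -> float:
    """Pay-per-use: the community pays only for what it consumes."""
    return units_used * price_per_unit

def community_service_charge(units_used: float) -> float:
    """Community-operated: no charge to users; the community runs the
    service at its own cost using its own resources."""
    return 0.0

# e.g. 120 TB-months of archive storage at a placeholder 2.0 EUR/TB-month
archive_bill = managed_service_charge(120, 2.0)   # 240.0
grid_bill = community_service_charge(120)         # 0.0
```

The point of the split is sustainability: the managed layer recovers its costs from usage, while the community layer has no central operations bill at all.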
An e-Infrastructure system
Foundation infrastructure
- Networks: in Europe, GÉANT and the NRENs, including networking to commercial resources
- Federated ID management services
  - A unique identity for all researchers, with evolving capabilities
  - eduroam / eduGAIN
  - Trust federations (EUGridPMA, etc.)
  - Single sign-on to all resources and services
  - Credential translation service
  - …
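The single sign-on plus credential translation idea can be sketched as one federated identity being translated into distinct per-service credentials. This is a toy stand-in, not a real federation protocol; the identity, service names, and token format are all hypothetical:

```python
# Sketch of single sign-on with credential translation: one federated
# identity (e.g. asserted via eduGAIN) is translated into a distinct
# credential per service, so the researcher authenticates only once.
# A real translation service would issue e.g. an X.509 proxy or an
# OAuth token; here we just derive deterministic placeholder tokens.
import hashlib

class CredentialTranslator:
    """Maps one federated identity to service-specific credentials."""

    def __init__(self, federated_id: str):
        self.federated_id = federated_id  # hypothetical federated identity

    def credential_for(self, service: str) -> str:
        # Deterministic per-service token derived from the single identity.
        digest = hashlib.sha256(f"{self.federated_id}:{service}".encode())
        return digest.hexdigest()[:16]

sso = CredentialTranslator("alice@university.example")
grid_token = sso.credential_for("grid")
cloud_token = sso.credential_for("cloud")
assert grid_token != cloud_token  # distinct credentials, one sign-on
```

The design point is that services never see the user's primary credential; they only receive a translated, service-scoped one.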
An e-Infrastructure system
Cloud facilities
- Cloud (IaaS, PaaS, SaaS) for the research community
  - Could be operated by large academic data centre(s), or outsourced
- Only 1 or 2 such centres are needed
- If successful, once the requirements are understood, they could be operated by industry
An e-Infrastructure system
Data facilities
- A few large data centres providing long-term archival facilities
- Public/open access
- Metadata facilities, persistent identifiers, …
- EUDAT as a prototype of part of this service
  - Connection with PRACE large-scale facilities, or other large science data centres
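A minimal sketch of what a persistent-identifier service for such archives does: mint an identifier for a dataset, attach metadata, and keep the identifier resolvable even when the data moves. The prefix format and record fields are hypothetical, loosely modelled on Handle-style PIDs:

```python
# Sketch of a persistent-identifier (PID) registry for a data archive.
# The prefix, locations and metadata below are hypothetical examples.
import itertools

class PIDRegistry:
    def __init__(self, prefix: str = "21.T00000"):
        self.prefix = prefix
        self._counter = itertools.count(1)
        self._records = {}

    def mint(self, location: str, metadata: dict) -> str:
        """Assign a new stable identifier to a dataset."""
        pid = f"{self.prefix}/{next(self._counter)}"
        self._records[pid] = {"location": location, "metadata": metadata}
        return pid

    def resolve(self, pid: str) -> dict:
        """Look up the current location and metadata for a PID."""
        return self._records[pid]

    def relocate(self, pid: str, new_location: str) -> None:
        # The PID stays stable while the data moves between archives.
        self._records[pid]["location"] = new_location

registry = PIDRegistry()
pid = registry.mint("archive-a:/lhc/run1/dataset42",
                    {"community": "HEP", "access": "open"})
registry.relocate(pid, "archive-b:/lhc/run1/dataset42")
assert registry.resolve(pid)["location"].startswith("archive-b")
```

Stable identifiers plus searchable metadata are what make long-term public/open access practical across independently operated archive centres.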
An e-Infrastructure system
HPC facilities
- Make use of the foundation services and, in principle, many of the application software tools
- HPC science communities may need to build analysis services on clouds, grids or citizen cyberscience
- Archive centres could leverage existing HPC facility expertise
An e-Infrastructure system
Distributed, federated resources
- High-level software tools and services to allow research communities to build a distributed infrastructure (grid) integrating their own resources
- The community operates the infrastructure and provides the resources
  - Could be a collaboration of several sciences pooling resources, i.e. the grid concept
An e-Infrastructure system
Distributed, federated resources
- High-level software tools and services to allow research communities to build a citizen cyberscience application, portal, etc.
  - The service could be hosted on a cloud facility
An e-Infrastructure system
Software tools and services
- Application-level tools of general and broad use
  - Data management, data transfer, storage tools
  - …
- Provide a repository and brokering for finding such tools
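The repository-plus-brokering idea can be sketched as registering tools with capability tags and querying for tools that cover a need. The tool names and tags below are made-up examples, not real catalogue entries:

```python
# Sketch of a tool repository with brokering: tools are registered with
# capability tags, and a broker finds the tools that cover a request.
# Tool names and tags are hypothetical examples.

REPOSITORY = [
    {"name": "ftool", "tags": {"data-transfer", "wan"}},
    {"name": "mcat",  "tags": {"data-management", "metadata"}},
    {"name": "stool", "tags": {"storage", "archive"}},
]

def find_tools(wanted: set) -> list:
    """Return names of tools whose tags cover the wanted capabilities."""
    return [t["name"] for t in REPOSITORY if wanted <= t["tags"]]

assert find_tools({"data-transfer"}) == ["ftool"]
assert find_tools({"storage", "archive"}) == ["stool"]
assert find_tools({"hpc"}) == []
```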
An e-Infrastructure system
Collaborative tools and services
- Services that allow researchers to integrate the e-infrastructure with everyday activities, personal devices, etc.:
  - "Dropbox"-style file synchronisation, collaborative tools (e.g. Indico), office automation, etc.
  - Such services would be hosted on the cloud IaaS noted above