The Science of Simulation: falsification, algorithms, phenomenology, machines, better theories, computer architectures, non-perturbative QFT, experimental tests.

Presentation transcript:

the science of simulation
[title slide: diagram linking falsification, algorithms, phenomenology, machines, better theories, computer architectures, non-perturbative QFT and experimental tests]

UKQCD, the first UK Grand Challenge
[chart: sustained performance from 1 Mflops to 1 Pflops plotted against Moore's Law across the UKQCD machine generations DAP, i860, T3D, T3E, QCDOC and beyond (tba), with physics milestones from quenched to 2 degenerate flavours]
– commercial computers + optimised codes → competitive edge
– effort focused on exploitation
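For scale, the performance axis in that chart spans nine orders of magnitude. A minimal Python sketch (the 18-month doubling period is an assumption for illustration, not a figure from the slide) shows how many Moore's-law doublings, and roughly how many years, separate 1 Mflops from 1 Pflops:

```python
import math

# Assumed doubling period for sustained performance (Moore's-law style);
# this value is illustrative, not taken from the slide.
DOUBLING_YEARS = 1.5

def doublings(start_flops: float, end_flops: float) -> float:
    """Number of performance doublings between two sustained-flops figures."""
    return math.log2(end_flops / start_flops)

def years_required(start_flops: float, end_flops: float) -> float:
    """Years implied by the assumed doubling period."""
    return doublings(start_flops, end_flops) * DOUBLING_YEARS

if __name__ == "__main__":
    mflops, pflops = 1e6, 1e15
    print(f"doublings: {doublings(mflops, pflops):.1f}")        # ~29.9
    print(f"years at {DOUBLING_YEARS}-year doubling: "
          f"{years_required(mflops, pflops):.0f}")              # ~45
```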

has it been a success?
yes … excellent value for money
– machines, codes and data have been fully exploited
but we had to concentrate on one action and one parameter range
– zero temperature, O(a)-improved Wilson quarks
and we were dependent on purchasing machines off the shelf
– first the UK parallel computing industry disappeared
– now the HPC market is shrinking worldwide

UKQCD international partnerships
DOE + PPARC + RIKEN: QCDOC
– low-power ASIC using IBM embedded processor technology to scale to 10^4 nodes
DOE SciDAC
– standard APIs for portability made a prerequisite for large-machine funding
– linked to UKQCD via the QCDOC open-source policy
PPARC GridPP
– QCD data grid conforming to emerging EDG + OGSA standards
– linked to SciDAC via QCDOC
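To illustrate the portability idea behind the SciDAC API requirement, here is a hypothetical sketch (the class and method names are invented, not the real SciDAC QMP/QDP interfaces): the physics kernel is written once against a thin communications abstraction, so the same code can target a purpose-built machine such as QCDOC or a commodity cluster.

```python
# Hypothetical portability-layer sketch in the spirit of the SciDAC APIs;
# names are illustrative only, not the real QMP/QDP interfaces.
from abc import ABC, abstractmethod

class CommsLayer(ABC):
    """Machine-specific communications hidden behind one interface."""
    @abstractmethod
    def exchange_boundaries(self, local_field): ...

class QCDOCTorusComms(CommsLayer):
    def exchange_boundaries(self, local_field):
        # would drive the QCDOC torus links on the purpose-built machine
        return local_field

class ClusterMPIComms(CommsLayer):
    def exchange_boundaries(self, local_field):
        # would perform MPI halo exchanges on a commodity cluster
        return local_field

def dirac_apply(comms: CommsLayer, local_field):
    """The lattice kernel is written once, against the abstract layer."""
    halo_ready = comms.exchange_boundaries(local_field)
    # ... stencil arithmetic on halo_ready would go here ...
    return halo_ready

# Same kernel, different back end:
field = dirac_apply(ClusterMPIComms(), [0.0] * 16)
```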

the UK QCDgrid
[diagram: the "QCDOC" machine feeds a tiered grid (Tier 0, Tier 1, Tier 2; TestBed 1) spanning Edinburgh, Glasgow, Liverpool and Swansea, with a link to US DOE SciDAC; an OGSA metadata catalogue resolves physics searches to LFNs for data, software and physics history]
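The catalogue idea in that diagram is a two-step lookup: a physics query returns logical file names (LFNs), which are then resolved to physical replicas somewhere on the grid. A minimal sketch of that lookup follows; the entries and field names are invented for illustration, as the slide does not show the real QCDgrid catalogue schema.

```python
# Minimal sketch of a metadata-catalogue lookup: physics query -> LFN -> replicas.
# The dictionaries, field names and replica URLs below are invented placeholders.

METADATA = [
    {"lfn": "lfn:/ukqcd/ens_A/config_0001",
     "action": "wilson_Oa_improved", "beta": 5.2, "flavours": 2},
    {"lfn": "lfn:/ukqcd/ens_B/config_0001",
     "action": "wilson_Oa_improved", "beta": 5.29, "flavours": 2},
]

REPLICAS = {
    "lfn:/ukqcd/ens_A/config_0001": ["gsiftp://edinburgh/example",
                                     "gsiftp://liverpool/example"],
    "lfn:/ukqcd/ens_B/config_0001": ["gsiftp://glasgow/example"],
}

def physics_search(**criteria):
    """Return the LFNs whose metadata match every given criterion."""
    return [entry["lfn"] for entry in METADATA
            if all(entry.get(key) == value for key, value in criteria.items())]

def resolve(lfn):
    """Map a logical file name to its physical replicas."""
    return REPLICAS.get(lfn, [])

for lfn in physics_search(flavours=2, beta=5.2):
    print(lfn, "->", resolve(lfn))
```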

the next step
there is a robust software strategy
– open-source codes will provide choice
the grid will provide tools for data integration
– the OGSA standard will allow interoperability
but who will build and operate machines?
– we are dependent on a few individuals and face rapidly changing market forces
the challenge: to encourage machine builders and make codes + data available to all

International Particle Simulation Organisation
Purpose
– to generate and manage field theory configurations
Funding
– annual subscriptions based on community size and the value of total data held
– subscriptions paid with data and/or money
– data value decreases according to Moore's law
Council
– decides actions and parameters
– manages the grid
– buys data, providing funding for machines
Users
– from member states may access all data
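One way to read the "data value decreases according to Moore's law" rule is that a configuration set is valued at its replacement cost in compute, which halves every doubling period. The sketch below illustrates that reading; the 1.5-year halving period, the initial valuation and the subscription rates are assumptions for illustration, not figures from the slide.

```python
# Illustrative sketch of Moore's-law depreciation of stored configurations.
# The halving period, initial valuation and rates are assumed, not from the slide.

HALVING_YEARS = 1.5

def data_value(initial_value: float, years_since_generation: float) -> float:
    """Value of a data set whose replacement cost halves every HALVING_YEARS."""
    return initial_value * 0.5 ** (years_since_generation / HALVING_YEARS)

def subscription(community_size: int, holdings: list[tuple[float, float]],
                 rate_per_member: float = 1.0, rate_per_value: float = 0.1) -> float:
    """Annual subscription scaling with community size and current data value."""
    total_value = sum(data_value(value, age) for value, age in holdings)
    return community_size * rate_per_member + rate_per_value * total_value

# A data set generated 3 years ago at a notional cost of 100 units
# is now worth ~25 units under the assumed halving period.
print(round(data_value(100.0, 3.0), 1))
print(round(subscription(community_size=50, holdings=[(100.0, 3.0)]), 1))
```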