The Australian National Grid Program: "providing advanced computing, information and grid infrastructure for eResearch". Glenn Moloney, EGEE'06, Geneva, 2006.

Presentation transcript:

The Australian National Grid Program
"providing advanced computing, information and grid infrastructure for eResearch"
Glenn Moloney, University of Melbourne
EGEE'06, Geneva, 2006

APAC National Grid connectivity
[Map: APAC National Grid partner sites and network links. Sites: Darwin; Brisbane (QPSF); Canberra (ANU, APAC National Facility); Sydney (ac3); Melbourne (VPAC, CSIRO); Hobart (TPAC, CSIRO); Adelaide (SAPAC); Perth (IVEC, CSIRO). Links: GrangeNet backbone, Centie/GrangeNet link, AARNet links to Internet2, CANARIE, GÉANT and APAN. 10 Gbps, IPv6, multicast.]

Australian Partnership for Advanced Computing
"providing advanced computing, information and grid infrastructure for eResearch"
The APAC partners:
- ac3: Australian Centre for Advanced Computing and Communications (NSW)
- CSIRO: Commonwealth Scientific and Industrial Research Organisation
- QPSF: Queensland Parallel Supercomputing Foundation
- IVEC: Interactive Virtual Environments Centre (WA)
- SAPAC: South Australian Partnership for Advanced Computing
- ANUSF: The Australian National University Supercomputer Facility
- TPAC: Tasmanian Partnership for Advanced Computing (University of Tasmania)
- VPAC: Victorian Partnership for Advanced Computing

National Role of APAC
Advanced computing infrastructure:
- Peak computing facilities
Information infrastructure:
- Support for community-based data collections
- Management of large-scale data collections (archiving)
Grid infrastructure:
- Access to national computing and information infrastructure
- Advanced collaborative services for research groups: collaborative visualisation, computational steering, tele-presence, virtual organisation support
- Support for Australian participation in international research programs, e.g. astronomy, high-energy physics, earth systems, geosciences

APAC 2: The APAC Grid Program
The Australian government provided AU$29m for stage 2 of APAC, providing the advanced computing and grid infrastructure for eResearch:
- AU$12.5m for an upgrade of the National Facility in Canberra, commissioned mid 2005
- National grid infrastructure projects: computing infrastructure, information infrastructure, user interface and visualisation
- Application support projects:
  - Astronomy (Virtual Observatory)
  - Computational chemistry
  - Theoretical and experimental high-energy physics (International Lattice Data Grid, ATLAS, Belle)
  - Geosciences
  - Bioinformatics

APAC National Facility
Usage: mainly biology, chemistry and physics; currently 247 projects and 722 users (27 universities)
Computing systems:
- SGI Altix 3700 Bx2 system: 1680 processors
- Dell Linux cluster: 150 processors
Mass Data Storage System (MDSS):
- StorageTek robotic silo with HSM tape library, petabyte-capable storage
Visualisation systems: virtual reality systems, Access Grid rooms
Staff: user support, systems support, computational tools and techniques, large-scale data collection management

Global Connectivity
[Map: global connectivity, 10 Gbps ring]

APAC Grid Deployment
APAC National Grid v1: single sign-on, data sharing
- Base: VDT (GT 2.4.3, MonALISA, Ganglia), GridSphere, SRB, OPeNDAP, Nimrod, LCG
- VO model: follows Grid3
- Uses the APAC CA
- Manually configured solutions
APAC National Grid v2: adds portals and workflow support
- Base: VDT moving to GT4, GridSphere, SRB, OPeNDAP, Nimrod, LCG
- VO model: not yet determined
- Uses national CAs
- Automatic configuration
APAC National Grid v3:
- Interoperability: align with OSG and EGEE
- Uses the AARNet3 backbone
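On the GT2-based v1 stack, single sign-on means creating one short-lived proxy from an APAC CA certificate and then submitting to any partner gatekeeper. The sketch below is illustrative only: the gatekeeper contact string is a hypothetical placeholder, and it assumes the standard Globus Toolkit 2 command-line clients (grid-proxy-init, globus-job-run) are installed and on the PATH.

```python
import subprocess

# Hypothetical gatekeeper contact string; real APAC gateway hostnames will differ.
GATEKEEPER = "gateway.example.edu.au/jobmanager-pbs"

def create_proxy(hours=12):
    """Create a short-lived proxy credential from the user's APAC CA certificate."""
    subprocess.run(["grid-proxy-init", "-valid", f"{hours}:00"], check=True)

def run_remote(command):
    """Run a simple command on the remote gatekeeper via GT2 GRAM (globus-job-run)."""
    result = subprocess.run(
        ["globus-job-run", GATEKEEPER] + command,
        check=True, capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    create_proxy()
    print(run_remote(["/bin/hostname"]))
```

The same proxy then serves every partner site that trusts the APAC CA, which is the point of the single sign-on model.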

APAC Grid Gatekeeper Machines
Each partner site has a 'gateway' machine which hosts grid front-ends to the available resources.
- Xen Virtual Machine Monitor (University of Cambridge Computer Laboratory)
- Hardware: dual Xeon 2.8 GHz, 4 GB RAM, 300 GB mirrored SCSI disk, 5 GigE network cards (1 management, 2 data VMs, 2 other VMs)
- Grid front-ends: Globus 2 (VDT 1.2.4), Globus 4 (VDT 1.4?), gLite 3, Storage Resource Broker 3.3.1, Nimrod/G
- Layout: physical hardware (CPU, disk, network) running a Linux 2.6 dom0 under the Xen hypervisor, with each grid front-end in its own guest VM (domU)
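Because each front-end runs in its own domU behind the one gateway host, a quick way to see which services a gateway actually exposes is to probe the usual default ports. A minimal sketch, assuming a hypothetical hostname and the standard Globus defaults (2119 for the GT2 GRAM gatekeeper, 2811 for GridFTP, 8443 for the GT4 web-services container):

```python
import socket

# Hypothetical gateway host; ports are the usual Globus defaults.
GATEWAY = "gateway.example.edu.au"
SERVICES = {"GT2 GRAM gatekeeper": 2119, "GridFTP": 2811, "GT4 WS container": 8443}

def is_listening(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in SERVICES.items():
        status = "up" if is_listening(GATEWAY, port) else "unreachable"
        print(f"{name:22s} port {port}: {status}")
```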

APAC National Grid Data Management Infrastructure
[Diagram: data infrastructure spanning the APAC National Facility and partner sites: ANU, ac3, CSIRO, IVEC, QPSF (JCU), SAPAC, TPAC, VPAC]
- Data transfer: RFT, GridFTP, global file system
- Data management: Globus, SRB, SRM, gLite
- Data access: OGSA-DAI, web services, OPeNDAP
- Mass data storage systems: tape-based (silos) and disk-based
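As an illustration of the data-transfer layer, the sketch below stages a file between sites with globus-url-copy, the GridFTP client shipped with the toolkit. Both URLs are hypothetical placeholders, and a valid proxy is assumed to exist already.

```python
import subprocess

def gridftp_copy(src_url, dst_url):
    """Copy a file between GridFTP endpoints (or to/from local disk) with globus-url-copy."""
    subprocess.run(["globus-url-copy", src_url, dst_url], check=True)

if __name__ == "__main__":
    # Hypothetical endpoints: pull a dataset from a partner site onto local disk.
    gridftp_copy("gsiftp://data.example.edu.au/projects/demo/sample.dat",
                 "file:///tmp/sample.dat")
```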

Delivering National Grid Services
[Diagram: research teams linked through grid services to data centres, instruments and sensor networks, and to other institutional, national and international grids]
- Grid-based portals
- Distributed computation
- Federated data access
- Remote control
- Collaboratories

Astronomy and Astrophysics
MACHO Project data:
- Largest online astronomical data set in Australia (~10 TB)
- Hosted by APAC as part of the IVO collection
- Metadata mapped to the VOTable 1.0 standard
Australian Virtual Observatory:
- Provide uniform access to key data collections: 2dFGRS, HIPASS, ATCA-OA, SUMSS, MACHO, TNO, ...
- Grids for theoretical astrophysics simulations, with portals for job configuration, submission and monitoring: MLAPM, GCD+, Zeus-MP, LensView, (x)oopic, Swift, ...
International Virtual Observatory:
- SIAP service for the ATCA Phoenix Deep Field Survey (SIAP, the Simple Image Access Protocol, is an International Virtual Observatory protocol)
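SIAP is an HTTP GET interface: the client passes a position (POS=ra,dec in decimal degrees) and an angular size (SIZE) as query parameters and the service returns a VOTable listing matching images. A minimal sketch, assuming a hypothetical endpoint URL standing in for the ATCA Phoenix Deep Field service:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical SIAP base URL; real Aus-VO service endpoints will differ.
SIAP_URL = "http://example.edu.au/siap/phoenix?"

def siap_query(ra_deg, dec_deg, size_deg=0.2):
    """Run a SIAP 1.0 positional query and return the VOTable response as text."""
    params = urlencode({"POS": f"{ra_deg},{dec_deg}",
                        "SIZE": size_deg,
                        "FORMAT": "image/fits"})
    with urlopen(SIAP_URL + params) as response:
        return response.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Illustrative coordinates only; substitute the actual field position.
    votable_xml = siap_query(ra_deg=22.0, dec_deg=-45.0)
    print(votable_xml[:500])
```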

Bioinformatics
- Accelerate progress on genome annotation for genomes of national economic significance
- Support lead discovery through molecular docking
- Data update and synchronisation services, including the BioMirror
- Grid-wide compute services for Ensembl, BLAST, RepeatMasker and Glimmer
- Grid-wide compute services for molecular docking, including support for analysis workflows

Computational Chemistry
[Diagram: chemistry portal spanning the APAC National Facility and partner sites: ANU, ac3, CSIRO, IVEC, QPSF, SAPAC, TPAC, VPAC]
- Unified grid-based portal to chemistry software
- Portal to computational chemistry software on the APAC Grid
- Uniform access to software on a computer system: Gaussian, Amber, GAMESS-US, GROMACS, MOPAC and Molpro

Earth Systems Science
Access to data products:
- Intergovernmental Panel on Climate Change scenarios of future climate (3 TB)
- Ocean colour products for the Australasian and Antarctic region (10 TB)
- 1/8 degree ocean simulations (4 TB)
- Weather research products (4 TB)
- Earth systems simulations
- Terrestrial Land Surface Data
Grid services:
- Globus-based version of OPeNDAP (UCAR/NCAR/URI)
- Server-side analysis tools for data sets: GrADS, NOMADS
- Client-side visualisation from on-line servers
- THREDDS (catalogues of OPeNDAP repositories)
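OPeNDAP lets a client subset a remote dataset over HTTP without downloading whole files, and any netCDF client built with DAP support can open a dataset URL published in a THREDDS catalogue directly. A minimal sketch, assuming a hypothetical dataset URL and the netCDF4 Python package:

```python
from netCDF4 import Dataset  # requires netCDF4 built with OPeNDAP (DAP) support

# Hypothetical OPeNDAP endpoint; real THREDDS catalogues list the actual dataset URLs.
URL = "http://data.example.edu.au/thredds/dodsC/ocean/sst_monthly.nc"

def read_subset(url, varname="sst"):
    """Open a remote dataset over OPeNDAP and read a small subset of one variable."""
    with Dataset(url) as ds:
        var = ds.variables[varname]
        # Only the requested slice is transferred over the network.
        return var[0, :10, :10]

if __name__ == "__main__":
    print(read_subset(URL).shape)
```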

Geosciences
Develop systems that support real-time steering of complex geoscience analysis. This requires:
- Workflow support for mantle convection modelling, with components running on distributed grid resources
- Portlets for compute services, including 'snark' and 'Finley'
- Hypothesis exploration through real-time ensemble management

High-Energy Particle Physics
Belle collaboration:
- KEK B-factory detector, Tsukuba, Japan
- Matter/antimatter asymmetry in B meson decays
- 45 institutions and 400 users worldwide; ~1 PB of data currently
- Australian grid for Belle: simulation and data analysis, with a data grid centred on the APAC National Facility
ATLAS experiment:
- Large Hadron Collider (LHC) at CERN, operational in 2007
- Deploying EGEE infrastructure on the APAC Grid
- WLCG Tier 2 at the University of Melbourne
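On the EGEE/gLite side, experiment jobs are described in JDL and handed to the workload management system from a user interface machine. The sketch below is illustrative only: it assumes a gLite UI with a glite-job-submit style client installed and a valid VOMS proxy for the experiment's VO, and it uses a trivial stand-in executable rather than a real ATLAS or Belle workload.

```python
import subprocess
import tempfile

# Minimal JDL for a trivial test job; real experiment jobs specify their own
# executables, input sandboxes and VO-specific requirements.
JDL = """\
Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
"""

def submit_test_job():
    """Write the JDL to a temporary file and submit it with the gLite UI client."""
    with tempfile.NamedTemporaryFile("w", suffix=".jdl", delete=False) as f:
        f.write(JDL)
        jdl_path = f.name
    # Assumes a gLite UI with the glite-job-submit client and a valid VOMS proxy.
    result = subprocess.run(["glite-job-submit", jdl_path],
                            check=True, capture_output=True, text=True)
    print(result.stdout)  # the job identifier is printed on success

if __name__ == "__main__":
    submit_test_job()
```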

APAC National Grid Status
Core services installed:
- Core services implemented: APAC CA and MyProxy, VOMRS, GT2
First applications in operational status:
- Some applications close to 'production' mode: geosciences, HEP (Belle experiment)
Systems coverage:
- Users can access all systems at APAC partners
- About 4600 processors and hundreds of TB of disk
- Around 3 PB in disk-cached HSM systems
Extension of the grid:
- Requests for service are spreading to multiple sites, leading to an affiliate model

The APAC Grid Program
- The APAC Grid Program has been active in deploying grid infrastructure in Australia
- Focussed on the needs of the application projects
- Interoperability: must work closely with international grids
- The tyranny of distance is being tamed by high-bandwidth international connections
- But we need to do more:
  - Improved international collaboration
  - More efficient deployment
  - Operations: we are just beginning

Looking Forward...
APAC 3 funding in 2007:
- Interoperability: expand engagement with GGF interoperability activities
- Operations: establish an APAC Grid Operations Centre; improve distributed team management
- Expand the user community: all users of APAC facilities become grid users; data management
- Expand the infrastructure to include major data centres (e.g. university data repositories) and facilities: telescopes, synchrotron, ANSTO, ...
- Proposal to deploy gLite infrastructure across APAC facilities: to support HEP (Australian Tier 2 federation) and to explore other user communities and collaborations with Europe and Asia