High Energy Physics Computing Coordination in Pakistan


High Energy Physics Computing Coordination in Pakistan
Prof. Riazuddin, Director General, National Centre for Physics

Computing at NCP
One of the important activities at NCP is the Experimental High Energy Physics programme, carried out at the LHC in collaboration with CMS. It entails: high-speed networks; powerful data-processing facilities; data storage; and the services and tools required to handle the data.
NCP – National Centre for Physics; LHC – Large Hadron Collider; CMS – Compact Muon Solenoid
11/17/2018 National Centre for Physics, http://www.ncp.edu.pk

Advanced Scientific Computing
In August 2003, the Advanced Scientific Computing (ASC) Group was established at NCP with the following aims:
- Deploy and operate high-performance computing facilities.
- Establish the hardware and software user environments for high-performance computing.
- Provide a technical consultation service to all users.
- Create an international research base.
- Educate and train young professionals.
- Promote skilled personnel at international forums.

Accomplishments – Grid Computing
- CMS Production setup: 08 CPUs dedicated.
- LCG at NCP: a certified and tested LCG Grid node.
- Certification Authority (CA): accredited by the EU-Grid-PMA.
LCG – LHC (Large Hadron Collider) Computing Grid; CA – Certification Authority

Accomplishments – Linux Cluster at NCP
- 10 CPUs dedicated; 50 in the near future and 200 within two years.
- Storage currently 0.6 TB; 5 TB in the near future and 30 TB within two years.
- Access control.
- Efficient job scheduling.
TB – terabyte
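The cluster's two headline features, access control and job scheduling, can be illustrated with a toy sketch. This is not NCP's actual batch system (a real cluster of this era would run PBS or similar); the class, user names, and policy below are purely illustrative:

```python
from collections import deque

class ToyScheduler:
    """Toy FIFO scheduler with an access-control list (illustrative only)."""

    def __init__(self, cpus, allowed_users):
        self.free_cpus = cpus
        self.allowed = set(allowed_users)
        self.queue = deque()
        self.running = []

    def submit(self, user, job, cpus_needed):
        if user not in self.allowed:          # access control: reject unknown users
            raise PermissionError(f"{user} is not authorised")
        self.queue.append((user, job, cpus_needed))

    def dispatch(self):
        """Start queued jobs while CPUs are free (first come, first served)."""
        while self.queue and self.queue[0][2] <= self.free_cpus:
            user, job, n = self.queue.popleft()
            self.free_cpus -= n
            self.running.append((user, job, n))

sched = ToyScheduler(cpus=10, allowed_users={"alice", "bob"})
sched.submit("alice", "cmkin_run", 6)
sched.submit("bob", "oscar_sim", 4)
sched.dispatch()
print(len(sched.running), sched.free_cpus)  # 2 jobs running, 0 CPUs free
```

A production scheduler would add priorities, fair-share accounting, and preemption; the sketch only shows why an access-control list sits in front of the queue.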

CMS Production at NCP
- NCP has taken part in CMS production since Spring 2002.
- NCP became the Regional Centre for CMS production in Pakistan in August 2003.
- Total events produced by Pakistan: 1.38M; NCP's share of these: 0.956M.
- Total data generated for these events: almost 280 GB, of which NCP produced almost 170 GB.
- All 280 GB has been uploaded back to CERN.
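The production shares quoted above follow from a line of arithmetic (figures taken directly from the slide):

```python
# Figures from the slide: Pakistan-wide vs. NCP-only production
pk_events, ncp_events = 1.38e6, 0.956e6
pk_data_gb, ncp_data_gb = 280, 170

print(f"NCP event share: {ncp_events / pk_events:.1%}")    # ~69.3%
print(f"NCP data share:  {ncp_data_gb / pk_data_gb:.1%}")  # ~60.7%
```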

CMS Production Centres in Pakistan
Currently there are five CMS production centres in Pakistan:
- NCP (Islamabad) – also acts as the Regional Centre
- PAEC-I (Islamabad)
- PAEC-II (Karachi)
- PAEC-III (Taxila)
- NUST (Rawalpindi)

CMS Production Centres in Pakistan (cont.)
- No single institute has the resources to become a Tier-1 or Tier-2 centre.
- The concept of a federation of production centres was therefore adopted: production centres distributed across Pakistan contribute to CMS production with small local farms.
- NCP and PAEC-I now also contribute through the LCG Grid.
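The federation idea, many small farms jointly covering one production assignment, can be sketched as a proportional split of events by CPU count. The farm names and sizes below are illustrative only, and the remainder policy is a toy choice, not how CMS production assignments were actually divided:

```python
def split_assignment(total_events, farms):
    """Split a production assignment across farms in proportion to CPUs,
    giving any integer-rounding remainder to the largest farm (toy policy)."""
    total_cpus = sum(farms.values())
    shares = {name: total_events * cpus // total_cpus
              for name, cpus in farms.items()}
    remainder = total_events - sum(shares.values())
    largest = max(farms, key=farms.get)
    shares[largest] += remainder
    return shares

# Hypothetical farm sizes (CPUs)
farms = {"NCP": 8, "PAEC-I": 14, "PAEC-III": 5}
print(split_assignment(27000, farms))
```

Every event is assigned exactly once, so the shares always sum back to the original assignment regardless of rounding.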

CMS Production at NCP – Resources
Resources (CPUs + storage) dedicated to CMS by NCP and the other production centres:

Production Centre   CPUs   Storage (TB)
NCP                 08     0.64
NUST                       0.60
PAEC-I              14     0.32
PAEC-II                    0.952
PAEC-III            05     0.38
TOTAL               49     2.892

Events Produced (Summary)

Production Centre   GEN (K)   SIM (K)   Digi w/o PU   Digi with PU / Hit
NCP                 715.7     204.7     10            21.8 / 4.5
NUST                50.1      20.1      --
PAEC-I              93.7      53.7                    35 / --
PAEC-II             30.1      10.1
PAEC-III            34.2      44.2                    10 / 4
TOTAL               930.30    332.80    30            66.8 / 9.5

GEN – generation (CMKIN); SIM – simulation (OSCAR); Digi – digitization; PU – pile-up data

Events Produced (Test vs. Real)

        NCP      PAEC-I   PAEC-II   PAEC-III   NUST
Test    338.7K   38.9K    40.2K     86.4K      70.2K
Real    618K     153.5K   --        16K

Production at LCG

         Events Assigned (Million)   Produced & Updated
CMKIN    7.0                         3.3
OSCAR    5.14                        Started running

LCG at NCP
- The effort to put Pakistan on the LCG map as a Grid node started in October 2003.
- NCP organized a Grid Technology Workshop from October 20–22, 2003.
- The first-ever testbed was deployed during the workshop for the tutorial, using LCG1 tag 1.1.1.2.
- The testbed consisted of 09 machines: CE (2), SE, RB, WN (3), UI, GIIS.
LCG – LHC Computing Grid; CE – Computing Element; SE – Storage Element; RB – Resource Broker; WN – Worker Node; UI – User Interface; GIIS – Grid Index Information Server

LCG at NCP (cont.)
- 30 machines were used during the tutorial so that participants could communicate with the deployed testbed.
- The effort continued: NCP deployed LCG2 tag 2.0.0 in June 2004, moved to tag 2.2.0 in September 2004, installed tag 2.3.0 in January 2005, and finally updated to 2.3.1 in March 2005.
- NCP is now a tested and certified Grid node in Pakistan.

LCG at NCP (cont.)
- Tested and certified by the Grid Deployment Team (dteam) at CERN and added to the Grid Operations Centre (GOC) website.
- A certified LCG Grid node: the first in South Asia and the fifth in Asia.

LCG Map
[figure: world map of LCG Grid sites]

LCG at NCP – Services
- Services currently deployed by NCP for LCG2_3_0: UI, SE, CE, and WN.
- Services used from CERN machines: RB, BDII, and proxy server.
- CERN machines are used because local bandwidth is low; services such as the RB and BDII require high bandwidth.
BDII – Berkeley Database Information Index server
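A job reaches the NCP CE through the UI and the CERN RB via a JDL (Job Description Language) file. A minimal LCG-2-era job description might look like the following (the executable and file names are illustrative):

```
Executable    = "/bin/hostname";
Arguments     = "-f";
StdOutput     = "stdout.log";
StdError      = "stderr.log";
OutputSandbox = {"stdout.log", "stderr.log"};
```

From the UI it would be submitted with a command such as `edg-job-submit --vo cms job.jdl`, which hands the job to the Resource Broker for matchmaking against the CE resources published in the information system.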

PK-GRID-CA
- To become fully operational, a Grid node also needs a Certification Authority (CA), which issues digital certificates to users and hosts so that grid resources can be used in a secure environment.
- The effort in this regard started in October 2003.
- NCP produced the first Certificate Policy and Certification Practice Statement (CP/CPS) document in December 2003, which was reviewed by several members of the European Grid Policy Management Authority (EU-Grid-PMA).
CP/CPS – Certificate Policy and Certification Practice Statement; EU-Grid-PMA – European Grid Policy Management Authority

PK-GRID-CA (cont.)
- Three revisions were made in response to comments and suggestions from PMA members.
- The CA was presented in September at the 2nd meeting of the EU-Grid-PMA, held in Brussels, where NCP was formally approved as a Certification Authority.
- PK-Grid-CA has been operational since then: the first Certification Authority in Pakistan.
- For more information: www.ncp.edu.pk/pk-grid-ca
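The core of what a CA such as PK-Grid-CA does, signing a user's certificate request with the CA's private key, can be sketched with openssl. All key sizes, lifetimes, file names, and subject names below are made up for illustration; the real CA operates according to its CP/CPS, not this toy flow:

```shell
set -e
# 1. CA key pair and self-signed root certificate (toy parameters)
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
        -subj "/C=PK/O=Example Grid/CN=Toy Grid CA" -days 365
# 2. User generates a key and a certificate signing request (CSR)
openssl req -newkey rsa:2048 -nodes -keyout user.key -out user.csr \
        -subj "/C=PK/O=Example Grid/CN=Example User"
# 3. CA signs the CSR, producing the user certificate
openssl x509 -req -in user.csr -CA ca.pem -CAkey ca.key \
        -CAcreateserial -out user.pem -days 180
# 4. Anyone holding ca.pem can now verify the user certificate
openssl verify -CAfile ca.pem user.pem
```

The point of the accreditation process described above is precisely steps 1 and 3: relying parties will only trust certificates chained to a CA whose policies (key protection, identity vetting, lifetimes) have been reviewed and approved.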

Bottlenecks
- More coordination is needed.
- High-speed networking.
- Shortage of skilled manpower.

Future Plans
- Wide-area network.
- Further expansion in computing resources: computation, storage, and bandwidth.
- A grid software development group.
- Improvement in grid performance and reliability (self-healing systems).
- Use of the grid for other disciplines.

HEC, Pakistan – Research Support Programmes
HEC (Higher Education Commission) funding programmes:
- International Collaborative Research Initiation Grant Program (ICRIG)
- Research Grant Program ($35K to $100K)
- University-Industry Technology Support Program ($130K; 80/20 rule)
- Pakistan-US Science and Technology Cooperative Program
- Training programmes for technical/scientific staff and researchers
- Grants for organizing seminars, conferences, and workshops
- Travel grants for university teachers
- Foreign Faculty Hiring Program (3 months to 5 years)
www.hec.gov.pk