Computing at the High Energy Physics Lab at FIT
Patrick Ford, Jen Helsby, Richard Hoch, David Pena, Dr. Hohlmann, Dr. Mitra

Current Projects
- Cluster Computing: HEP's computer cluster
- Grid Computing: getting the cluster onto the Open Science Grid
- Simulations of Particles Through Matter: using Geant4 to model cosmic ray muons traveling through different media
- Reconstruction Algorithms: developing algorithms to reconstruct muons' passage through matter

Cluster Computing
- A popular high-performance computing solution.
- A computer cluster is a group of tightly coupled computers that work together closely, so that in many respects they can be viewed as a single computer.
- Computing clusters make up over half of the top 500 most powerful computers in the world.
[Image: System X at Virginia Tech (12.5 teraflops) [1]]

HEP Computer Cluster
- Equipment loaned by the University of Florida
- Started with 10 dual-CPU 1.0 GHz Intel Pentium servers: one as the front-end, nine as compute nodes
- Uses network-attached storage (NAS)
- Cascaded switches for expandability and redundancy

Cluster Topology

Current and Future Status
- The original front-end is still in use, but the cluster has expanded to 30 nodes
- A high-end managed switch serves as the network hub and cascades to unmanaged switches with 10 nodes each
- Future expansion will include high-end compute nodes, a ~10 TB NAS, and a better front-end
[Image: MAGNUM XV3045 NAS [2]]

HEP Computer Cluster
[Images: newest nodes; NAS 1 and 2]

Rocks
- An open-source Linux cluster distribution
- Enables end users to easily build computational clusters [3]

Rocks Kickstart Graph

Network-Attached Storage
- Also uses Rocks
- Uses RAID 5:
  - Faster writing: each hard drive needs to write only 1/3 of the data
  - Efficiency increases as the number of hard drives increases
  - Fault tolerance: if any one hard drive fails, the data on it can be reconstructed from the data on the other two drives
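The fault-tolerance bullet rests on XOR parity; a minimal sketch of the idea, purely illustrative rather than the NAS's actual RAID implementation:

```python
# RAID 5 parity sketch: with three drives, each stripe stores two data blocks
# plus one XOR parity block, so any single lost block is recoverable from the
# surviving two. The block contents here are placeholders.

def xor_block(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

d0 = b"muon data block "   # block written to drive 0
d1 = b"tomography block"   # block written to drive 1
p = xor_block(d0, d1)      # parity block written to drive 2

# Drive 0 fails: rebuild its block from the surviving drive and the parity.
rebuilt = xor_block(p, d1)
assert rebuilt == d0
```

The same idea scales to more drives: the parity block is the XOR of all data blocks in the stripe, which is why efficiency improves as drives are added.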

Condor
- Software that lets us distribute the load of a computing task over all the CPUs in the cluster
- Software of this type is called a batch job system
- Well suited to grid computing, since it can submit jobs to machines located all over the world
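A job is handed to Condor through a submit description file; a minimal hypothetical example (the executable name and arguments are placeholders, not the lab's actual jobs):

```
universe   = vanilla
executable = simulate_muons
arguments  = --events 10000
output     = job.out
error      = job.err
log        = job.log
queue
```

Running condor_submit on a file like this queues the job, and Condor matches it to an idle CPU in the pool.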

Grid Computing
- A collection of networks, software, and computers intended for shared use by organizations of people
- Resources are managed by the grid
- Users run applications as needed without worrying about where the computers are
- Well suited to organizations with many geographically distributed members who work on a common project and need shared computing resources to accomplish it

Grid Layers
I. Network layer: the underlying connectivity
II. Resource layer: data storage, databases, software repositories, and even sensors
III. Middleware, the "brains" of the grid: does all the work of connecting users' jobs to computing resources
IV. Application layer: the most diverse layer, since it includes virtually any program an end user wishes to run

Open Science Grid
- A distributed computing infrastructure used for large-scale scientific work
- Used by many universities, laboratories, and software developers
- Backed by the NSF and the U.S. Department of Energy's Office of Science
- The OSG Consortium builds and operates the OSG, with the goal of giving scientists from many fields access to shared resources worldwide

Science on OSG
Scientists from many fields use OSG: particle and nuclear physics, astrophysics, bioinformatics, gravitational-wave science, and computer science collaborations. [4]

Getting On OSG
- Requires the third layer: the middleware
- OSG's middleware is based on the Virtual Data Toolkit (VDT)
- Installation requires a package manager called Pacman
- We first installed the Integration Test Bed (ITB) client, then the Compute Element (CE) package

Getting On OSG (cont.)
- Interfacing Globus and Condor
- Installing additional packages: Managed Fork, MonALISA, and other monitoring services
- Obtaining personal and host certificates and the Certificate Authority (CA) list
- Testing and debugging the installation
- Registering with OSG

Success… partially
[Image: the Integration Test Bed map]

Particle Simulations
- Geant4 provides a toolkit for modeling the passage of many different particles through matter
- Much data can be extracted from these simulations
- Our focus is the simulation of cosmic ray muons traveling through different media. Why?

Muon Tomography
- An outgrowth of muon and proton radiography
- Provides a new way to detect threats such as nuclear weapons or fissionable material, as well as other terrorist threats (artillery shells, IEDs, etc.)
- Why muons?

Why Muons?
- Muons are relatively massive elementary particles traveling at relativistic speeds; they can penetrate tens of meters into rock and other matter before attenuating through absorption or deflection by atoms
- All naturally occurring muons on Earth come from cosmic rays; the flux is about one muon per cm^2 per minute
- Muons are deflected by Coulomb scattering, which depends on the atomic number (Z) of the material
- Benefits over other techniques:
  - muons are more penetrating than gamma rays
  - no extra radiation dose
  - fewer false alarms [5]
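The quoted flux of roughly one muon per cm^2 per minute sets how long a scan must run; a back-of-the-envelope estimate (the 50 cm x 50 cm detector area is an illustrative assumption, not a property of the actual station):

```python
# Rough event-rate estimate from the cosmic ray muon flux quoted above.

FLUX = 1.0  # muons per cm^2 per minute (approximate, at the surface)

side_cm = 50.0
area_cm2 = side_cm * side_cm        # 2500 cm^2
rate_per_min = FLUX * area_cm2      # ~2500 muons per minute through the detector

# Time needed to accumulate one million muon tracks at this rate:
minutes_for_1e6 = 1_000_000 / rate_per_min  # 400 minutes
```

This slow, fixed incoming rate is why collecting enough scattered tracks for a clear image takes a long time.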

How Does It Work?

Geant4
- A free toolkit that runs on Windows, Linux, and Mac OS X
- The current version is written in C++; earlier versions were written in Fortran
- Developed and maintained by the Geant4 collaboration, which has over 100 members worldwide [6]

Scope of Geant4
- the geometry of the system (e.g., a box)
- the materials involved (e.g., Pb, U, etc.)
- the fundamental particles of interest (e.g., electrons, muons, etc.)
- the physics processes governing particle interactions
- the generation of event data
- the storage of events and tracks
- the visualization of the detector and particle trajectories
- the capture and analysis of simulation data at different levels of detail and refinement

First Scenario
- Created a 50x50x50 cm^3 lead block in an argon atmosphere
- Bombarded it with 3 GeV muons
- Interactions included:
  - muons and electrons: ionization, knock-on electrons, multiple scattering
  - photons: absorption via the photoelectric effect, Compton scattering, pair production
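Runs like this are typically steered from a Geant4 UI macro once the geometry and physics list are defined in the application code; a hypothetical macro for this scenario (the event count is illustrative):

```
# fire 3 GeV muons at the lead-block geometry defined in the application
/run/initialize
/gun/particle mu-
/gun/energy 3 GeV
/run/beamOn 1000
```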

Second Scenario
- Interfaced the Cosmic RaY (CRY) generator to simulate cosmic ray muons
- Used the same lead box as the material to detect, but added detectors made of G10 material
- In the visualization:
  - blue: positively charged muons
  - red: negatively charged muons
  - green: photons

Future Scenarios
- Adding more detectors
- Simulating a truck carrying plywood, with small blocks of uranium hidden in the cargo area and in the engine block
- Problems:
  - small amounts of high-Z material are harder to detect
  - the engine block itself contains high-Z material, so multiple scattering will occur there as well

Reconstruction Algorithms
- Produce a 3D image from the projections
- For muon tomography, two algorithms have been prominently used:
  - Point of Closest Approach (POCA)
  - Maximum Likelihood
- We implemented the POCA algorithm

Point of Closest Approach
- Take two lines L1 and L2, parameterized as L1(s) = P0 + s·u and L2(t) = Q0 + t·v
- Define w(s,t) = L1(s) - L2(t)
- L1 and L2 are closest where |w(s,t)| is a minimum; there, w is perpendicular to both lines, i.e. w·u = 0 and w·v = 0
- Substituting w = w0 + s·u - t·v, where w0 = P0 - Q0, gives two linear equations:
  (u·u)s - (u·v)t = -u·w0
  (u·v)s - (v·v)t = -v·w0
- Setting a = u·u, b = u·v, c = v·v, d = u·w0, e = v·w0 and solving for s and t:
  s = (be - cd)/(ac - b^2)  and  t = (ae - bd)/(ac - b^2)
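The closed-form solution above can be sketched directly in code; this is a minimal illustration of the technique (the example tracks are made up, not detector data):

```python
# POCA: closest approach of an incoming and an outgoing muon track,
# each modeled as a line. NumPy is used for the dot products.
import numpy as np

def poca(p0, u, q0, v, eps=1e-12):
    """Closest approach of L1(s) = p0 + s*u and L2(t) = q0 + t*v.

    Returns the midpoint of the two closest points (taken as the
    estimated scattering vertex) together with the parameters s and t.
    """
    p0, u, q0, v = (np.asarray(x, dtype=float) for x in (p0, u, q0, v))
    w0 = p0 - q0
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b
    if denom < eps:
        s = t = 0.0  # (nearly) parallel lines: no unique closest pair
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    midpoint = (p0 + s * u + q0 + t * v) / 2.0
    return midpoint, s, t

# Incoming track heading up the z-axis, outgoing track deflected in x;
# both pass through the origin, so the estimated vertex is (0, 0, 0).
vertex, s, t = poca([0, 0, -10], [0, 0, 1], [1, 0, 10], [0.1, 0, 1])
```

Running this over every muon's incoming/outgoing track pair yields the cloud of scatter points that is then binned and colored for imaging.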

Using POCA to find Scatter Points

3D Imaging
- We also have the scattering angle of each muon
- A large scattering angle indicates a high-Z object
- When plotted, points are assigned colors according to their scattering angle [7]

Future Work
- Expanding the cluster: high-end nodes, NAS, and front-end
- Becoming a fully functional site on OSG
- Modeling more detailed scenarios using Geant4
- Improving POCA and implementing a maximum likelihood algorithm
- Using real-world data

References
1)
2)
3)
4) SG/Currently_Running_Applications
5)
6)
7) http://math.lanl.gov/Research/Publications/Docs/borozdin-2004-information.pdf