GlueX Computing
GlueX Collaboration Meeting – JLab
Edward Brash – University of Regina
December 11–13, 2003


VRVS Videoconferences
- Web-based videoconferencing site, currently free.
- Audio/video/chat services, along with desktop sharing using VNC.
- Windows XP and Linux compatibility; might even work with a Mac!
- A series of videoconferences has been held since the Glasgow meeting; we are getting fairly proficient at setting up and getting going quickly now.
- A large fraction of the work done since Glasgow has been motivated and directed via these meetings, which have proved a very useful motivator.
- Polycom systems at FSU and UConn are totally hands-free; there are plans to set up another system at Regina soon.

GlueX Computing Pre-History: The UConn Workfest
Goals for the Computing Environment:
1. The experiment must be easy to conduct.
2. Everyone can participate in solving experimental problems, no matter where they are located.
3. Offline analysis can more than keep up with the online acquisition.
4. Simulations can more than keep up with the online acquisition.
5. Production of tracks/clusters from raw data and simulations can be planned, conducted, monitored, validated, and used by a group.
6. Production of tracks/clusters from raw data and simulations can be conducted automatically with group monitoring.
7. Subsequent analysis can be done automatically if individuals so choose.

Hall D Data Model
- The boxes represent well-defined stages in the analysis path where the data can be “viewed”.
- The lines represent specific computing tasks.
- At the time of the workfest, those present agreed that it should be possible to easily create an XML description of the viewable data at all points in the analysis path.
- Note that in this schematic there is a single geometric and kinematic reconstruction package.

Hall D Data Model (HDDM)
- Each data file or I/O port of a program is associated with an XML schema that defines the data structure the program expects or produces. Any XML document that is valid according to the schema should be accepted by the program.
- The main purpose of HDDM is to simplify the programmer's task by providing automatic ways of generating schemas, and to standardize the input and output of XML data.
- Programmers are NOT obligated to use the HDDM tools to work in the GlueX software framework. They can provide their own schemas for each file or I/O port used by the program, which must then accept any XML that respects those schemas.
- New tools:
  - hddm-schema: extracts the XML metadata from an HDDM file header and generates a schema.
  - schema-hddm: reads a schema and checks it for compliance with the HDDM specification.
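
The schema-based acceptance rule described above can be sketched outside the HDDM tools themselves. The snippet below is only an illustration, assuming the lxml package and hypothetical file names (events.xsd, events.xml); it is not part of the GlueX framework.

```python
# Illustrative only: check that an XML data file conforms to a schema such as
# the one a tool like hddm-schema would generate.  File names are placeholders.
from lxml import etree

schema = etree.XMLSchema(etree.parse("events.xsd"))   # schema derived from the HDDM file header
doc = etree.parse("events.xml")                       # XML view of the event data

if schema.validate(doc):
    print("valid: any program accepting this schema should accept the document")
else:
    print(schema.error_log)                           # details of each validation failure
```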

Hall D Detector Specification
- XML-based geometry specification for the Hall D detector and beam line.
- In the near term, the need is for a single reference for the geometrical parameters used in Monte Carlo simulations. In the long term, most elements in the analysis path will need geometry information.
- The idea of using XML for the geometry specification was inspired by ATLAS. ROOT is also planning to develop XML geometry parsing tools in the near future (presented at CHEP 2003).
- Immediate problems to solve:
  1. How does one conveniently visualize the geometry?
  2. How does one test new geometries?
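
As a concrete illustration of what "a single reference for geometrical parameters" could look like in practice, the sketch below parses an XML geometry file with Python's standard library. The file name, element names, and attributes are hypothetical; the actual HDDS tags are defined by the Hall D specification, not shown in this talk.

```python
# Hypothetical sketch: pull geometrical parameters out of an XML detector
# description.  Tag and attribute names are invented for the example.
import xml.etree.ElementTree as ET

tree = ET.parse("hall_d_geometry.xml")      # placeholder file name
root = tree.getroot()

# List every detector volume with its material and position, so that Monte
# Carlo and reconstruction code read the same numbers from one place.
for volume in root.iter("volume"):
    name = volume.get("name")
    material = volume.get("material", "unknown")
    z = float(volume.get("z", "0.0"))
    print(f"{name}: material={material}, z={z} cm")
```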

hdds-root
- ROOT provides a suite of tools for geometry visualization, and there is a lot of development currently going on in this area, driven by ATLAS and other projects.
- The standard ROOT distribution includes a utility, g2root, which translates a GEANT3 geometry description into a macro that recreates the geometry within ROOT.
- hdds-root generates this ROOT macro directly from the XML geometry description.
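
To give a feel for how such a generated macro might be used, here is a minimal sketch assuming PyROOT is installed; the macro file name hddsroot.C and the function name hddsroot() are assumptions for the example, not names quoted in the talk.

```python
# Sketch only: load a geometry macro produced by hdds-root/g2root and draw it.
# "hddsroot.C" and "hddsroot" are assumed names, not taken from the talk.
import ROOT

ROOT.gROOT.ProcessLine(".L hddsroot.C")   # load the generated geometry macro
ROOT.gROOT.ProcessLine("hddsroot();")     # run it to build the ROOT geometry

# If the macro created a TGeoManager, draw the top volume in ROOT's
# (currently rudimentary) geometry browser discussed on the next slide.
if ROOT.gGeoManager:
    ROOT.gGeoManager.GetTopVolume().Draw()
```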

Future Developments
- The ROOT geometry browser is currently somewhat rudimentary. We plan to develop a new browser that has more functionality and is more tailored to our specific needs in GlueX. At the same time, we will closely monitor the ROOT XML development effort.
- Development of “hits” viewing within ROOT: HDGeant HDDM output -> XML output -> ROOT visualization of hits.
- Development of a geometrical reconstruction package – ROOT-based?

OpenShop cluster service: present architecture (2002)
[Architecture diagram: client browser with GUI applet, web server, project manager, projects 1-3, job scheduler, job server, switchboard, job queue, job instances on slave nodes; text/html/xml exchange within a project-wide NFS domain; access via unix shell, ssh shell, http cgi, or other.]

OpenShop cluster service: architecture rev. 1 (2003)
[Architecture diagram: same components as above, now with web servers in front of the client browser/GUI applet and a project-branch NFS domain; project manager, projects 1-3, job scheduler, job server, switchboard, job instances on slave nodes; access via unix shell, ssh shell, http cgi, or other.]

OpenShop cluster service: architecture rev. 2 (2004)
[Architecture diagram: as in rev. 1, with the addition of a web-services interface: a web client talks to a web server via SOAP, alongside the browser-based GUI applet and unix shell/ssh/http cgi access; project manager, projects 1-3, job scheduler, job server, switchboard, job instances on slave nodes; text/html/xml exchange within a project-branch NFS domain.]
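
Revision 2 introduces a SOAP web-services path beside the browser GUI applet. The sketch below only illustrates that idea: the endpoint URL, XML namespace, and submitJob operation are invented placeholders, not part of the OpenShop design shown here.

```python
# Purely illustrative: a web client submitting a job request over SOAP.
# Endpoint, namespace, and operation names are invented placeholders.
import urllib.request

SOAP_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <submitJob xmlns="urn:openshop:jobs">
      <project>project1</project>
      <script>run_simulation.sh</script>
    </submitJob>
  </soap:Body>
</soap:Envelope>"""

req = urllib.request.Request(
    "http://cluster.example.edu/openshop/soap",   # placeholder endpoint
    data=SOAP_BODY.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "urn:openshop:jobs#submitJob"},
)
with urllib.request.urlopen(req) as response:
    print(response.read().decode())               # reply relayed by the switchboard
```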

A FileSharing Service with GT3
- Version 3 of the Globus Toolkit includes a great many new developments:
  - simple installation – much more “ready for prime time”
  - easy to add new web services
  - the core release (at least) is entirely Java-based, giving multiple-architecture support
- A FileSharing service can be a very useful concept within the context of coordinating simulation efforts from multiple clusters and/or sites.