Near Future Plans for the Space Charge Team after SC-13

Presentation transcript:

Near Future Plans for the Space Charge Team after SC-13 (6/12/2013)

- Code Consolidation
  - Crashes
  - Benchmarking/Mitigation
  - Frozen Space Charge
- Computer Resources
- Next Steps

Code Consolidation I

- Raymond has found a crash of PTC-ORBIT after long runs. It seems to be related to code issues connected with MPI ➔ Harry is on the problem.
- The problem might improve with the newer MPICH2, and there is a potential speed-up. I have therefore installed the newest MPICH2 version in our space charge AFS space.
- In the same place there are now also the latest PTC-ORBIT versions for SLC5 and SLC6, with and without MPICH2 (a quick smoke test for the new stack is sketched below).
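As a quick sanity check before launching long runs against the new installation, a minimal MPI smoke test along these lines can be run on a handful of cores. This is only a sketch using the mpi4py bindings; paths, module setup, and core counts are site-specific and not prescribed here:

```python
# smoke_test_mpi.py -- verify the new MPICH2 stack before long PTC-ORBIT jobs.
# Run with, e.g.: mpirun -np 8 python smoke_test_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank contributes its rank number; the reduced sum must equal
# size*(size-1)/2 on every rank if the communication layer is healthy.
total = comm.allreduce(rank, op=MPI.SUM)
assert total == size * (size - 1) // 2, "MPI allreduce returned a wrong result"

if rank == 0:
    print(f"MPI OK on {size} ranks, MPI standard version {MPI.Get_version()}")
```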

Code Consolidation II

Our collaborations are intensifying; both Jeff & James are working hard to get the benchmarking done:

- ORBIT shows the noise Frank had detected at Fermilab, albeit less dramatically, (most likely) thanks to symmetrized distributions (the idea is sketched below).
- SYNERGIA has a couple of serious issues.
- The single-particle analysis has proven to be essential.
- All of this is done in collaboration with Giuliano.

Leonid is continuing the mitigation effort for ORBIT.
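For context, symmetrizing a macro-particle distribution is a standard way to suppress artificial noise: each sampled particle is accompanied by sign-flipped mirror copies, so the centroid and all odd moments vanish exactly rather than only statistically. A minimal illustration of the idea (this is not the actual ORBIT input, just a sketch):

```python
import numpy as np

def symmetrized_gaussian(n, sigma_x, sigma_xp, seed=0):
    """Sample n/4 particles from a Gaussian in (x, x') and add the three
    sign-flipped mirror copies, so all odd moments cancel exactly."""
    rng = np.random.default_rng(seed)
    base = rng.normal(0.0, [sigma_x, sigma_xp], size=(n // 4, 2))
    signs = ([1, 1], [-1, 1], [1, -1], [-1, -1])
    return np.concatenate([base * s for s in signs])

particles = symmetrized_gaussian(10_000, sigma_x=1e-3, sigma_xp=1e-4)
print(particles.mean(axis=0))  # zero to machine precision, not just ~1/sqrt(N)
```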

Code Consolidation III

- Our in-house code, MAD-X with space charge, is being put to the test with the RHIC case.
- It is already benchmarked in Giuliano's suite, but a more practical test is still missing.
- The setting-up phase is a mess and will have to be fixed eventually.
- An attempt will be made to get it working under MPI to achieve acceptable performance.
- After these upgrades the code should complement our set of simulation tools (the frozen-kick idea is recalled below).
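As a reminder of what the frozen model computes: the space-charge kick is evaluated analytically from a fixed charge distribution instead of a self-consistent one. For a round Gaussian beam the transverse kick has a simple closed form, used below for illustration (the general elliptical case is usually handled via the Bassetti-Erskine complex error-function expression; the actual MAD-X implementation is not reproduced here):

```python
import numpy as np

def frozen_sc_kick(x, y, sigma, K, length):
    """Transverse kick from a round, frozen Gaussian beam of rms size
    sigma, integrated over `length`; K is the generalized perveance.
    Round-beam formula: dx' = 2*K*L*x*(1 - exp(-r^2/(2*sigma^2)))/r^2."""
    r2 = x**2 + y**2
    factor = np.where(
        r2 > 0,
        (1.0 - np.exp(-r2 / (2.0 * sigma**2))) / np.maximum(r2, 1e-300),
        1.0 / (2.0 * sigma**2),  # exact r -> 0 limit
    )
    return 2.0 * K * length * factor * x, 2.0 * K * length * factor * y
```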

Computing Resources

- Our original allocation of ten 48-core machines has been reduced to eight due to our scarce usage during the last year.
- Bernd from IT has noticed a pick-up of activity during the last month. He promised to double the resources in the coming weeks if our usage keeps up.
- Raymond pointed out that our PTC-ORBIT tracking remains very slow even on the 48-core machines:
  - Harry confirms that the code will be tough to speed up.
  - SYNERGIA is much faster but is not yet operational.
- The 48-core machines cannot be combined to speed up individual runs because the communication between them is too slow (a toy scaling model is sketched below).
- A super-computer is out of reach, and we also have to consolidate our code first.
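The limitation can be made quantitative with a toy model: once the per-step communication cost across machines grows with the node count, the speed-up of a single run saturates and then reverses. All numbers below are made-up placeholders, purely to illustrate the shape of the curve:

```python
def speedup(n_nodes, t_compute=1.0, t_comm_per_node=0.05):
    """Toy model: per-step time = compute work, split across nodes, plus
    a communication term that grows with the number of nodes (e.g. the
    exchange of space-charge grids over a slow network)."""
    t_step = t_compute / n_nodes + t_comm_per_node * n_nodes
    return t_compute / t_step

for n in (1, 2, 4, 8):
    print(f"{n} node(s): speed-up {speedup(n):.2f}x")
# With these placeholder costs the gain peaks near 4 nodes and then drops.
```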

Next Steps

- We now need to outline a plan for our simulation studies for all 3 machines.
- We should start despite the fact that we still need to consolidate all our simulation tools.
- In particular, we now have to start an in-depth benchmarking of our experiments with our codes.
- In collaboration with GSI we are in the process of getting a better understanding of the theory of resonances in conjunction with space charge ➔ joint supervision of students and visits (in July Frank will go there).
- We are looking for another student to clarify how to combine PIC (self-consistent) studies with frozen space-charge codes (one conceivable scheme is sketched below).
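One conceivable shape for such a combined study, purely as a starting point for discussion: track cheaply with frozen kicks for many turns, and periodically insert a self-consistent (PIC) turn to refresh the frozen beam parameters. All function names below are hypothetical placeholders, not existing interfaces of any of our codes:

```python
import numpy as np

def rms_size(particles):
    # particles: (N, 2) array of (x, x'); the frozen model here only
    # needs the horizontal rms beam size.
    return particles[:, 0].std()

def hybrid_tracking(particles, track_frozen, track_pic, n_turns, pic_every=1000):
    """Hypothetical hybrid loop: frozen space-charge tracking,
    interrupted every `pic_every` turns by one self-consistent (PIC)
    turn that refreshes the frozen beam size. `track_frozen` and
    `track_pic` stand in for the actual codes."""
    sigma = rms_size(particles)
    for turn in range(1, n_turns + 1):
        if turn % pic_every == 0:
            particles = track_pic(particles)   # self-consistent update
            sigma = rms_size(particles)        # refresh the frozen model
        else:
            particles = track_frozen(particles, sigma)
    return particles
```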