User Forum 2006. Francisco Castejón, as coordinator of NA4-Fusion: SW-Federation (CIEMAT, BIFI, UCM, INTA, Spain), Russian Federation (Kurchatov Institute, Russia), CEA (France), ENEA (Italy), Culham (UK), KISTI (Korea).


Fusion activities in the GRID (NA4-EGEE-II). Francisco Castejón, as coordinator of NA4-Fusion: SW-Federation (CIEMAT, BIFI, UCM, INTA, Spain), Russian Federation (Kurchatov Institute, Russia), CEA (France), ENEA (Italy), Culham (UK), KISTI (Korea).

Outline
- Motivation: Fusion on the GRID.
- Data storage and handling.
- Fusion deployment and VO setup.
- Applications: computing in plasma physics.
- Future applications in the Grid.
- Final remarks.

Motivation
- Large nuclear fusion installations involve international cooperation among several institutes.
- They generate ~1-10 GB/s; less than 30% of the data goes into processing.
- Distributed data storage and handling are needed.
- Massive distributed calculation is a new way of solving problems (physics problems still without solution).
- The fusion community (science and technology) needs new IT approaches to increase research productivity.

JET pulse cycle (diagram): pulse preparation, simulation and modelling, real-time control, data acquisition, data analysis. Context: international project, remote participation, IT security, nuclear registered site, data management, supercomputers.

NA4-Catania, 2006. ITER: making decisions in real time! Data acquisition and storage (Grid, supercomputers). Data analysis and reduction: artificial intelligence, neural networks, pattern recognition. Simulations: large codes on different platforms (Grid, supercomputers). Decision for the present/next shot. One half-hour shot every hour and a half: decisions in real time.

ITER partners: distributed participation. Data access. Remote control rooms?

DATA HANDLING
Storage: large data flux: ~10^4 sensors × kHz-range sampling = 1-10 GB/s of raw data; × 0.5 h ≈ 3 TB per ITER shot, every 1.5 h. Supercomputing and grid computing -> data storage: scratch and permanent.
Data access and sharing in large cooperative experiments: a unified representation of the experimental data in native storage that the LCG-2/gLite CE middleware can understand is needed (to get and put data files from and into the storage).
Storage should also allow some basic processing: neural networks, clustering…
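As a back-of-the-envelope check of these figures, the sketch below recomputes the data volumes. The sensor count is the slide's ~10^4; the sampling rate and bytes per sample are our own assumptions, chosen so the result lands in the quoted 1-10 GB/s range (the slide does not state them).

```python
# Rough reconstruction of the data-volume estimate above. SAMPLE_RATE_HZ
# and BYTES_PER_SAMPLE are illustrative assumptions, not slide figures.

SENSORS = 10_000             # ~10^4 sensors (from the slide)
SAMPLE_RATE_HZ = 100_000     # assumed: 100 kHz-class sampling
BYTES_PER_SAMPLE = 2         # assumed: 16-bit samples

rate_bps = SENSORS * SAMPLE_RATE_HZ * BYTES_PER_SAMPLE   # bytes per second
shot_bytes = rate_bps * 0.5 * 3600                       # half-hour shot

print(f"raw rate: {rate_bps / 1e9:.1f} GB/s")
print(f"per shot: {shot_bytes / 1e12:.1f} TB")
```

With these assumptions the raw rate is ~2 GB/s and a half-hour shot yields a few TB, consistent with the slide's order of magnitude.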

Communications. Remote participation tools: data access, local visualization, video conferences and chats, remote control. SECURITY & ROBUSTNESS.

PARTNERS and resources for the VO
- SW Federation: CIEMAT, BIFI, UCM, INTA (Spain).
- Kurchatov Institute (Russia).
- Culham Laboratory, UKAEA (UK).
- KISTI (South Korea).
- ENEA (Italy).
- CEA (France).
Experience in using and developing fusion applications; experience in porting applications and developing grid technologies. -> Connection with EELA (some fusion partners: Brazil, Mexico, Argentina).

VO Deployment
- Present: CIEMAT: 27 KSpecInts; BIFI: 8 KSpecInts; INTA: 6 nodes.
- Within less than 6 months: JET: 38 KSpecInts; BIFI: 32 KSpecInts; CEA-Cadarache? KISTI? INTA? ENEA?
- Beginning of 2007: JET: 32 additional cores; BIFI: 32 additional cores; CIEMAT? CEA-Cadarache? (second phase already committed).

COMPUTING in the GRID: present applications
- Applications with distributed calculations: Monte Carlo, separate estimates, …
- Multiple ray tracing: e.g. TRUBA.
- Stellarator optimization: VMEC.
- Transport and kinetic theory: Monte Carlo codes.

Multiple ray tracing: TRUBA
Single ray (1 PE): Hamiltonian ray-tracing equations.
Beam simulation: a bunch of rays with the beam waist far from the critical layer ( rays), or a bunch of rays with the beam waist close to the critical layer ( rays) × ( wave numbers) ≈ 10^5 rays: a GRID problem.

TRUBA: multiple ray tracing
Different results with the two approximations. (Also a useful tool for finding the optimum launching position in complex devices.)
TRUBA for EBW: a collaboration between IOFAN and CIEMAT. Useful for all institutes with EBW heating (Culham, Princeton, Greifswald, CIEMAT, …).

TRUBA: multiple ray tracing
TRUBA for EBW:
- Cylinder geometry: a single non-relativistic ray (tens of seconds).
- Real geometry in TJ-II, coming from a supercomputer (VMEC): a single non-relativistic ray (about 18'); a single relativistic ray (about 40').
- Some problems with geometry libraries.
- See: J. L. Vázquez-Poletti, “Massive Ray Tracing in Fusion Plasmas on EGEE”, User Forum, 2006.

Stellarator optimization: choosing the best concept
TJ-II: R = 1.5 m; a = 0.2 m; M = 4; l = 1.
HSX: R = 1.2 m; a = 0.15 m.
LHD: R = 3.75 m; a = 0.65 m; M = 10; l = 2.
CHS: R = 1 m; a = 0.2 m; M = 8; l = 2.

Optimised stellarators QPS and NCSX: supercomputer optimization (figures: NCSX, QPS).

The importance of stellarator optimization
- Design of new plasma physics/fusion devices under complex requirements.
- OPTIMIZATION NECESSITY BASED ON KNOWLEDGE OF STELLARATOR PHYSICS.
- Many labs are looking for the optimized device.

Stellarator optimization in the Grid
- Many different magnetic configurations are operating nowadays.
- The coils producing the field that confines the plasma can be optimised numerically by varying the field parameters: 120 Fourier parameters are varied with VMEC (Variational Moments Equilibrium Code).
- Every variant is computed on a separate processor (~10').

Optimization criteria: target functions
- Neoclassical transport.
- Bootstrap current.
- Equilibrium vs. plasma pressure.
- Stability (ballooning, Mercier, …).
A genetic algorithm detects the optimum configuration for the given criteria; the target functions can be modified. (Figure: particle trajectories in W7-X.)
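A genetic search of this kind can be sketched as below. This is a toy illustration, not the actual optimisation: the quadratic target function stands in for the real physics targets (neoclassical transport, bootstrap current, stability), and the population sizes are arbitrary.

```python
import random

random.seed(1)

N_PARAMS = 120           # Fourier coefficients varied per configuration (slide)
POP, GENERATIONS = 30, 40

def target(config):
    # Toy stand-in for the physics target functions: pretend the optimum
    # configuration is the all-zeros vector, so we minimise sum of squares.
    return sum(c * c for c in config)

def mutate(config, scale=0.1):
    return [c + random.gauss(0.0, scale) for c in config]

# Random initial population of candidate configurations.
population = [[random.uniform(-1.0, 1.0) for _ in range(N_PARAMS)]
              for _ in range(POP)]

for _ in range(GENERATIONS):
    # In the real setting each candidate evaluation is an independent
    # VMEC run (~10') on its own grid node; here it is just the toy target.
    population.sort(key=target)
    elite = population[:POP // 2]                # keep the best half
    population = elite + [mutate(random.choice(elite))
                          for _ in range(POP - POP // 2)]

best = min(population, key=target)
print(f"best target value after {GENERATIONS} generations: {target(best):.2f}")
```

The elitist step keeps the best candidates unchanged, so the best target value never worsens between generations; that is what makes the embarrassingly parallel evaluation on the grid safe to interleave with selection.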

VMEC on the Kurchatov GRID
- LCG-2-based Russian Data Intensive Grid consortium resources.
- About cases computed (about were not VMEC-computable, i.e. no equilibrium).
- Each case took about 20 minutes.
- Up to 70 simultaneous jobs running on the grid.
- See: V. Voznesensky, “Genetic Stellarator Optimisations in Grid”, User Forum, 2006.

Kinetic transport
- Following independent particle orbits.
- Monte Carlo techniques: particles distributed according to experimental density and ion-temperature profiles (Maxwellian distribution function).
- A SUITABLE PROBLEM FOR CLUSTER AND GRID TECHNOLOGIES.
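The Monte Carlo initialisation step can be sketched as follows. The 100 eV ion temperature and the parabolic density profile are invented placeholders for the experimental profiles mentioned above.

```python
import math
import random

random.seed(0)

E_CHARGE, AMU = 1.602e-19, 1.661e-27   # SI constants

def maxwellian_component(T_eV, mass_amu):
    # One Cartesian velocity component of a Maxwellian at temperature T (eV):
    # a Gaussian with thermal spread v_th = sqrt(e*T/m).
    v_th = math.sqrt(E_CHARGE * T_eV / (mass_amu * AMU))
    return random.gauss(0.0, v_th)

def sample_rho():
    # Rejection sampling of the launch radius from a made-up parabolic
    # density profile n(rho) = 1 - rho^2 (placeholder for measured profiles).
    while True:
        rho = random.random()
        if random.random() < 1.0 - rho * rho:
            return rho

# 1000 protons at an assumed 100 eV ion temperature.
particles = [(sample_rho(),
              tuple(maxwellian_component(100.0, 1.0) for _ in range(3)))
             for _ in range(1000)]

mean_vx = sum(v[0] for _, v in particles) / len(particles)
print(f"{len(particles)} particles initialised, <v_x> = {mean_vx:.0f} m/s")
```

Each particle is then followed independently, which is exactly why the problem distributes so well over cluster and grid nodes.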

Kinetic transport. Example of an orbit in the real 3D TJ-II geometry (single PE): ~1 GB of data, 24 h × 512 PEs. Distribution function of the parallel velocity at a given position (data analysis).

Kinetic transport
- No collisions: 0.5 ms of trajectory takes 1 s of CPU.
- Collisions: 1 ms of trajectory takes 4 s of CPU.
- Particle life: ms; a single particle takes ~10 min.
- Necessary statistics for TJ-II: 10^7 particles.
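From the figures above, a quick budget shows why grid-scale resources are needed. Only the ~10 CPU-minutes per particle and the 10^7-particle target come from the slide; the node counts below are hypothetical grid sizes for illustration.

```python
# CPU budget implied by the figures above. The per-particle cost and the
# statistics target come from the slide; the node counts are hypothetical.

N_PARTICLES = 10**7
MIN_PER_PARTICLE = 10                    # ~10 CPU-minutes per particle

total_cpu_min = N_PARTICLES * MIN_PER_PARTICLE
total_cpu_years = total_cpu_min / (60 * 24 * 365)
print(f"serial cost: ~{total_cpu_years:.0f} CPU-years")

for nodes in (512, 5000):                # hypothetical grid sizes
    days = total_cpu_min / nodes / (60 * 24)
    print(f"on {nodes:5d} nodes: ~{days:.0f} days wall-clock")
```

The serial cost is on the order of 190 CPU-years, so only hundreds to thousands of concurrent grid jobs bring the wall-clock time into a usable range.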

Kinetic transport: the application
Read from disk:
- Input file (“input.lis”): number of files; number of trajectories in each file (may be divided in blocks inside the file; total statistics = number of files × number of trajectories per file); length of each trajectory; integration algorithm and time discretization; number of measurements in time; temperature of the background electrons and ions; random seed; initial distribution (in space and velocity space); model used, including or not: collisions with background ions, electric field, T-density dependence.
- Starting point according to a (space) distribution function (“perfil_densidad.in”).
- Magnetic configuration (~20 MB file -> stored at the Fusion VO catalog) (“42_100_68.c3d”): magnetic field and derivatives; magnetic coordinates -> electric field; distance to the vacuum chamber.
- Measurement times (“tiempos.lis”).

Written on disk:
- “Raw” data (“OUT????.DAT”): “pictures” of the plasma at fixed times (persistence, total energy, kinetic energy, components of the velocity, …); “points” of escape of the particle from TJ-II (according to several different criteria); ~0.9 KB/trajectory (typically 1000 trajectories/file -> ~900 KB); written every ~1 hour.
- Histograms (“OUT????.HIS”): runtime histograms of several magnitudes (energies, components of velocity, fluxes, …) as a function of the spatial coordinates (rho, phi, theta); ~0.16 KB/trajectory (typically 1000 trajectories/file -> ~160 KB); written every ~1 hour.
- Debug (“RUNTIME.DAT”): measurements at the end of the trajectories, just as a check; ~0.2 KB per OUT????.DAT file; updated with every new OUT????.DAT (typically 1000 files -> ~200 KB).
- Individual trajectories (“ty_??_??_??.dat”): complete information on each trajectory (time, position, velocity, …); from ~10 MB to ~GB per trajectory, depending on the length of the trajectory -> just for debugging.
Total disk space = ~1.3 GB.
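As a sanity check on the per-trajectory sizes, the sketch below totals them for a hypothetical run; the 10^6-trajectory count is our assumption, and the slide's ~1.3 GB total also covers debug files and per-file overheads.

```python
# Totals the per-trajectory output sizes quoted above.
# N_TRAJ = 10^6 is an illustrative assumption, not a figure from the slide.

N_TRAJ = 10**6
RAW_KB_PER_TRAJ = 0.9    # "raw" OUT????.DAT data
HIST_KB_PER_TRAJ = 0.16  # OUT????.HIS histograms

total_kb = (RAW_KB_PER_TRAJ + HIST_KB_PER_TRAJ) * N_TRAJ
total_gb = total_kb / 1024**2

print(f"~{total_gb:.2f} GB for {N_TRAJ:,} trajectories (debug files excluded)")
```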

COMPUTING in the GRID: future applications
- EDGE2D application for tokamaks.
- Transport analysis of multiple shots (typically 10^4 shots) or predictive transport with multiple models, e.g. ASTRA: CIEMAT (Spain) + IPP (Germany) + Kurchatov (Russia) + EFDA (EU) + …
- Neutral particle dynamics, EIRENE: CIEMAT (Spain) + IPP (Germany).

JET, flagship of worldwide fusion: the EDGE2D equilibrium code.

Cross-section of present EU D-shaped tokamaks compared to the ITER project. EDGE2D: determine the plasma shape from measurements (plasma current, pressure, magnetic field, …).
- The EDGE2D code solves the 2D fluid equations for the conservation of energy, momentum and particles in the plasma edge region.
- Ions, electrons and all ionisation stages of multiple species are considered.
- Interaction with the vessel walls is simulated by coupling to Monte Carlo codes, which provide the neutral ion and impurity sources.
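To make the conservation-equation statement concrete, here is a minimal 1D analogue: an explicit finite-difference particle balance dn/dt = D d²n/dx² + S with absorbing walls. All numbers are illustrative; EDGE2D itself solves the full 2D multi-species system.

```python
# Minimal 1D analogue of a fluid particle-conservation equation:
# explicit finite differences for dn/dt = D * d2n/dx2 + S,
# with absorbing walls. All numbers are illustrative, not EDGE2D inputs.

NX, DX, DT, D = 50, 0.01, 1e-5, 1.0      # D*DT/DX^2 = 0.1 (stable, < 0.5)
n = [0.0] * NX
source = [1.0 if i == NX // 2 else 0.0 for i in range(NX)]  # localised source

for _ in range(2000):
    new = n[:]
    for i in range(1, NX - 1):
        diffusion = D * (n[i + 1] - 2.0 * n[i] + n[i - 1]) / DX**2
        new[i] = n[i] + DT * (diffusion + source[i])
    new[0] = new[-1] = 0.0               # crude absorbing-wall boundary
    n = new

print(f"peak density {max(n):.4f} at cell {n.index(max(n))}")
```

The density peaks at the source cell and is drained at the walls, the same balance of sources, transport and wall losses that the real edge codes resolve in 2D with many species.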

Massive transport calculations. For instance: enhanced heat confinement in TJ-II, with lower heat diffusivity for low electron density and high absorbed power density. A different case runs on every PE.

EIRENE code. Trajectory of a He atom in TJ-II, vertical and horizontal projections: it starts at the green point and is absorbed in the plasma by an ionization process. The real 3D geometry of the TJ-II vacuum chamber is considered.

EIRENE code. Two parts: 1) following trajectories (totally distributed) -> GRID; 2) a reduction to put it all together. The EIRENE code comes from IPP (Jülich, Germany) and is extensively used by the fusion community. (Figures: radial profiles of He and H atoms in TJ-II plasmas; an average over every magnetic surface has been done.)
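The two-phase pattern (independent trajectories, then a reduction) can be sketched as a map/reduce. The random-walk "physics" and all numbers here are placeholders, not the EIRENE model.

```python
import random

# Sketch of the two-phase pattern above: trajectories are followed
# independently (the "map": one grid job each), then combined into
# per-magnetic-surface counts (the "reduce").

N_SURFACES = 10

def follow_trajectory(seed):
    # Stand-in for one neutral-atom trajectory: returns the index of the
    # flux surface (0 = core .. N_SURFACES-1 = edge) where it is ionised.
    rng = random.Random(seed)
    rho = 1.0                            # launched at the edge
    while rng.random() > 0.3:            # toy per-step ionisation probability
        rho = max(0.0, rho - 0.2 * rng.random())
    return min(int(rho * N_SURFACES), N_SURFACES - 1)

# Map phase: each trajectory is an independent job (here, a seeded call).
deposits = [follow_trajectory(seed) for seed in range(5000)]

# Reduce phase: histogram per surface -> radial deposition profile.
profile = [0] * N_SURFACES
for s in deposits:
    profile[s] += 1

print("radial deposition profile:", profile)
```

Because each trajectory depends only on its own seed, the map phase scatters perfectly across grid nodes and only the small per-surface histograms need to be gathered.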

Final remarks
- GRID technologies will enhance fusion research: computing and data handling.
- GRID technologies will gain visibility when applied to large fusion experiments (like ITER).
- Demonstration effect: if Fusion-Grid is successful, GRID technologies will be extensively used by the fusion community in the future.
- THE FIRST APPLICATIONS ARE RUNNING IN THE GRID.
