1
Plans and status of the European Task Force on Integrated Tokamak Modelling
Presented by: Pär Strand
TF Leader: P. Strand; Deputies: L-G. Eriksson, M. Romanelli
EFDA CSU Contact Person: K. Thomsen
ITPA CDBM Meeting, Naka, October 2007
2
ITM Scope
(EFDA(03)-21/4.9.2, June 24th, 2003, Executive Summary)

The aim of the Task Force is to:
- co-ordinate the development of a coherent set of validated simulation tools
- benchmark these tools on existing tokamak experiments

The ultimate aim is to provide a comprehensive simulation package for ITER plasmas. The remit of the Task Force would extend to the development of the necessary standardized software tools for interfacing code modules and for accessing experimental data. In the medium term, the Task Force's work would support the development of ITER-relevant scenarios in current experiments, while in the long term it would aim to provide a validated set of European modelling tools for ITER exploitation.
3
ITM Milestones in the EFDA workplan

2008 - Start of the Gateway for IM and deployment of the code platform
  - Implies a complete set of data structures and associated tools
  - A fully operational portal/workflow configuration
  - Major code releases from all Integrated Modelling Projects
2009 - Extended set of platform tools forming a predictive core physics capacity for ITER
  - Production activities on local clusters and the grid
2010/11 - Whole device modelling capability including comprehensive core-edge coupling and first-principles elements
  - Aiming towards Broader Approach IFERC-level computations
4
Project Leadership 2007-    NB: All tasks remain open for participation!

Project | Name | Leadership
ISIP | Infrastructure and Software Integration Project | B. Guillerminet (CEA), G. Manduchi (RFX)
IMP#1 | Equilibrium Reconstruction and Linear MHD Stability | G. Huysmans (CEA), C. Konz (IPP)
IMP#2 | Non-linear MHD and Disruptions | M. Ottaviani (CEA), S. Sharapov (UKAEA)
IMP#3 | Transport Code and Discharge Evolution (renewed for 2008 WP) | D. Coster (IPP); D. Kalupin (FZJ), V. Basiuk (CEA), G. Pereverzev (IPP)
IMP#4 | Transport Processes and Micro Stability (renewed for 2008 WP) | B. Scott (IPP); Vacant
IMP#5 | Heating, Current Drive and Fast Particles (renewed for 2008 WP) | T. Hellsten (VR); Y. Peysson (CEA), F. Zonca (ENEA)
ITER (Task) | Predictive ITER modelling and scenario development (review in November, 2008 WP) | V. Parail (UKAEA)
5
Three Tiered Approach: Portal + Platform + Resources

- The portal becomes more important in defining access to resources (certificates)
- Data access, application scheduling and resource allocation through the platform
- Visualization, monitoring and steering through the platform

[Diagram: USER connects via the Portal to the Workflow/Code Platform, which in turn reaches the resources: Grid, HPC, the "Gateway", local clusters, local data and external data]

ITM tools: standardized data structures, data access, interface definitions, Gateway (2008), Workflow (Kepler)
6
The Gateway: general layout (user entry point; provided by ENEA, starting January 2008)

- Shared Storage Data Area: resources to store large amounts of data generated by simulation codes and originating from experimental data of various fusion devices (~100 TB)
- Servers and infrastructure: master node, code platform server, fileserver, etc.
- Computing resources: a farm of worker nodes to provide the Gateway elements (16-32 nodes, 64-128 CPU cores)
- Software: the operating environment (system operation, distributed filesystem, resource management system, authentication system, backup)
- Access to the CRESCO HPC centre

Technology layout (schematic): Linux; distributed/parallel filesystems; single sign-on authentication; resource management system; SAN switches (GE/FC/IB); RAID-6 storage arrays; data servers and worker nodes, 32/64-bit dual-CPU/dual-core Xeon/Opteron (rackmount/blade), connected via GE/FC/IB. (GE: GigaEthernet, FC: Fibre Channel, IB: InfiniBand)
7
Components - Infrastructure

Abstracted data structures
- Description of a "complete" set of data for describing plasma operation and simulations
  - Abstracted through XML schemas
  - Unambiguous description with agreed sign and other conventions; SI units (with eV)
- Consistent Physical Objects (CPO)
  - Groupings of related data, the basis for code interfaces
  - Serializations
- Transformations providing
  - Language-specific implementations of CPOs (F90, C++, ...)
  - Database structures

Access and storage
- Universal Access Layer (UAL) providing an invariant API based on CPOs
- Plug-in backend: MDS+ currently being implemented (with implications on the data structures); HDF5 considered as a next step
- Used in the workflow tool to connect modules

Workflow orchestration
- Kepler tools
- Integration tools (ISE, JNI editor, ...)
8
Data structures: Consistent Physical Objects

Full description of a tokamak: physics quantities + subsystem characteristics + diagnostic measurements.

"Object based" data structure with a high degree of organisation: several subtrees corresponding to "Consistent Physical Objects" (avoiding flat structures with long lists of parameter names).
- Subsystem (e.g. a heating system or a diagnostic): contains structured information on the hardware setup and the data measured by / related to this object.
- Code results (e.g. a given plasma equilibrium, or the various source terms and fast-particle distribution function from an RF code): contain structured information on the code parameters and the physics results.

Codes communicate by exchanging CPOs only (ensuring data consistency).

Programming-language flexibility through recent software technologies: the database structure is defined using XML schemas.
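To illustrate the CPO idea, the sketch below shows what a language-specific (Fortran) rendering of a hypothetical magnetics CPO could look like, grouping the hardware setup and the measured data of one subsystem into a single object. The type and field names here are invented for illustration; the real types are generated automatically from the XML schemas.

    ! Hypothetical CPO layout in Fortran (illustrative only; the actual
    ! ITM types are generated from the XML schemas, not written by hand).
    module example_cpo_types
      implicit none
      ! Hardware setup of the subsystem (static description)
      type type_magnetics_setup
         real(8), dimension(:), pointer :: r, z            ! probe positions [m]
         character(len=32), dimension(:), pointer :: name  ! probe names
      end type type_magnetics_setup
      ! Data measured by / related to the subsystem
      type type_magnetics_data
         real(8), dimension(:), pointer :: bpol            ! poloidal field at probes [T]
         real(8) :: time                                   ! time slice [s]
      end type type_magnetics_data
      ! The CPO groups setup and measurements into one consistent object
      type type_magnetics
         type(type_magnetics_setup) :: setup
         type(type_magnetics_data)  :: measure
      end type type_magnetics
    end module example_cpo_types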
9
Machine Descriptions

First versions available for JET, ASDEX, Tore Supra
- How to verify correctness? (none is officially stamped)
- Part of the data structures

The ITM will seek formal collaborations to expand the descriptions (ITER partners) and to fill them with data (international experiments; an ITPA role?)
10
Working with the ITM data structure

The physics code developer does NOT need to know anything about XML. The developer must only:
- make the code comply with the logic of the CPOs and the hierarchy of data inside CPOs
- provide an accurate version release and documentation of the code
- provide the list of arguments (CPOin, CPOout, number of time slices needed for each of them) and the code-specific parameters, in a format to be defined

ISIP support then wraps the code into the framework and links it to the access library (Universal Access Layer, UAL). There are no calls to the UAL inside the physics code.
11
Example of an ITM code (Fortran): I/O

The code must have its I/O as:

    subroutine physics_module(CPOin1, ..., CPOinN, CPOout1, ..., CPOoutM)
    use euitm_schemas   ! contains the type definitions of all CPOs

Declarations of CPO variables must be as:

    type (type_equilibrium) :: my_equilibrium             ! for one time slice
    type (type_equilibrium), pointer :: my_equilibrium(:) ! if the code handles several time slices at once

(type_equilibrium is a derived type defined in euitm_schemas.f90, which is generated dynamically from the schemas, but the physics user may ignore this completely.)
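For illustration, a code handling several time slices would then allocate the CPO array itself before filling it; a minimal hedged sketch, assuming only that euitm_schemas provides type_equilibrium as stated above:

    ! Sketch: handling several time slices of an equilibrium CPO.
    program time_slice_sketch
      use euitm_schemas                      ! generated CPO type definitions
      implicit none
      type (type_equilibrium), pointer :: my_equilibrium(:)
      integer :: n_slices, i
      n_slices = 10
      allocate(my_equilibrium(n_slices))     ! one CPO instance per time slice
      do i = 1, n_slices
         ! ... fill my_equilibrium(i) from the physics calculation for slice i ...
      end do
      deallocate(my_equilibrium)
    end program time_slice_sketch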
12
Code adaptation

Data access through the UAL:
- Independent of the storage
- The framework does not need to know the details of our data

Principles:
- Data transfer through a data server (in-memory for the simulations)
- Advantages: solves the problem of mixing languages; allows parallel computations

Small changes to the code: I/O goes through the UAL
- Read: euitmget()
- Write: euitmput()

Structure of the ISIP wrapper:
    call euitmget("CPO name", CPOin)
    call user_code(CPOin, CPOout)     ! uses the ITM structure
    call euitmput("CPO name", CPOout)

Data access is separated from the computational part.

[Diagram: codes (EFIT, ..., HELENA, ..., MISHKA, ...) connected through the UAL to the in-memory data server]
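A minimal sketch of such an ISIP-generated wrapper, using the euitmget/euitmput calls named above around the physics module from the previous slide, is shown below. The exact argument lists are illustrative; it assumes linking against the UAL library and the user's physics code.

    ! Illustrative ISIP-style wrapper (sketch only; the real wrapper is
    ! generated by ISIP and the actual euitmget/euitmput signatures may differ).
    program wrap_physics_module
      use euitm_schemas                            ! CPO type definitions
      implicit none
      type (type_equilibrium) :: equil_in          ! input CPO, one time slice
      type (type_equilibrium) :: equil_out         ! output CPO, one time slice

      call euitmget("equilibrium", equil_in)       ! read input CPO from the data server
      call physics_module(equil_in, equil_out)     ! user physics code, no UAL calls inside
      call euitmput("equilibrium", equil_out)      ! write result CPO back to the data server
    end program wrap_physics_module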
13
Interfaces to physics modules

Data exchange between different codes/modules will be based on the Universal Access Layer.
14
[Example workflow: the SOLOVIEV, HELENA and MISHKA codes connected through the in-memory data server, with 2D and 3D plot actors]
15
Current Status and 2008 Activities

[Diagram: users log in via their laboratory to a portal with simulation editor, workflow editor and workflow engine; codes (transport, heating, ...) access data through the UAL (C/C++, F95, Java, ...); the Gateway connects to data servers (MDS+, publish-subscribe, HDF5), clusters (JET, IPP, ...), the EGEE grid and HPC resources; supporting applications include private data (XML files, ...), data-mining tools, code repositories (Mercurial, ...), post-processing (Scilab, VisIt, ROOT, ...) and catalogues]

Data structures almost complete: IMP#1 (ready), IMP#3 (some components missing, essentially ready), IMP#5 (being finalized). IMP#2 and IMP#4 need code-dependent additions.
16
"Components" - Physics

Physics activities are organized in five Integrated Modelling Projects (IMPs):
- Equilibrium Reconstruction and Linear MHD Stability
- Non-linear MHD and Disruptions
- Transport Code and Discharge Evolution
- Transport Processes and Micro Stability
- Heating, Current Drive and Fast Particles

The IMPs are in charge of coordinating the needed physics development and of implementing the tools into the ITM framework. The "Transport code" project, in addition to its own direct development needs, also acts as an integrator towards the whole device modelling task.

The IMPs should deliver standalone (but fully embedded) state-of-the-art tools for physics exploration on current devices as well as for ITER. In addition, they should deliver validated modules for the integration projects, providing the ITM with a hierarchy of modelling capacities of different physics fidelity.
17
Project relations

[Diagram: relations between the projects, the Associations/expertise and the user tools]
18
Integrated Modelling Project 1
Leader: G. Huysmans; Deputy: C. Konz

Objective: to provide an integrated suite of self-consistent codes (modules) for equilibrium reconstruction and linear MHD stability analysis.

- Experimental equilibrium reconstruction: CEDRES, CLISTE, EFIT2006, EFIT-ITM, EQUINOX
- Equilibrium codes and linear MHD stability:
  - Equilibrium: CAXE, CHEASE, HELENA, FINESSE, (DINA)
  - Mapping: COTRANS, JMC
  - MHD stability: CASTOR, KINX, MISHKA, PHOENIX
- Free-boundary direct equilibrium solvers: CREATE, CEDRES, EFIT
- 3D equilibrium solvers: VMEC
- Equilibrium toolbox: FLUSH
19
Equilibrium and MHD stability

Standardise contributed codes to become independent of machine/diagnostic data:
- Use only external geometry data (from the database)
- Definition of interfaces between the codes and the machine and diagnostic descriptions

Validation and verification:
- Compare equilibrium and MHD stability codes on a benchmark case
- Apply the codes to a relevant experimental problem/data
- Ongoing discussion: independent experimental metrics are challenging to find

[Workflow diagram: the machine description and diagnostic descriptions (magnetics, MSE) feed the equilibrium reconstruction; its equilibrium description feeds a high-resolution equilibrium code, whose output feeds an MHD stability code, each with code-specific parameters, producing an MHD output description]
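To make the chain concrete, here is a hedged Fortran sketch of a driver for such a workflow once the codes are wrapped as ITM modules exchanging equilibrium CPOs; the routine names and bodies are empty placeholders, not the actual module interfaces.

    ! Illustrative driver for the equilibrium reconstruction -> high-resolution
    ! equilibrium -> linear MHD stability chain. All routines are placeholders
    ! standing in for the wrapped ITM modules; the MHD output CPO is omitted.
    program imp1_chain_sketch
      use euitm_schemas                          ! generated CPO type definitions
      implicit none
      type (type_equilibrium) :: equil_recon     ! reconstructed equilibrium CPO
      type (type_equilibrium) :: equil_hires     ! high-resolution equilibrium CPO

      call equilibrium_reconstruction(equil_recon)          ! e.g. EFIT-ITM, EQUINOX
      call high_res_equilibrium(equil_recon, equil_hires)   ! e.g. HELENA, CHEASE
      call mhd_stability(equil_hires)                       ! e.g. MISHKA, CASTOR

    contains
      subroutine equilibrium_reconstruction(eq_out)
        type (type_equilibrium), intent(out) :: eq_out      ! placeholder body
      end subroutine equilibrium_reconstruction
      subroutine high_res_equilibrium(eq_in, eq_out)
        type (type_equilibrium), intent(in)  :: eq_in
        type (type_equilibrium), intent(out) :: eq_out      ! placeholder body
      end subroutine high_res_equilibrium
      subroutine mhd_stability(eq_in)
        type (type_equilibrium), intent(in) :: eq_in        ! placeholder body
      end subroutine mhd_stability
    end program imp1_chain_sketch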
20
Machine-independent EFIT_ITM

EFIT has been adapted to use the ITM structures and external geometry information:
- A unique version of EFIT can now be used for ITER, Tore Supra, JET, etc.
- It uses only TF tools for data storage, data access and data structures.

A validation effort is underway.
21
Integrated Modelling Project 2
New Leader: M. Ottaviani; Deputy: S. Sharapov

Objective: non-linear MHD phenomena. Working to deliver a module for sawteeth, NTMs, RWMs and ELMs.

- Sawtooth module ready for adaptation for Integrated Modelling.
- The sawtooth model (Porcelli) is already implemented in JETTO, ASTRA and CRONOS, applied to the prediction of the sawtooth crash time and compared with JET data (F. Porcelli et al., FEC 2006, Chengdu).

IMP#2 is being restructured with a stronger focus on the delivery of tools.
22
Integrated Modelling Project 3
Leader: D. Coster; Deputies: V. Basiuk, D. Kalupin, V. Parail, G. Pereverzev

Objective: to provide the computational basis for a modular transport code, taking account of the core, the pedestal and the scrape-off layer; ultimately, to enable the simulation of complete tokamak scenarios, e.g. for ITER.

The goal of IMP3 is to deliver a transport code framework based on the scientific workflow environment to be provided by ISIP:
- Adopt a modular approach to the construction of a transport code
- Communication between modules via the ITM agreed data structures
- Verification and validation as a joint effort between the originating IMP providing the modules and IMP3
- Ultimate aim: a complete set of modules enabling the simulation of a full discharge (i.e. a "tokamak simulator")
23
European Transport Solver (ETS)

[Diagram: phased development of the ETS (Phase 0, Phase 1, Phase 2), combining a 1-D fluid core and a kinetic plasma description with NBI, ICRH and ECRH heating modules, integrated by IMP#3 with contributions from IMP#1, IMP#2, IMP#4 and IMP#5]
24
Common interface to transport models (developed with the JET IM project)

    CALL ANOMALOUS(MODEL, PROFILES, GEOMETRY, TRANSPORT, [DIAG], ifail)

Derived types:
- Standardized inputs: PROFILES, GEOMETRY, defined in generic modules (type definitions, allocations, ...)
- Standardized output: TRANSPORT, defined in the same generic modules (fluxes + effective diffusivities for the transport channels)
- Model-dependent data: WEILAND, GLF23, RITM, EDWM in specific model-dependent modules (MMM95 under testing)
- [DIAGNOSTIC]: optional diagnostic output supplied in model-dependent formats

Simple and extensible interface; a new model needs to supply:
1. Default settings for options
2. A mapping to the actual model call (by template)
3. Derived types for model-specific inputs/outputs (may be empty!)

Next step: point-wise interface ("profiles" -> "dimless"); collaborate with Lehigh/FACETS on a corrected GLF23 point-wise interface?
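As a sketch of how a new model could plug into this interface, the fragment below supplies placeholder derived types and a template-style mapping routine. The type contents and routine name are invented for illustration and are not the actual generic modules.

    ! Illustrative mapping of a hypothetical transport model onto the common
    ! interface (placeholder types and names; not the actual generic modules).
    module my_model_interface
      implicit none
      type type_profiles                    ! standardized input (placeholder fields)
         real(8), dimension(:), pointer :: te, ti, ne
      end type type_profiles
      type type_geometry
         real(8), dimension(:), pointer :: rho, q
      end type type_geometry
      type type_transport                   ! standardized output
         real(8), dimension(:), pointer :: flux_te, diff_te
      end type type_transport
    contains
      subroutine anomalous_my_model(profiles, geometry, transport, ifail)
        type(type_profiles),  intent(in)    :: profiles
        type(type_geometry),  intent(in)    :: geometry
        type(type_transport), intent(inout) :: transport
        integer,              intent(out)   :: ifail
        ifail = 0
        if (.not. associated(transport%diff_te)) then
           allocate(transport%diff_te(size(geometry%rho)))
           allocate(transport%flux_te(size(geometry%rho)))
        end if
        ! placeholder physics: apply default options, call the actual model,
        ! then copy its results into the standardized TRANSPORT type
        transport%diff_te = 1.0d0           ! constant effective diffusivity [m^2/s]
        transport%flux_te = 0.0d0           ! no additional convective flux
      end subroutine anomalous_my_model
    end module my_model_interface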
25
Data structure defined for the 1-D core transport code

For every equation the substructure contains information about:
- value: 1-D profile of the computed quantity
- source: any comments about the source of the 1-D profile
- flag: integer describing how the 1-D profile was obtained (computed, imported from a different code, taken from experiment, etc.)
- boundary: substructure specifying the type of the boundary condition and its value
- source_term: substructure including the 1-D profile of the total source and describing how it was obtained
- transp_coef: substructure including 1-D profiles of total D and V, and describing the source of these quantities
- flux: substructure containing predictive and interpretative fluxes
- time_deriv: 1-D profile of the integral of the time-derivative term
- codeparam: substructure containing all internal parameters used by the transport solver
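A Fortran rendering of one such per-equation substructure might look like the sketch below; the field names follow the list above, but the actual types are generated from the XML schemas, so this is illustrative only.

    ! Illustrative per-equation substructure of the 1-D core transport data
    ! structure (sketch; the real types come from the generated schemas).
    module transport_equation_sketch
      implicit none
      type type_boundary
         integer :: bc_type                            ! type of boundary condition
         real(8) :: bc_value                           ! value at the boundary
      end type type_boundary
      type type_transp_coef
         real(8), dimension(:), pointer :: d, v        ! total D and V profiles
         character(len=80) :: source                   ! where D and V come from
      end type type_transp_coef
      type type_flux
         real(8), dimension(:), pointer :: predictive, interpretative
      end type type_flux
      type type_equation
         real(8), dimension(:), pointer :: value       ! computed 1-D profile
         character(len=80) :: source                   ! comment on the profile's origin
         integer :: flag                               ! computed / imported / experimental, ...
         type(type_boundary)    :: boundary
         real(8), dimension(:), pointer :: source_term ! total source profile
         type(type_transp_coef) :: transp_coef
         type(type_flux)        :: flux
         real(8), dimension(:), pointer :: time_deriv  ! integral of the time-derivative term
         character(len=1000)    :: codeparam           ! solver-internal parameters (simplified here)
      end type type_equation
    end module transport_equation_sketch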
26
Integrated Modelling Project 4
Leader: B. Scott; Deputy: M. Ottaviani

Objective: to develop a suite of unified, validated codes providing quantitative predictions for the linear properties of a range of instabilities, including ion-temperature-gradient (ITG) modes, trapped-electron modes (TEM), trapped-ion modes (TIM), electron-temperature-gradient (ETG) modes, micro-tearing modes, etc.

Three formalised tasks:
- Catalogue codes, standardise documentation
- Code verification and benchmarking
- Development and exchange of elements of the theoretical basis for the models

A benchmarking campaign has started (it will run for two years): a Cyclone-like base case and (unique to Europe) an edge case, also comparing neoclassical and turbulence code equilibria.
27
IMP#4: Code Verification and Benchmarking

Core and edge standard cases:
- L-mode edge near 100 eV and 2 x 10^13 cm^-3
- Standard Cyclone base case for the core (local and global modes)

First results were reported at the EPS meeting in Rome 2006. The global GEMR code differs at higher n_e because it corrects for gradient relaxation.
28
Edge benchmarking, first series: cold-ion four-field model in ASDEX Upgrade L-mode cases

- Mainly flux-tube codes, plus one global code in a "thin atmosphere" domain 0.915 < r/a < 1.0
- Non-periodic: the gradient relaxes; buffer diffusion zones introduce dissipation
- Periodic: obviously non-viable for the global case; admits unphysical radial jet flows
- The beta scan is problematic enough to expose differences among the codes
29
IMP#5: Heating, Current Drive and Fast Particles
Leader: T. Hellsten; Deputies: Y. Peysson, F. Zonca

- Objective: develop the computational basis for a modular package of codes simulating heating, current drive and fast-particle effects
- Area covered: ECRH, ICRH, NBI, LH, alpha-particle and fast-particle interaction with instabilities
- Goal: self-consistent calculations validated against experiments
- Priority: realistic modelling applicable to ITER standard and advanced scenarios
30
Integrated Modelling Project 5: strategy and outlook

Short term (~6 months): identify the most effective data structure for linking codes on the ITM platform (almost done); some benchmarking activity (to start soon).

Medium term (~2-3 years): fast code development for basic ITER modelling
- ICRF: one code available, but some extension needed
- ECRH/ECCD: in good shape, several codes available
- Benchmarking of these codes on the ITM-TF platform

Long term (> 3 years): development of modular advanced codes for comprehensive modelling and integration in specific problems
- Several ICRF full-wave codes available
- Development of 3D Fokker-Planck modelling
- Improvements in plasma response functions
- Description of AE modes etc. in the Fokker-Planck treatment
- ...
31
Predictive modelling of ITER scenarios
Leader: V. Parail

A task on predictive modelling of ITER scenarios within the ITM-TF is starting now.
- EU modelling tools already exist and are being employed for scenario assessment and development: e.g. JETTO, CRONOS and ASTRA for the core plasma, and B2.5/EIRENE and EDGE2D/EIRENE for the SOL.
- Implementation as a task within the ITM-TF gives:
  - access to physics experts and codes
  - access to high-level experimentalists (by call implementation)
- Explore the limitations of current codes and the needs for a European Transport Solver for ITER.
- See V. Parail's talk at this meeting.
32
V&V and QA in General

Level 0: Ad hoc approach (mainly at individual code level)
- Non-systematic approach; metrics and reporting left to individuals
- No monitoring or enforced adherence to any standards

Level 1: Consistent V&V (codes or packages)
- Following predefined procedures
- Detailed, consistent reporting
- Verified operation
- Critical assessment of "performance"
- Critical assessment of experimental applicability
- Openly reported in standardized formats

Level 2: Consistent QA (at organization or TF level)
- A superset of requirements over Level 1, relating to software management and software engineering
- Detailed procedures with checkpoints guaranteeing conformance to quality and reliability goals

Level 3: QA procedure under nuclear licensing requirements
- Possible ITER requirement for some operator codes?
33
Oversimplified view

[Diagram: conceptual model, computational model and plasma data connected by qualification, verification, validation and data validity]

- Qualification: is the physics description adequate?
- Verification: are the equations implemented and solved correctly?
- Validation: do we have a reliable and sufficiently accurate description of the plasma?
- Data validity: is our measured data a sufficient representation of reality?

Code benchmarking (code-to-code, C2C) is a tool in both V&V and physics exploration.

TF V&V procedures: EFDA-TF-ITM(04)-8. From Winter, 1970s, and still valid!
34
Collaborations

Recognized collaborations for the ITM outside EFDA:
- ITER (EFDA/ELE, ITPA)
- US BPO (CPES, CSWIM, FACETS)
- JP BPSI (Broader Approach)
- Kepler (UCSD)
- Further collaborations as they are formalized under EFDA
- EUFORIA (under final negotiation): an EU FP7 Capacities project aimed at enhancing edge and core transport and turbulence modelling for ITER in Europe
  - Workflow, grid and visualization: may accelerate developments on portals/workflows; will USE the UAL as the basis for data access
  - Designed to augment ITM-TF activities towards grid and HPC
- EGEE, PACE, ...
35
Vision - Aims - Goals

The longer-term aim is not a single (transport) code but a complete simulation environment providing:
- consistent access to experimental and simulation data
- access to multiple levels of physics
- a flexible framework for building modelling applications
- validated modelling tools for (ITER) physics exploration
- a test bed for theory/code developments
- a range of state-of-the-art physics components and tools

Pan-European access should impact all aspects of the Associations' modelling activities and experimental analysis, and provide a basis for joint European exploitation of ITER and for DEMO modelling development.
37
IMP1: the ongoing tasks

Adaptation of high-resolution equilibrium, mapping and linear MHD stability codes to the ITM data structures.

Status:
- High-resolution equilibrium codes CHEASE (H. Lutjens), HELENA (G. Huysmans, C. Konz) and CAXE (S. Medvedev) have been adapted to use the ITM data structures.
  - The codes have a common interface; code verification is ongoing.
- Linear MHD stability codes CASTOR (C. Konz), KINX (S. Medvedev) and MISHKA (G. Huysmans) have been adapted to use the ITM data structures.
  - Verification on a synthetic test case and an ITER test case.
  - CASTOR, MISHKA-1 and MISHKA-D have been combined into the new framework ILSA (C. Konz, E. Strumberger, EPS 2006).

To be done:
- Verification on more synthetic benchmarks.

A "complete" set of equilibrium tools and MHD analysis is ready to be launched on the platform/Gateway.
38
Edge Code Benchmarking

ITPA activity: code-code comparison
- Phase I: pure D, no drifts
- Phase II: pure D, drifts
- Phase III: D+C, no drifts
- Phase IV: D+C, drifts

SOLPS vs EDGE2D/NIMBUS:
- Phase I completed successfully (reported at PSI 2004)
- Phases II and III in progress

SOLPS vs UEDGE:
- Phase I underway
- Phase II expected to start soon

Additional code-experiment comparisons:
- JET: SOLPS and EDGE2D/NIMBUS
- AUG/D3D: SOLPS and UEDGE

[Figures: EDGE2D D, EDGE2D D+C and SOLPS results]
39
Using MDS+ to connect SOLPS to ASCOT

- An ITPA standard pedestal MDS+ tree was created based on AUG shot 17151 (provides the equilibrium data).
- The 2D background (n_e, T_e, T_i, ...) written as an MDS+ tree by SOLPS (IPP Garching) provides the input to ASCOT.
- Due to the open field lines, special care must be taken with the Monte Carlo collisions.
- Planned: output of ASCOT to be saved to an MDS+ tree.

ASCOT (Accelerated Simulation of Charged Particle Orbits in a Tokamak): T. Kurki-Suonio, L. Aho-Mantila, J. Heikkinen, V. Hynönen, T. Kiviniemi, A. Salmi, S. Sipilä, V. Tulkki

[Figure: distribution of electrons reaching the outer target]