Sparse linear solvers applied to parallel simulations of underground flow in porous and fractured media. A. Beaudoin, J.R. De Dreuzy, J. Erhel and H. Mustapha.

Sparse linear solvers applied to parallel simulations of underground flow in porous and fractured media. A. Beaudoin (1), J.R. De Dreuzy (2), J. Erhel (1) and H. Mustapha. (1) IRISA / INRIA, Rennes, France. (2) Department of Geosciences, University of Rennes, France. Matrix Computations and Scientific Computing Seminar, Berkeley, 26 October 2005.

2D heterogeneous porous medium: heterogeneous permeability field Y = ln(K) with a prescribed correlation function.
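As a rough illustration (not the authors' generator), the sketch below builds a log-normal permeability field K = exp(Y), where Y is a correlated Gaussian field obtained by smoothing white noise; the grid size n, correlation length lam and log-variance sigma_Y are assumed, illustrative parameters, and the Gaussian-shaped correlation merely stands in for whichever correlation function was actually used.

```python
# Illustrative sketch only: log-normal permeability field K = exp(Y) with a
# correlated Gaussian log-field Y, built by smoothing white noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def lognormal_permeability(n=256, lam=8.0, sigma_Y=1.0, seed=0):
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    Y = gaussian_filter(noise, sigma=lam)   # spatially correlated Gaussian field
    Y *= sigma_Y / Y.std()                  # rescale to the target log-variance
    return np.exp(Y)                        # K = exp(Y), i.e. Y = ln(K)

K = lognormal_permeability()
```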

3D fracture network with impervious matrix. The fracture length distribution has a great impact: power law n(l) ~ l^(-a). Three types of networks, according to which moments of the length distribution are finite:
- 2 < a < 3: finite mean, infinite variance
- 3 < a < 4: finite mean and variance, infinite third moment
- a > 4: finite mean, variance and third moment
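A minimal sketch (illustrative, not the authors' network generator) of drawing fracture lengths from the power-law density n(l) ~ l^(-a) by inverse-transform sampling; the lower cutoff l_min is an assumed parameter.

```python
# Illustrative sketch: power-law (Pareto) fracture lengths n(l) ~ l^(-a), l >= l_min.
import numpy as np

def sample_powerlaw_lengths(n_frac, a, l_min=1.0, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random(n_frac)
    return l_min * (1.0 - u) ** (-1.0 / (a - 1.0))   # inverse of the Pareto CDF

lengths = sample_powerlaw_lengths(10_000, a=3.5)     # 3 < a < 4: finite mean and variance
```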

Flow model. Equations: Darcy's law Q = -K grad(h) and mass conservation div(Q) = 0. Boundary conditions: fixed head and null flux boundaries, for both the 2D porous medium and the 3D fracture network.

Numerical method for the 2D heterogeneous porous medium: Finite Volume Method on a regular mesh, leading to a large sparse structured matrix with 5 nonzero entries per row.
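A sketch of how such a structured 5-point matrix can be assembled for div(K grad h) = 0, with harmonic averaging of the cell permeabilities at the faces; fixed-head boundaries are assumed on the left and right and null flux on the top and bottom, and the grid size and permeability field are placeholders (this is not the authors' code).

```python
# Illustrative 5-point finite-volume matrix for div(K grad h) = 0 on an n x n grid.
import numpy as np
import scipy.sparse as sp

def fv_matrix(K):
    n = K.shape[0]
    idx = lambda i, j: i * n + j
    harm = lambda a, b: 2.0 * a * b / (a + b)          # harmonic mean transmissivity
    A = sp.lil_matrix((n * n, n * n))
    for i in range(n):
        for j in range(n):
            row = idx(i, j)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < n:        # interior face
                    t = harm(K[i, j], K[ii, jj])
                    A[row, row] += t
                    A[row, idx(ii, jj)] -= t
                elif not (0 <= jj < n):                # left/right face: fixed head
                    A[row, row] += 2.0 * K[i, j]       # half-cell distance to the boundary
                    # the prescribed head value goes to the right-hand side (omitted here)
                # top/bottom face outside the grid: null flux, no contribution
    return A.tocsr()

rng = np.random.default_rng(0)
A = fv_matrix(np.exp(rng.standard_normal((64, 64))))   # any positive permeability field
```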

Sparse matrix for the 2D heterogeneous porous medium: sparsity pattern for n = 32, with a zoom.

Numerical method for the 3D fracture network: Mixed Hybrid Finite Element Method on a conforming triangular (unstructured) mesh, leading to a large sparse unstructured matrix with about 5 nonzero entries per row.

Sparse matrix for the 3D fracture network: sparsity pattern for N = 8181, for a network of 7 fractures and their intersections, with a zoom.

Complexity analysis with PSPASES: memory requirements for the matrices A and L.

Complexity analysis with PSPASES: CPU time of matrix generation, linear solving and flow computation, obtained with two processors.

2D porous medium: memory size and CPU time with PSPASES. Theory: NZ(L) = O(N log N), observed slope about 1. Theory: Time = O(N^1.5), observed slope about 1.5.
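Such slopes can be checked by a least-squares fit in log-log scale; in the sketch below, the arrays N, nz_L and cpu_time are placeholders for the measured problem sizes, fill-in and run times.

```python
# Illustrative slope estimation on log-log data.
import numpy as np

def loglog_slope(N, y):
    return np.polyfit(np.log(N), np.log(y), 1)[0]

# slope_mem  = loglog_slope(N, nz_L)       # expected close to 1 for NZ(L) = O(N log N)
# slope_time = loglog_slope(N, cpu_time)   # expected close to 1.5 for Time = O(N^1.5)
```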

3D fracture network: memory size and CPU time with PSPASES. NZ(L) = O(N)? Time = O(N)? Theory still to be done.

2D porous medium: condition number estimated by MUMPS. To be checked: with or without scaling.
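For a symmetric positive definite flow matrix, a 2-norm condition number can be estimated from the extreme eigenvalues; the sketch below uses ARPACK through SciPy rather than the MUMPS estimate, and assumes A is the sparse matrix assembled in the finite-volume sketch above.

```python
# Illustrative condition number estimate for a sparse SPD matrix A.
from scipy.sparse.linalg import eigsh

lam_max = eigsh(A, k=1, which='LA', return_eigenvectors=False)[0]
lam_min = eigsh(A, k=1, sigma=0.0, which='LM', return_eigenvectors=False)[0]  # shift-invert
cond2 = lam_max / lam_min
```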

2D porous medium: residuals with PSPASES.
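The relative residual, and one step of iterative refinement, can be sketched as follows, with SciPy's SuperLU standing in for the parallel factorization of PSPASES; A is again the matrix from the earlier sketch and b an arbitrary right-hand side.

```python
# Illustrative residual check and one iterative refinement step.
import numpy as np
from scipy.sparse.linalg import splu

b = np.ones(A.shape[0])
lu = splu(A.tocsc())
x = lu.solve(b)
rel_res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)   # relative residual
x += lu.solve(b - A @ x)                                   # one refinement step
```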

Parallel architecture: distributed memory, 2 nodes of 32 bi-processors (AMD Opteron 2 GHz processors with 2 GB of RAM).

Scalability analysis with PSPASES: speed-up.

Scalability analysis with PSPASES: isoefficiency. Table of (P, N, Tp, R) for the 2D medium and for the 3D fracture network (some 3D entries have no value).
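Speed-up and isoefficiency are derived from measured run times: S(p) = T(1) / T(p) and E(p) = S(p) / p, the isoefficiency question being how fast N must grow with p to keep E(p) constant. A minimal sketch, with `times` an assumed mapping from processor count to wall-clock time.

```python
# Illustrative speed-up and efficiency from measured run times.
def speedup(times):
    t1 = times[1]
    return {p: t1 / tp for p, tp in times.items()}

def efficiency(times):
    return {p: s / p for p, s in speedup(times).items()}

# e.g. efficiency({1: 120.0, 2: 65.0, 4: 36.0, 8: 21.0})   # assumed timings
```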

2D porous medium: number of V-cycles with HYPRE/SMG.
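HYPRE/SMG is a structured geometric multigrid; as a rough stand-in for counting cycles to convergence, the sketch below builds an algebraic multigrid hierarchy with pyamg and applies V-cycles to the matrix A from the earlier sketch.

```python
# Illustrative V-cycle count with an algebraic multigrid (pyamg), as a stand-in for SMG.
import numpy as np
import pyamg

ml = pyamg.ruge_stuben_solver(A.tocsr())
b = np.ones(A.shape[0])
residuals = []
x = ml.solve(b, tol=1e-10, cycle='V', residuals=residuals)
n_cycles = len(residuals) - 1
```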

Comparison between PSPASES and HYPRE/SMG: CPU time.

Comparison between PSPASES and HYPRE/SMG: speed-up.
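The flavour of this comparison can be prototyped serially, with SciPy's SuperLU standing in for PSPASES and an AMG-preconditioned conjugate gradient standing in for HYPRE/SMG; the resulting timings are of course not those of the parallel codes, and A is again the matrix from the earlier sketch.

```python
# Illustrative serial timing: sparse direct solve vs. multigrid-preconditioned CG.
import time
import numpy as np
from scipy.sparse.linalg import splu, cg
import pyamg

b = np.ones(A.shape[0])

t0 = time.perf_counter()
x_direct = splu(A.tocsc()).solve(b)
t_direct = time.perf_counter() - t0

t0 = time.perf_counter()
M = pyamg.ruge_stuben_solver(A.tocsr()).aspreconditioner()
x_iter, info = cg(A, b, M=M)                 # default stopping tolerance
t_iterative = time.perf_counter() - t0
```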

Perspectives:
- Porous medium: large sigma, up to 9, and large N, up to 10^8.
- Porous medium: 3D problems with large N.
- Porous medium: scaling, iterative refinement, multigrid adapted to the heterogeneous permeability field.
- 3D fracture networks: large N, up to 10^9.
- Model for complexity and scalability issues.
- 2-level nested dissection; subdomain method.
- Parallel architectures: up to 128 processors.
- Monte-Carlo simulations: grid computing, with a cluster for each random simulation (see the sketch below).
- Parallel advection-diffusion numerical models.
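One possible organisation of the Monte-Carlo perspective (an assumption, not described in the talk) is to give each MPI process an independent subset of the random realisations, reusing the field generator and matrix assembly sketched earlier; the quantity of interest gathered below is only a placeholder.

```python
# Hypothetical Monte-Carlo driver: each MPI rank handles a subset of realisations.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
n_total = 1000                                    # assumed number of realisations

local_results = []
for sample in range(rank, n_total, size):         # round-robin over the ranks
    K = lognormal_permeability(n=64, seed=sample) # generator sketched earlier
    A = fv_matrix(K)                              # flow matrix sketched earlier
    # ... solve the flow problem and post-process; store a scalar of interest
    local_results.append(A.diagonal().mean())     # placeholder statistic

all_results = comm.gather(local_results, root=0)  # collect on rank 0
```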