Results from the 3rd Drag Prediction Workshop using the NSU3D Unstructured Mesh Solver
Dimitri J. Mavriplis, University of Wyoming

Overview
Description of Meshes
Description of NSU3D Solver
– Sample performance
– Preliminary sensitivity evaluations
WB and WBF Results
W1 and W2 Results
– Including runs performed at Cessna on 2nd family of grids
Conclusions

General Gridding Guidelines
Grid Convergence Cases:
– DLR F6 WBF: 3 grid levels required
– DLR F6 WB: medium grid required, coarse/fine optional
– Wing1 and Wing2: four grid levels required

General Gridding Guidelines
Grid Resolution Guidelines
– BL region: y+ < 1.0, 2/3, 4/9, 8/27 (coarse, medium, fine, very fine); 2 cell layers of constant spacing at the wall; growth rates < 1.25 (see the first-cell-height sketch below)
– Far field: 100 chords
– Local spacings (medium grid): chordwise 0.1% chord at LE/TE; spanwise 0.1% semispan at root/tip; cell size on fuselage nose and tail 2.0% chord
– Trailing edge base: 8, 12, 16, 24 cells across TE base (coarse, medium, fine, very fine)
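As a rough illustration of what the y+ guideline above implies for the physical first-cell height, the sketch below uses an approximate flat-plate skin-friction correlation. The free-stream values in the example are hypothetical, not the DPW3 conditions, and the workshop grids were of course built by the mesh generators rather than from this kind of estimate.

```python
import math

def first_cell_height(y_plus, u_inf, nu, x_ref):
    """Estimate the wall-normal first-cell height for a target y+,
    using an approximate flat-plate turbulent skin-friction correlation.
    Illustration only; not how the DPW grids were actually constructed."""
    re_x = u_inf * x_ref / nu               # local Reynolds number
    cf = 0.026 / re_x ** (1.0 / 7.0)        # approximate flat-plate Cf correlation
    tau_w_over_rho = 0.5 * cf * u_inf ** 2  # wall shear stress divided by density
    u_tau = math.sqrt(tau_w_over_rho)       # friction velocity
    return y_plus * nu / u_tau              # from y+ = y * u_tau / nu

# Hypothetical free-stream values (not the DPW3 test conditions):
print(first_cell_height(y_plus=1.0, u_inf=250.0, nu=1.5e-5, x_ref=0.2))
```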

General Gridding Guidelines
Grid Convergence Sequence
– Grid size to grow ~3X for each level of refinement (1.5X in each coordinate direction for structured grids)
– Maintain the same family of grids in the sequence: same relative resolution/topology/growth factors
– Sample sizes (DLR F6 WBF): 2.7M, 8M, 24M pts (structured grids); unstructured grids should be similar
– Cell-based vs. node-based unstructured solvers: 5 to 6 times more tetrahedra than nodes, 2 times more prisms than nodes (see the sketch below)
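To make the last bullet concrete, the sketch below applies the quoted per-node ratios to the posted DLR F6 WBF node counts and reports the level-to-level growth factor. The 5.5 and 2.0 factors are simply the slide's rules of thumb, not measured values for these particular meshes.

```python
def sequence_report(node_counts, tets_per_node=5.5, prisms_per_node=2.0):
    """Rough element-count and growth-factor report for a grid family,
    using the per-node ratios quoted on the slide (5-6 tetrahedra per
    node, ~2 prisms per node). Rules of thumb only; actual meshes vary."""
    prev = None
    for level, n in enumerate(node_counts):
        growth = "" if prev is None else f", growth {n / prev:.2f}x over previous level"
        print(f"level {level}: {n / 1e6:.1f}M nodes, "
              f"~{tets_per_node * n / 1e6:.0f}M tets (cell-centered unknowns), "
              f"~{prisms_per_node * n / 1e6:.0f}M prisms{growth}")
        prev = n

# DLR F6 WBF node counts of the posted VGRID meshes (coarse/medium/fine):
sequence_report([5.6e6, 14.6e6, 41.1e6])
```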

Available (Posted) Unstructured Grids
VGRID (NASA Langley)
– Node-based grids, NASA (W1, W2, WB, WBF)
– Node-based grids, Cessna (W1, W2)
– Cell-centered grids, Raytheon (WB, WBF)
ANSYS Hybrid Meshes
Centaur (DLR, adapted) (node based)
AFLR3 (Boeing) (cell centered)
TAS (JAXA) (node based)
GridPro (block-structured/unstructured)

VGRID NASA (Node Based)
WB:
– Coarse: 5.3M pts
– Medium: 14.3M pts
– Fine: 40.0M pts (> 200M cells)
WBF:
– Coarse: 5.6M pts
– Medium: 14.6M pts
– Fine: 41.1M pts (> 200M cells)

VGRID Node Centered (NASA)

NSU3D Description
Unstructured Reynolds-Averaged Navier-Stokes solver
– Vertex-based discretization
– Mixed elements (prisms in boundary layer)
– Edge data structure (see the residual-loop sketch below)
– Matrix artificial dissipation; option for upwind scheme with gradient reconstruction
– No cross-derivative viscous terms: thin layer in all 3 directions, with option for full Navier-Stokes terms
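Since the edge data structure is central to the discretization listed above, here is a minimal sketch of what an edge-based, vertex-centered residual loop looks like: each edge carries a dual-face normal and contributes one numerical flux to both of its end nodes. The flux function is a toy placeholder; none of this is the actual NSU3D code.

```python
import numpy as np

def accumulate_residual(edges, edge_normals, q, flux):
    """Edge-based residual loop for a vertex-centered scheme: one flux
    evaluation per edge, scattered to the two end nodes with opposite
    signs. 'flux' is a placeholder numerical flux, not an NSU3D kernel.
    edges:        (n_edges, 2) array of node indices
    edge_normals: (n_edges, 3) dual-face normal associated with each edge
    q:            (n_nodes, n_vars) state stored at the vertices"""
    residual = np.zeros_like(q)
    for (i, j), n in zip(edges, edge_normals):
        f = flux(q[i], q[j], n)   # single flux per edge
        residual[i] += f          # leaves node i
        residual[j] -= f          # enters node j
    return residual

# Toy usage with a scalar central flux standing in for the real one:
edges = np.array([[0, 1], [1, 2]])
normals = np.ones((2, 3)) / 3.0
q = np.array([[1.0], [2.0], [4.0]])
central = lambda qi, qj, n: 0.5 * (qi + qj) * n.sum()
print(accumulate_residual(edges, normals, q, central))
```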

Solver Description (cont’d)
– Spalart-Allmaras turbulence model (original published form)
– Optional k-omega model
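For reference, the transport equation of the Spalart-Allmaras model in its commonly quoted form (the original paper also contains trip terms, omitted here), with $\tilde{\nu}$ the working variable, $\nu_t = \tilde{\nu}\,f_{v1}$, and $d$ the distance to the nearest wall (the quantity examined in the distance-function sensitivity study below):

$$
\frac{D\tilde{\nu}}{Dt}
= c_{b1}\,\tilde{S}\,\tilde{\nu}
\;-\; c_{w1}\,f_w\!\left(\frac{\tilde{\nu}}{d}\right)^{2}
\;+\; \frac{1}{\sigma}\!\left[\nabla\!\cdot\!\big((\nu+\tilde{\nu})\,\nabla\tilde{\nu}\big)
+ c_{b2}\,\lvert\nabla\tilde{\nu}\rvert^{2}\right]
$$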

Solution Strategy
Jacobi/line preconditioning
– Line solves in boundary layer regions relieve aspect-ratio stiffness (see the line-solve sketch below)
Agglomeration multigrid
– Fast, grid-independent convergence rates
Parallel implementation
– MPI/OpenMP hybrid model
– DPW runs: MPI on local cluster and on NASA Columbia Supercomputer
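The line solves referred to above treat the nodes stacked along each wall-normal line implicitly, which is what relieves the aspect-ratio stiffness of the thin boundary-layer cells. The sketch below is just the scalar Thomas algorithm for one such tridiagonal line system; it illustrates the idea and is not the block-coupled solver used in NSU3D.

```python
import numpy as np

def thomas(lower, diag, upper, rhs):
    """Solve one tridiagonal line system (Thomas algorithm). A line-implicit
    smoother performs one such solve along every line of nodes stacked
    through the boundary layer; scalar version shown for illustration only
    (the real solver couples the flow variables in blocks)."""
    n = len(diag)
    d = diag.astype(float).copy()
    r = rhs.astype(float).copy()
    for i in range(1, n):                 # forward elimination
        w = lower[i] / d[i - 1]
        d[i] -= w * upper[i - 1]
        r[i] -= w * r[i - 1]
    x = np.zeros(n)
    x[-1] = r[-1] / d[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
    return x

# Toy line system: -x_{i-1} + 3 x_i - x_{i+1} = 1 along a 5-node line
n = 5
print(thomas(np.full(n, -1.0), np.full(n, 3.0), np.full(n, -1.0), np.ones(n)))
```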

Grid Generation
– Runs based on NASA Langley-supplied VGRIDns unstructured grids
– Tetrahedra in boundary layer merged into prismatic elements
– Grid sizes up to 41M pts, 240M elements

Sample Run Times
All runs performed on the NASA Columbia Supercomputer (SGI Altix 512-cpu machines)
– Coarse/medium (~15M pts) grids used 96 cpus: 500 to 800 multigrid cycles; 30 minutes for coarse grid, 1.5 hrs for medium grid
– Fine grids (~40M pts) used 248 cpus: 500 to 800 multigrid cycles; 1.5 to 2 hrs for fine grid
– CL-driver and constant-incidence convergence similar (see the CL-driver sketch below)
– WB cases hard to converge (not entirely steady)
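The slides compare fixed-incidence runs with runs driven to a target CL but do not describe the driver itself. A common approach, assumed here purely for illustration, is to update the incidence with a secant step between (partially) converged solves; `converge_at_alpha` is a hypothetical placeholder for a flow-solver call, not an NSU3D routine.

```python
def drive_to_cl(converge_at_alpha, cl_target, alpha0=0.0, dalpha=0.25,
                tol=1e-4, max_iter=10):
    """Adjust the incidence with secant updates until the computed lift
    matches cl_target. 'converge_at_alpha' stands in for a flow-solver
    call returning CL at a given alpha (placeholder interface only)."""
    a0, a1 = alpha0, alpha0 + dalpha
    cl0, cl1 = converge_at_alpha(a0), converge_at_alpha(a1)
    for _ in range(max_iter):
        if abs(cl1 - cl_target) < tol:
            break
        # secant step from the last two (alpha, CL) pairs
        a0, a1 = a1, a1 + (cl_target - cl1) * (a1 - a0) / (cl1 - cl0)
        cl0, cl1 = cl1, converge_at_alpha(a1)
    return a1, cl1

# Toy usage with a linear lift curve standing in for the flow solver:
lift_curve = lambda alpha: 0.11 * (alpha + 1.5)   # hypothetical CL(alpha)
print(drive_to_cl(lift_curve, cl_target=0.5))
```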

Scalability Near ideal speedup for 72M pt grid on 2008 cpus of NASA Columbia Machine (~10 minutes for steady-state solution)

NSU3D Sensitivity Studies
– Sensitivity to distance function calculation method
– Effect of multi-dimensional thin-layer versus full Navier-Stokes terms
– Sensitivity to levels of artificial dissipation

Sensitivity to Distance Function All DPW3 Calculations done with Eikonal equation distance function
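The Eikonal approach obtains the wall distance $d$ (needed by the Spalart-Allmaras model) by solving $|\nabla d| = 1$ with $d = 0$ on solid walls, rather than by searching for the nearest surface point. Below is a minimal 2D fast-sweeping sketch on a uniform Cartesian grid, shown only to illustrate the equation being solved; NSU3D works on unstructured meshes and its actual algorithm is not reproduced here.

```python
import numpy as np

def eikonal_distance(wall_mask, h, sweeps=4):
    """Fast-sweeping solution of |grad d| = 1 on a uniform 2D grid with
    spacing h, where d = 0 wherever wall_mask is True. Illustration of
    the Eikonal distance-function idea only, not the NSU3D algorithm."""
    big = 1.0e9
    d = np.where(wall_mask, 0.0, big)
    ni, nj = d.shape
    orderings = [(range(ni), range(nj)),
                 (range(ni - 1, -1, -1), range(nj)),
                 (range(ni), range(nj - 1, -1, -1)),
                 (range(ni - 1, -1, -1), range(nj - 1, -1, -1))]
    for _ in range(sweeps):
        for irange, jrange in orderings:
            for i in irange:
                for j in jrange:
                    if wall_mask[i, j]:
                        continue
                    a = min(d[i - 1, j] if i > 0 else big,
                            d[i + 1, j] if i < ni - 1 else big)
                    b = min(d[i, j - 1] if j > 0 else big,
                            d[i, j + 1] if j < nj - 1 else big)
                    if abs(a - b) >= h:   # one-sided update
                        new = min(a, b) + h
                    else:                 # two-sided (quadratic) update
                        new = 0.5 * (a + b + np.sqrt(2.0 * h * h - (a - b) ** 2))
                    d[i, j] = min(d[i, j], new)
    return d

# Toy usage: distance from the bottom row of a small grid (spacing h = 1):
mask = np.zeros((5, 5), dtype=bool)
mask[0, :] = True
print(eikonal_distance(mask, h=1.0))
```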

Sensitivity to Navier-Stokes Terms All DPW3 Calculations done with Multidimensional Thin-Layer Formulation

Sensitivity to Dissipation Levels
– Drag is grid converging
– Sensitivity to dissipation decreases as expected
– All calculations done with the low dissipation level

WBF Convergence (fixed alpha)
– “Similar” convergence for all grids
– Force coefficients well converged in < 500 MG cycles

WBF Convergence Medium Grid (15M pts): Fixed alpha

WBF Convergence Medium Grid (15M pts): Fixed CL

WBF Convergence Similar convergence (Fixed CL or alpha)

WBF: Grid Convergence Study CP at wing break station (y/b=0.411)

WBF: Grid Convergence Study CF at wing break station (y/b=0.411)

WBF: Grid Convergence Study Good fairing design (coarse grid: 5M pts)

WBF: Grid Convergence Study Good fairing design (medium grid: 15M pts)

WBF: Grid Convergence Study Good fairing design (fine grid: 40M pts)

WBF: TE Separation Coarse grid: 5M pts

WBF: Drag Polar CP at wing break station (y/b=0.411)

WBF: Drag Polar CFX at wing break station (y/b=0.411)

WBF: Drag Polar Full Polar run on all 3 grids (5, 15, 40M pts)

WBF: Moment Full Polar run on all 3 grids (5, 15, 40M pts)

WB Convergence (fixed alpha)
– Separated flow, unsteady shedding pattern
– Smaller residual excursions with fewer MG levels
– Moderate CL variations

WB Medium Grid
Plot of min and max unsteady CL values
– Good overlap in polar: suitable drag values
– Less overlap in CM

WB Medium Grid CP Values at Break Station (y/b=0.411)

WB Medium Grid CFX Values at Break Station (y/b=0.411)

WB Grid Convergence CP Values at Break Station (y/b=0.411)

WB Grid Convergence CFX Values at Break Station (y/b=0.411)

WB Grid Convergence Separation Pattern (Coarse grid: 5M pts)

WB Grid Convergence Separation Pattern (Medium grid: 15M pts)

WB Grid Convergence Separation Pattern (Fine grid: 40M pts)

WB TE Separation Pattern (Coarse grid: 5M pts)

Grid Convergence (WB+WBF) Grid convergence apparent (particularly for WBF)

Grid Convergence (WB+WBF) Some cancellation apparent: WBF less uniformly converging

Grid Convergence (WB+WBF)
Grid convergence ranked 8th in Vassberg Figure of Merit:
– Best for unstructured solvers
– Importance of a uniform family of grids
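The figure-of-merit comparisons plot force coefficients against a grid-spacing measure such as $h = N^{-1/3}$ and extrapolate to the continuum. A least-squares version of that extrapolation is sketched below, assuming second-order behavior and a consistent grid family; the drag values in the usage line are hypothetical placeholders, not the DPW3 results.

```python
import numpy as np

def continuum_drag(node_counts, cd_counts, order=2):
    """Least-squares fit of CD = CD_inf + a * h**order with h = N**(-1/3),
    the kind of continuum extrapolation used in grid-convergence plots.
    Assumes the grids form a consistent family in the asymptotic range."""
    h = np.asarray(node_counts, dtype=float) ** (-1.0 / 3.0)
    A = np.vstack([np.ones_like(h), h ** order]).T
    (cd_inf, a), *_ = np.linalg.lstsq(A, np.asarray(cd_counts, dtype=float), rcond=None)
    return cd_inf, a

# Node counts of the WB grid family; drag values (in counts) are
# hypothetical placeholders for illustration, NOT the workshop data:
print(continuum_drag([5.3e6, 14.3e6, 40.0e6], [292.0, 287.0, 284.5]))
```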

Grid Convergence (WB+WBF) Grid convergence apparent (in this measure)

WBF-WB Differences Medium grid comparisons

Grid Convergence of Drag Increment Consistent with one group of DPW3 Entries

Conclusions
– WBF appears to be grid converging
– WB case is complex: previous results showed the importance of grid topology, and the new DPW3 grids are once again different
– DPW1, 2, 3 are pushing the state of the art of grid resolution: DPW1: 1.6M pts; DPW2: 3M to 10M pts; DPW3: 5M to 40M pts

VGRID NASA (Node Based)
W1:
– Coarse: 1.8M pts
– Medium: 4.5M pts
– Fine: 11.5M pts
– SuperFine: 36.9M pts
W2:
– Coarse: 1.9M pts
– Medium: 4.7M pts
– Fine: 11.9M pts
– SuperFine: 38.5M pts

VGRID NASA (Cessna)
W1:
– Coarse: 0.98M pts
– Medium: 2.4M pts
– Fine: 6.1M pts
– SuperFine: 12.7M pts
W2:
– Coarse: 0.95M pts
– Medium: 2.3M pts
– Fine: 5.9M pts
– SuperFine: 12.4M pts

VGRID Node Centered (NASA)

W1 Convergence (fixed alpha = 0.5)
– “Similar” convergence for coarse/medium grids
– Apparent unsteadiness in residual for finest grid
– Force coefficients well converged in < 500 MG cycles for all grids

W1 Grid Convergence Study CP at station 5:

W1 Grid Polar Sweep (Fine Grid) CP at station 5

W2 Grid Convergence Study CP at station 5

W2 Grid Polar Sweep (Fine Grid) CP at station 5

Streamlines at 0.5 degrees (W1)

Streamlines at 0.5 degrees (W2)

W1-W2 Grid Polar Comparison (Fine Grid)

W1-W2 CL-Incidence Comparison (Fine Grid)

W1-W2 Moment Comparison (Fine Grid)

W1-W2 Grid Convergence Study Apparently uniform grid convergence

W1-W2 Grid Convergence Study Good grid convergence of individual drag component

W1-W2 Grid Convergence Study Ranked 1st by Vassberg Figure-of-Merit

W1-W2 Grid Convergence Study

W1-W2 Results Discrepancy between UW and Cessna Results

W1-W2 Results Despite uniform grid convergence: Results on 2 grid families not converging to same values

W1-W2 Results Removing the effect of lift-induced drag: results on both grid families converge consistently
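The slide does not state how the lift-induced contribution was removed. A common choice, assumed here only for illustration, is to subtract the idealized induced drag based on the wing aspect ratio $AR$ with a span efficiency of one,

$$
C_{D,\mathrm{profile}} \;\approx\; C_D \;-\; \frac{C_L^{2}}{\pi\,AR},
$$

which keeps small lift differences between the two grid families from contaminating the drag comparison.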

65M pt Mesh Results
– 10% drop in CL at AoA = 0°: closer to experiment
– Drop in CD: further from experiment
– Same trends at Mach = 0.3
– Little sensitivity to dissipation

Summary
– W1-W2 appear to be in the asymptotic grid convergence range: CD difference ~1 count at 0.5 degrees
– Grids are getting finer: 40M pts, ~1 hr on the NASA Columbia Supercomputer
– Drag decomposition useful in providing better drag estimates on coarser grids