Accelerating minimizations in ensemble variational assimilation
G. Desroziers, L. Berre, Météo-France/CNRS (CNRM/GAME)


Outline
1. Ensemble Variational assimilation
2. Accelerating minimizations
3. Conclusion and future work

Simulation of analysis errors: «hybrid EnKF/Var» or «consistent ensemble 4D-Var»?
A consistent 4D-Var approach can be used in both the ensemble and the deterministic components:
- Simple to implement (perturbed Var ~ unperturbed Var).
- A full-rank hybrid B is used consistently in the two parts.
- Non-linear aspects of 4D-Var can be represented (outer loop).

The operational Météo-France ensemble Var assimilation
- Six perturbed global members, T399 L70 (50 km / 10 km), with the global 4D-Var Arpege.
- Spatial filtering of error variances, to further increase the sample size and robustness.
- Inflation of the ensemble B / model-error contributions, soon to be replaced by inflation of perturbations.

Applications of the EnDA system at Météo-France
- Flow-dependent background error variances in 4D-Var.
- Flow-dependent background error correlations, experimented with using wavelet filtering properties (Varella et al 2011a,b).
- Initialisation of the Météo-France ensemble prediction system by EnDA.
- Diagnostics of analysis consistency and observation impact (Desroziers et al 2009).

Background error standard deviations
Connection between large σb values and intense weather (Klaus storm, 24/01/2009, 00/03 UTC).
[Figure: mean sea level pressure field]

Background error correlations using EnDA and wavelets
Wavelet-implied horizontal length-scales (in km) for wind near 500 hPa, averaged over a 4-day period. (Varella et al 2011b; see also Fisher 2003, Deckmyn and Berre 2005, Pannekoucke et al 2007)

Ensemble variational assimilation
- Ensemble variational assimilation (similar to the EnKF): for l = 1, ..., L (ensemble size),
  ε^a_l = (I − K H) ε^b_l + K ε^o_l,
  with H the observation operator matrix, K the implicit gain matrix, ε^o_l = R^{1/2} η^o_l and η^o_l a vector of random numbers;
  ε^{b+}_l = M ε^a_l + ε^m_l,
  with ε^m_l = Q^{1/2} η^m_l, η^m_l a vector of random numbers, and Q the model error covariance matrix.
- Minimize L cost functions J_l with perturbed innovations d_l:
  J_l(δx_l) = 1/2 δx_l^T B^{-1} δx_l + 1/2 (d_l − H_l δx_l)^T R^{-1} (d_l − H_l δx_l),
  with B and R the background and observation error covariance matrices, and
  d_l = y^o + R^{1/2} η^o_l − H(M(x^b_l)),  x^a_l = x^b_l + δx_l,  x^{b+}_l = M(x^a_l) + Q^{1/2} η^m_l.
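The member-by-member update can be illustrated with a small numpy sketch (a toy linear-Gaussian setup with invented dimensions and covariances, not the Arpege system). It checks that the minimizer of each perturbed cost function J_l coincides with the Kalman-gain form of the analysis perturbation update:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, L = 8, 5, 6                        # toy state dim, obs dim, ensemble size

B = np.diag(1.0 + rng.random(n))         # illustrative background error covariance
R = 0.5 * np.eye(p)                      # observation error covariance
H = rng.standard_normal((p, n))          # linear observation operator

y = rng.standard_normal(p)               # observations y^o
xb = rng.standard_normal((L, n))         # perturbed backgrounds x^b_l

Ri = np.linalg.inv(R)
Jpp = np.linalg.inv(B) + H.T @ Ri @ H    # Hessian J'' = B^-1 + H^T R^-1 H
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # implicit gain matrix

for l in range(L):
    eta_o = rng.standard_normal(p)                       # obs perturbation eta^o_l
    d_l = y + np.linalg.cholesky(R) @ eta_o - H @ xb[l]  # perturbed innovation d_l
    dx_l = np.linalg.solve(Jpp, H.T @ Ri @ d_l)          # minimizer of J_l
    xa_l = xb[l] + dx_l                                  # perturbed analysis x^a_l
```

The two routes agree: solving the variational problem member by member is equivalent to applying the gain K to each perturbed innovation.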

Hessian matrix of the assimilation problem
- Hessian of the cost function: J'' = B^{-1} + H^T R^{-1} H.
- Bad conditioning of J'': very slow (or no) convergence.
- Cost function with B^{1/2} preconditioning (δx = B^{1/2} χ):
  J(χ) = 1/2 χ^T χ + 1/2 (d − H B^{1/2} χ)^T R^{-1} (d − H B^{1/2} χ).
- Hessian of the preconditioned cost function: J'' = I + B^{T/2} H^T R^{-1} H B^{1/2}.
- Far better conditioning and convergence! (Lorenc 1988, Haben et al 2011)
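The effect of B^{1/2} preconditioning on the conditioning of the Hessian can be checked numerically. This is a toy sketch with invented sizes and scales: B is given a wide eigenvalue spread to mimic an ill-conditioned B^{-1}, and H is a random linear operator:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 40, 20                                    # toy state and observation sizes
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
B = U @ np.diag(np.logspace(-4, 1, n)) @ U.T     # B with a wide eigenvalue spread
R = 0.1 * np.eye(p)
H = 0.1 * rng.standard_normal((p, n))            # linear observation operator

Bh = np.linalg.cholesky(B)                       # a square root B^{1/2}
Ri = np.linalg.inv(R)
J1 = np.linalg.inv(B) + H.T @ Ri @ H             # unpreconditioned Hessian
J2 = np.eye(n) + Bh.T @ H.T @ Ri @ H @ Bh        # preconditioned Hessian

print("condition number before:", np.linalg.cond(J1))
print("condition number after :", np.linalg.cond(J2))
```

Every eigenvalue of the preconditioned Hessian is at least 1 (it is the identity plus a positive semi-definite term), so its condition number is simply its largest eigenvalue.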

Lanczos algorithm
- Iteratively generate a set of k orthonormal vectors q such that Q_k^T J'' Q_k = T_k, where Q_k = (q_1 q_2 ... q_k) and T_k is a tridiagonal matrix.
- The extremal eigenvalues of T_k converge quickly towards the extremal eigenvalues of J''.
- If T_k = Y_k Λ_k Y_k^T is the eigendecomposition of T_k, the Ritz vectors are obtained as Z_k = Q_k Y_k, and the Ritz pairs (z_i, λ_i) approximate the eigenpairs of J''.

Lanczos algorithm / Conjugate Gradient
- Use the Lanczos vectors to get the solution of the variational problem: χ_k = χ_0 + Q_k a_k.
- The optimal coefficients a_k should make the gradient of J vanish within the subspace spanned by the Lanczos vectors:
  J'(χ_k) = J'(χ_0) + J'' (χ_k − χ_0) = J'(χ_0) + J'' Q_k a_k = 0,
which gives
  a_k = −(Q_k^T J'' Q_k)^{-1} Q_k^T J'(χ_0) = −T_k^{-1} Q_k^T J'(χ_0),
and then
  χ_k = χ_0 − Q_k T_k^{-1} Q_k^T J'(χ_0).
- This is the same solution as after k iterations of a Conjugate Gradient algorithm. (Paige and Saunders 1975, Fisher 1998)
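The equivalence with conjugate gradients can be checked on a small quadratic problem. In this sketch, an SPD matrix A stands in for the preconditioned Hessian J'' and J(x) = 1/2 x^T A x − b^T x for the cost function; all sizes are toy values:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 50, 12                                # toy sizes
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                  # SPD stand-in for the Hessian J''
b = rng.standard_normal(n)                   # J(x) = 1/2 x^T A x - b^T x

x0 = np.zeros(n)
g0 = A @ x0 - b                              # initial gradient J'(x0)

# Lanczos basis started from the normalized initial gradient
Q = np.zeros((n, k)); alpha = np.zeros(k); beta = np.zeros(k - 1)
q, q_prev, bb = g0 / np.linalg.norm(g0), np.zeros(n), 0.0
for j in range(k):
    Q[:, j] = q
    w = A @ q - bb * q_prev
    alpha[j] = q @ w
    w -= alpha[j] * q
    w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)   # full reorthogonalization
    if j < k - 1:
        bb = np.linalg.norm(w); beta[j] = bb
        q_prev, q = q, w / bb
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

# Slide's formula: x_k = x_0 - Q_k T_k^{-1} Q_k^T J'(x_0)
x_lanczos = x0 - Q @ np.linalg.solve(T, Q.T @ g0)

# k iterations of plain conjugate gradient on the same quadratic
x, r = x0.copy(), b - A @ x0
p = r.copy()
for _ in range(k):
    Ap = A @ p
    a = (r @ r) / (p @ Ap)
    x = x + a * p
    r_new = r - a * Ap
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new
```

Both iterates minimize J over the same Krylov subspace, so they agree up to round-off.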

Accelerating a «perturbed» minimization using «unperturbed» Lanczos vectors
- Minimizations with unperturbed innovations d and perturbed innovations d_l have essentially the same Hessian:
  J''(d) = I + B^{T/2} H^T R^{-1} H B^{1/2},  J''(d_l) = I + B^{T/2} H_l^T R^{-1} H_l B^{1/2}.
- The solution obtained for the «unperturbed» problem,
  χ_k = χ_0 − Q_k (Q_k^T J'' Q_k)^{-1} Q_k^T J'(χ_0, d),
can be transposed to the «perturbed» minimization,
  χ_{k,l} = χ_0 − Q_k (Q_k^T J'' Q_k)^{-1} Q_k^T J'(χ_0, d_l),
to improve its starting point.
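This warm-start idea can be sketched on a toy quadratic: build a Krylov basis for the unperturbed problem, then reuse it with a perturbed right-hand side (same Hessian, different gradient). The matrix A, sizes, and perturbation amplitude are all invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 60, 15                                # toy sizes
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                  # SPD stand-in for the shared Hessian J''
b = rng.standard_normal(n)                   # unperturbed problem: J'(x) = A x - b

def J(x, rhs):
    """Quadratic cost with Hessian A and gradient A x - rhs."""
    return 0.5 * x @ A @ x - rhs @ x

# Krylov basis from the *unperturbed* minimization, started at x0 = 0
x0 = np.zeros(n)
cols = []
v = A @ x0 - b                               # unperturbed gradient J'(x0, d)
for _ in range(k):
    if cols:
        Qs = np.column_stack(cols)
        v = v - Qs @ (Qs.T @ v)              # orthogonalize against the basis
    v = v / np.linalg.norm(v)
    cols.append(v)
    v = A @ v                                # next Krylov direction
Q = np.column_stack(cols)

# Perturbed innovations: same Hessian, different right-hand side
b_l = b + 0.3 * rng.standard_normal(n)
g0_l = A @ x0 - b_l                          # perturbed gradient J'(x0, d_l)

# Warm start from the slide: project the perturbed gradient on the stored subspace
x_start = x0 - Q @ np.linalg.solve(Q.T @ A @ Q, Q.T @ g0_l)
print("J_l cold start:", J(x0, b_l), " warm start:", J(x_start, b_l))
```

The warm start minimizes the perturbed cost over the stored subspace, so it can only improve on the cold start.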

Accelerating a «perturbed» minimization using «unperturbed» Lanczos vectors
[Figure: perturbed analysis; dashed line: starting point obtained with 50 Lanczos vectors]

Accelerating a «perturbed» minimization using «unperturbed» Lanczos vectors
[Figure: decrease of the cost function for a new «perturbed» minimization; curves: one stand-alone minimization, and warm starts with 10, 50 and 100 Lanczos vectors]

Accelerating minimizations using «perturbed» Lanczos vectors
- If L perturbed minimizations with k iterations each have already been performed, the starting point of a perturbed (or unperturbed) minimization can be written in the form χ_k = χ_0 + Q_{k,L} a_{k,L}, where a_{k,L} is a vector of k × L coefficients and Q_{k,L} = (q_{1,1} ... q_{k,1} ... q_{1,L} ... q_{k,L}) is the matrix containing the k × L Lanczos vectors.
- Following the same approach as above, the solution can be expressed and computed as
  χ_{k,L} = χ_0 − Q_{k,L} (Q_{k,L}^T J'' Q_{k,L})^{-1} Q_{k,L}^T J'(χ_0).
- The matrix Q_{k,L}^T J'' Q_{k,L} is no longer tridiagonal, but it is small and easily inverted.
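A toy sketch of the pooled («synergetic») variant: vectors from L perturbed minimizations are collected into one orthonormal basis, the small projected Hessian is formed and inverted directly, and a new minimization is warm-started from it. The orthogonalization of each member's vectors against the whole pool is an implementation choice made here for simplicity; A, the sizes, and the perturbation amplitude are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n, k, L = 80, 4, 5                           # toy sizes: state dim, iterations, members
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                  # SPD stand-in for the shared Hessian J''
b = rng.standard_normal(n)                   # unperturbed problem: J'(x) = A x - b

# Pool k Krylov vectors from each of L perturbed minimizations (started at x = 0)
cols = []
for l in range(L):
    v = -(b + 0.3 * rng.standard_normal(n))  # perturbed gradient J'(0, d_l)
    for _ in range(k):
        if cols:
            Qs = np.column_stack(cols)
            v = v - Qs @ (Qs.T @ v)          # keep the pooled basis orthonormal
        v = v / np.linalg.norm(v)
        cols.append(v)
        v = A @ v                            # next Krylov direction
QkL = np.column_stack(cols)                  # n x (k*L) matrix of pooled vectors

# Projected Hessian: dense (not tridiagonal) but only (k*L) x (k*L), cheap to invert
G = QkL.T @ A @ QkL
g0 = -b                                      # gradient of a *new* minimization at x = 0
x_start = -QkL @ np.linalg.solve(G, QkL.T @ g0)
```

By construction, x_start minimizes the new quadratic cost over the pooled subspace, so it improves on the cold start x = 0.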

Accelerating minimizations using «perturbed» Lanczos vectors (k = 1)
[Figure: decrease of the cost function for a new «perturbed» minimization; curves: one stand-alone minimization, and warm starts with 10x1 = 10, 50x1 = 50 and 100x1 = 100 «synergetic» vectors]

Conclusion and future work
- Ensemble Variational assimilation: error cycling can be simulated in a way consistent with 4D-Var.
- Flow-dependent covariances can be estimated.
- Positive impacts, in particular for intense/severe weather events, from both flow-dependent variances and correlations.
- Accelerating minimizations appears possible (preliminary tests in the real-size ensemble 4D-Var Arpege are also encouraging).
- Connection with Block Lanczos / CG algorithms (O'Leary 1980).
- Possible applications in EnVar without TL/AD (Lorenc 2003, Buehner 2005).

