Yi Heng Second Order Differentiation Bommerholz – 14.08.2006 – Summer School 2006

2 Outline
Background
-What are derivatives?
-Where do we need derivatives?
-How to compute derivatives?
Basics of Automatic Differentiation
-Introduction
-Forward mode strategy
-Reverse mode strategy
Second-Order Automatic Differentiation Module
-Introduction
-Forward mode strategy
-Taylor Series strategy
-Hessian Performance
An Application in Optimal Control Problems
Summary

3 Background What are derivatives? Jacobian Matrix:
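The matrix itself is an image in the original slide; as a standard reconstruction (not copied from the slide), for a vector function $F: \mathbb{R}^n \to \mathbb{R}^m$:

$$ J_F(x) = \left( \frac{\partial F_i}{\partial x_j} \right)_{\substack{i=1,\dots,m \\ j=1,\dots,n}} = \begin{pmatrix} \frac{\partial F_1}{\partial x_1} & \cdots & \frac{\partial F_1}{\partial x_n} \\ \vdots & & \vdots \\ \frac{\partial F_m}{\partial x_1} & \cdots & \frac{\partial F_m}{\partial x_n} \end{pmatrix} $$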

4 Background What are derivatives? Hessian Matrix:
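Again an image in the original; the standard definition for a scalar function $f: \mathbb{R}^n \to \mathbb{R}$:

$$ H_f(x) = \begin{pmatrix} \frac{\partial^2 f}{\partial x_1^2} & \cdots & \frac{\partial^2 f}{\partial x_1 \partial x_n} \\ \vdots & & \vdots \\ \frac{\partial^2 f}{\partial x_n \partial x_1} & \cdots & \frac{\partial^2 f}{\partial x_n^2} \end{pmatrix} $$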

5 Background Where do we need derivatives?
-Linear approximation
-Bending and acceleration (second derivatives)
-Solving algebraic and differential equations
-Curve fitting
-Optimization problems
-Sensitivity analysis
-Inverse problems (data assimilation)
-Parameter identification

6 Background How to compute derivatives?
Symbolic differentiation
-Derivatives can be computed to machine precision
-Computation work is expensive
-For complicated functions, the representation of the final expression may be an unaffordable overhead
Divided differences
-The approximation contains truncation error
-It is easy to implement (the definition of the derivative is used)
-Only the original computer program is required (a formula is not necessary)
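As a concrete illustration of the divided-differences approach (an added sketch, not from the original slides), a one-sided difference quotient in Python:

```python
import math

def forward_difference(f, x, h=1e-6):
    """Approximate f'(x) by the divided difference (f(x+h) - f(x)) / h.
    Easy to implement and needs only the original program f, but the
    result carries an O(h) truncation error."""
    return (f(x + h) - f(x)) / h

# Example: d/dx sin(x) at x = 1.0; the exact value is cos(1.0).
approx = forward_difference(math.sin, 1.0)
print(approx, math.cos(1.0))
```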

7 Background How to compute derivatives?
Automatic differentiation
-Computation work is cheaper
-Only requires the original computer program
-Machine precision approximation can be obtained
To be continued...

8 Basics of Automatic Differentiation Introduction
Automatic differentiation...
-Is also known as computational differentiation, algorithmic differentiation, and differentiation of algorithms;
-Is a systematic application of the familiar rules of calculus to computer programs, yielding programs for the propagation of numerical values of first, second, or higher order derivatives;
-Traverses the code list (or computational graph) in the forward mode, the reverse mode, or a combination of the two;
-Typically is implemented by using either source code transformation or operator overloading;
-Is a process for evaluating derivatives which depends only on an algorithmic specification of the function to be differentiated.

9 Basics of Automatic Differentiation Introduction Rules of arithmetic operations for gradient vector
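The rules themselves are an image in the original; the standard propagation rules for intermediate quantities $u$, $v$ with known gradients are:

$$ \nabla(u \pm v) = \nabla u \pm \nabla v, \qquad \nabla(uv) = v\,\nabla u + u\,\nabla v, $$
$$ \nabla(u/v) = \frac{\nabla u - (u/v)\,\nabla v}{v}, \qquad \nabla \varphi(u) = \varphi'(u)\,\nabla u. $$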

10 Basics of Automatic Differentiation Forward mode & Reverse mode An example
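The example function is an image in the original transcript; the sketches on the next two slides use the textbook AD example $f(x_1, x_2) = x_1 x_2 + \sin(x_1)$ as an assumed stand-in.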

11 Basics of Automatic Differentiation Forward mode & Reverse mode – Forward mode: code list and gradient entries
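A minimal Python sketch of the forward sweep for the assumed example above (not the slide's own code list): each intermediate variable carries its value together with its gradient entries.

```python
import math

def forward_mode(x1, x2):
    """Forward-mode AD for f = x1*x2 + sin(x1): propagate (value, gradient)."""
    v1, g1 = x1, (1.0, 0.0)           # v1 = x1
    v2, g2 = x2, (0.0, 1.0)           # v2 = x2
    v3 = v1 * v2                      # v3 = v1 * v2 (product rule)
    g3 = (v2 * g1[0] + v1 * g2[0], v2 * g1[1] + v1 * g2[1])
    v4 = math.sin(v1)                 # v4 = sin(v1) (chain rule)
    g4 = (math.cos(v1) * g1[0], math.cos(v1) * g1[1])
    v5 = v3 + v4                      # f = v3 + v4 (sum rule)
    g5 = (g3[0] + g4[0], g3[1] + g4[1])
    return v5, g5                     # f and (df/dx1, df/dx2)

print(forward_mode(2.0, 3.0))  # gradient should equal (x2 + cos(x1), x1)
```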

12 Basics of Automatic Differentiation Forward mode & Reverse mode – Reverse mode: code list and adjoints
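The matching reverse-mode sketch for the same assumed example: a forward sweep records the code list, then a reverse sweep accumulates adjoints $\bar v = \partial f / \partial v$.

```python
import math

def reverse_mode(x1, x2):
    """Reverse-mode AD for f = x1*x2 + sin(x1)."""
    # Forward sweep: evaluate the code list.
    v1, v2 = x1, x2
    v3 = v1 * v2
    v4 = math.sin(v1)
    v5 = v3 + v4
    # Reverse sweep: accumulate adjoints from output to inputs.
    v5_bar = 1.0                     # df/df
    v3_bar = v5_bar                  # from v5 = v3 + v4
    v4_bar = v5_bar
    v1_bar = v4_bar * math.cos(v1)   # from v4 = sin(v1)
    v1_bar += v3_bar * v2            # from v3 = v1 * v2
    v2_bar = v3_bar * v1
    return v5, (v1_bar, v2_bar)

print(reverse_mode(2.0, 3.0))  # same gradient as the forward-mode sketch
```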

13 Second-Order AD Module Introduction Divided differences
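The formula is an image in the original; the usual divided-difference approximation of a Hessian entry is:

$$ \frac{\partial^2 f}{\partial x_i \partial x_j}(x) \approx \frac{f(x + h e_i + h e_j) - f(x + h e_i) - f(x + h e_j) + f(x)}{h^2} $$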

14 Second-Order AD Module Introduction Rules of arithmetic operations for Hessian matrices
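The rules are an image in the original; the standard second-order propagation rules, which extend the gradient rules above, are:

$$ H(u \pm v) = H(u) \pm H(v), \qquad H(uv) = u\,H(v) + v\,H(u) + \nabla u\,\nabla v^{T} + \nabla v\,\nabla u^{T}, $$
$$ H(\varphi(u)) = \varphi'(u)\,H(u) + \varphi''(u)\,\nabla u\,\nabla u^{T}. $$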

15 Second-Order AD Module Forward Mode Strategy An example

16 Second-Order AD Module Forward Mode Strategy Code list and gradient entries

17 Second-Order AD Module Forward Mode Strategy Hessian matrix entries
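A second-order forward-mode sketch on the same assumed example $f = x_1 x_2 + \sin(x_1)$ (an added illustration, not the slide's code): each variable now carries (value, gradient, Hessian), propagated with the rules above.

```python
import math

def hessian_forward(x1, x2):
    """Second-order forward mode: propagate (value, gradient, Hessian)."""
    z = ((0.0, 0.0), (0.0, 0.0))                    # 2x2 zero Hessian
    v1, g1, H1 = x1, (1.0, 0.0), z                  # v1 = x1
    v2, g2, H2 = x2, (0.0, 1.0), z                  # v2 = x2
    # v3 = v1*v2: H3 = v1*H2 + v2*H1 + g1 g2^T + g2 g1^T
    v3 = v1 * v2
    g3 = tuple(v2 * a + v1 * b for a, b in zip(g1, g2))
    H3 = tuple(tuple(v2 * H1[i][j] + v1 * H2[i][j]
                     + g1[i] * g2[j] + g2[i] * g1[j] for j in range(2))
               for i in range(2))
    # v4 = sin(v1): H4 = cos(v1)*H1 - sin(v1)*g1 g1^T
    v4 = math.sin(v1)
    g4 = tuple(math.cos(v1) * a for a in g1)
    H4 = tuple(tuple(math.cos(v1) * H1[i][j]
                     - math.sin(v1) * g1[i] * g1[j] for j in range(2))
               for i in range(2))
    # f = v3 + v4: gradients and Hessians add.
    v5 = v3 + v4
    g5 = tuple(a + b for a, b in zip(g3, g4))
    H5 = tuple(tuple(H3[i][j] + H4[i][j] for j in range(2)) for i in range(2))
    return v5, g5, H5

# The exact Hessian of f is [[-sin(x1), 1], [1, 0]].
print(hessian_forward(2.0, 3.0))
```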

18 Second-Order AD Module Forward Mode Strategy Hessian cost

Hessian type    Cost
H(f)            O(n^2)
H(f)*V          O(n*nv)
V^T*H(f)*V      O(nv^2)
V^T*H(f)*W      O(nv*nw)

H(f): n-by-n matrix; V: n-by-nv matrix; W: n-by-nw matrix

19 Second-Order AD Module Taylor Series Strategy
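Both Taylor-series slides consist of formula images in the original. A standard reconstruction of the idea: propagate the univariate Taylor polynomial $x(t) = x_0 + t\,v$ through $f$,

$$ f(x_0 + t\,v) = f(x_0) + t\,\nabla f(x_0)^{T} v + \frac{t^2}{2}\, v^{T} H(f)(x_0)\, v + O(t^3), $$

so the degree-2 Taylor coefficient delivers $v^{T} H v$; mixed entries follow from the polarization identity

$$ v^{T} H w = \tfrac{1}{2}\big[(v+w)^{T} H (v+w) - v^{T} H v - w^{T} H w\big]. $$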

20 Second-Order AD Module Taylor Series Strategy

21 Second-Order AD Module Hessian Performance
-Twice ADIFOR: first produces a gradient code with ADIFOR 2.0, and then runs the gradient code through ADIFOR again.
-Forward: implements the forward mode.
-Adaptive Forward: uses the forward mode, with preaccumulation at a statement level where deemed appropriate.
-Sparse Taylor Series: uses the Taylor series mode to compute the needed entries.

22 An Application in OCPs Problem Definition and Theoretical Analysis
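The problem statement is an image in the original. A generic control vector parameterization (CVP) formulation consistent with the cited Balsa-Canto et al. reference, given here only as an assumed placeholder:

$$ \min_{u(t)}\; J = \Phi\big(x(t_f)\big) + \int_{t_0}^{t_f} L\big(x(t), u(t), t\big)\,dt $$

subject to the dynamic system $\dot x = f(x, u, t)$, $x(t_0) = x_0$, and bounds $u^{L} \le u(t) \le u^{U}$; CVP discretizes $u(t)$ by a finite set of parameters, turning the problem into an NLP.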

23 An Application in OCPs The first order sensitivity equations
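The equations are an image in the original. For a parameterized system $\dot x = f(x, p, t)$ with sensitivities $s_i = \partial x / \partial p_i$, the standard first order sensitivity system (an assumed reconstruction) reads

$$ \dot s_i = \frac{\partial f}{\partial x}\, s_i + \frac{\partial f}{\partial p_i}, \qquad s_i(t_0) = \frac{\partial x_0}{\partial p_i}. $$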

24 An Application in OCPs The second order sensitivity equations
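Again an image in the original; differentiating the first order system once more with respect to $p_j$ gives, for $r_{ij} = \partial^2 x / \partial p_i \partial p_j$ (standard form, assumed here),

$$ \dot r_{ij} = \frac{\partial f}{\partial x}\, r_{ij} + \frac{\partial^2 f}{\partial x^2}(s_i, s_j) + \frac{\partial^2 f}{\partial x\,\partial p_j}\, s_i + \frac{\partial^2 f}{\partial x\,\partial p_i}\, s_j + \frac{\partial^2 f}{\partial p_i\,\partial p_j}. $$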

25 An Application in OCPs The second order sensitivity equations

26 An Application in OCPs Optimal control problem

27 An Application in OCPs Truncated Newton method for the solution of the NLP
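The slide body is an image. The core of a truncated Newton (TN) method, as surveyed in the cited Nash reference: at each outer iterate $x_k$, the Newton system

$$ H(x_k)\, d_k = -\nabla J(x_k) $$

is solved only approximately by an inner conjugate gradient loop that is truncated early and requires nothing but Hessian-vector products $H(x_k)v$, never the full Hessian.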

28 An Application in OCPs Implementation Details
Step 1: Automatic derivation of the first and second order sensitivity equations to construct a full augmented IVP; creates the corresponding program subroutines in a format suitable for a standard IVP solver.
Step 2: Numerical solution of the outer NLP using a truncated-Newton method which solves bound-constrained problems.

29 An Application in OCPs Two approaches with the TN method
TN algorithm with finite difference scheme
-Gradient evaluation requires the solution of the first order sensitivity system.
-Gradient information is used to approximate the Hessian vector product with a finite difference scheme.
TN algorithm with exact Hessian vector product calculation
-Uses the second order sensitivity equations defined in Eq. (5a) to obtain the exact Hessian vector product. (Earlier methods of the CVP type were based on first order sensitivities only, i.e. mostly gradient-based algorithms.)
-This approach has been shown to be more robust and reliable due to the use of exact second order information.
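A minimal sketch (not the authors' implementation) contrasting the two variants, using SciPy's Newton-CG as a stand-in truncated-Newton driver and the Rosenbrock function as a stand-in objective; in the paper the exact product comes from integrating the second order sensitivity equations rather than from an analytic formula.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess_prod

def hessp_fd(x, p, eps=1e-7):
    """Hessian-vector product approximated by forward-differencing the
    gradient, mirroring the 'TN with finite difference scheme' approach."""
    return (rosen_der(x + eps * p) - rosen_der(x)) / eps

x0 = np.array([-1.2, 1.0])

# Variant 1: finite-difference Hessian-vector products.
res_fd = minimize(rosen, x0, method='Newton-CG', jac=rosen_der,
                  hessp=hessp_fd)

# Variant 2: exact Hessian-vector products (here analytic).
res_exact = minimize(rosen, x0, method='Newton-CG', jac=rosen_der,
                     hessp=rosen_hess_prod)

print(res_fd.x, res_fd.nit)
print(res_exact.x, res_exact.nit)
```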

30 Summary
Basics of derivatives
-Definition of derivatives
-Application of derivatives
-Methods to compute derivatives
Basics of AD
-Compute first order derivatives with forward mode
-Compute first order derivatives with reverse mode
Second Order Differentiation
-Compute second order derivatives with forward mode strategy
-Compute second order derivatives with Taylor Series strategy
-Hessian Performance
An Application in Optimal Control Problems
-First order and second order sensitivity equations of DAE systems
-Solve the optimal control problem with the CVP method
-Solve nonlinear programming problems with the truncated Newton method
-Truncated Newton method with exact Hessian vector product calculation

31 References
-Abate, Bischof, Roh, Carle, "Algorithms and Design for a Second-Order Automatic Differentiation Module".
-Eva Balsa-Canto, Julio R. Banga, Antonio A. Alonso, Vassilios S. Vassiliadis, "Restricted second order information for the solution of optimal control problems using control vector parameterization".
-Louis B. Rall, George F. Corliss, "An Introduction to Automatic Differentiation".
-Andreas Griewank, "Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation".
-Stephen G. Nash, "A Survey of Truncated-Newton Methods".