Numerical Comparison of Optimization Strategies for Solving a Nonzero Residual Nonlinear Least Squares Problem

Brenda Bueno
Advisors: Drs. L. Velázquez and M. Argáez
Department of Mathematical Sciences, University of Texas at El Paso
Sponsored by the Minority Access to Research Careers Program and the US Department of the Army (DAAD)

Introduction
An important research activity in global optimization is determining an effective strategy for solving the least squares problems that commonly arise in science and engineering. Our objectives are:
- To present a numerical comparison of optimization strategies applied to a nonzero residual problem.
- To introduce preliminary numerical results for a proposed novel algorithm that appears to work best.

Global Optimization Problem
[Panels: Types of minima; Global Minima for Least Squares Problems]

Nonzero Residual Problem
The nonlinear residual function is calculated from

    Residual = Calculated Data - Observed Data,

i.e. $R(w) = C(w) - O$, where $C(w)$ denotes the calculated (model) positions and $O$ the observed positions. We are interested in least squares problems where the objective function at the global minimum $w^*$ is never zero, i.e. $f(w^*) > 0$.

Problem: Data and Model (Hyperboloid Least Squares Problem)
Information given: the positions of 40 atoms corresponding to the selected beta sheets of the protein of interest,

    $[x_i, y_i, z_i], \quad i = 1, \ldots, 40$   (atom 1, atom 2, atom 3, ..., atom 40).

These measured positions form the observed data. The calculated data are produced by the hyperboloid model, in which $A$ is a 3x3 rotation matrix and the unknown parameters are collected in the vector $w$.

Our Goal
To find the global minimum $w^*$ of

    $f(w) = \tfrac{1}{2}\,\|R(w)\|^2 = \tfrac{1}{2}\sum_i r_i(w)^2$.

Algorithm
Given an initial vector $w_0$, a maximum number of iterations, and a tolerance, do the following:
1. Compute the gradient $g_k = \nabla f(w_k) = J(w_k)^T R(w_k)$, where $J$ is the Jacobian of the residual $R$.
2. Compute the Hessian, $H_k$, according to a chosen methodology:
   i)   Newton: $H_k = \nabla^2 f(w_k) = J(w_k)^T J(w_k) + \sum_i r_i(w_k)\,\nabla^2 r_i(w_k)$
   ii)  Gauss-Newton: $H_k = J(w_k)^T J(w_k)$
   iii) Levenberg-Marquardt: $H_k = J(w_k)^T J(w_k) + \lambda_k I$
   iv)  Finite-Difference: forward-difference approximation of $\nabla^2 f(w_k)$
3. Solve $H_k d_k = -g_k$.
4. Update $w_{k+1} = w_k + d_k$.
5. Check convergence: stop if the tolerance is met or the maximum number of iterations is reached.
6. Else, set $k \leftarrow k + 1$ and go to step 1.
(A Python sketch of this iteration is given at the end of the poster.)

Numerical Results
Initial point: $w_0$ = [1; 1; 1; 5; 8; 20; -10; -10; -12]
The strategies were compared on CPU time (in seconds), iterations for convergence, the approximated solution $w^*$, and whether the solution was accepted or rejected by the chemists:
- Newton's Method: solution rejected due to the large value of certain parameters.
- Newton's Method using a forward-difference approximation of the second derivative: solution rejected due to the large value of certain parameters.
- Gauss-Newton Method: solution rejected due to the large value of certain parameters.
- Levenberg-Marquardt (this technique does not require second-order information): lowest function value obtained (2.54e...); solution accepted.

Main Modification: Refinement Stage
Initial point: the corresponding $w^*$ from the Levenberg-Marquardt method. CPU time, iterations for convergence, and the refined solution $w^*$ were recorded for:
- Levenberg-Marquardt followed by Newton's Method.
- Levenberg-Marquardt followed by Newton's Method using forward differences: same solution as above.
(A SciPy sketch of this two-stage refinement is included at the end of the poster.)

Future Work
1) To add a multistart technique
2) To improve the rate of convergence of the algorithm
3) To include constraints on the variables
4) To test the algorithm on more problems

Acknowledgment
We thank E. Tolonen, S. Kulshreshtha, and B. Stec of the Macromolecular Crystallography Lab, Chemistry Department, UTEP, for providing the problem formulation and data and for reviewing the approximated solutions.

Contact Information
Brenda Bueno, Undergraduate Student
University of Texas at El Paso, Department of Mathematical Sciences
500 W. University Avenue, El Paso, Texas, USA
Phone: (915)    Fax: (915)
Office: Bell Hall, Room 215
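
To make the Algorithm panel concrete, the following Python sketch implements the generic iteration (gradient, Hessian choice, linear solve, update, convergence test) for an arbitrary residual function. It is a minimal sketch assuming only NumPy; the callables `residual` and `jacobian`, the function name `solve_least_squares`, and the fixed damping value are illustrative and are not taken from the poster, and the exact Newton Hessian is stood in for by the finite-difference option.

```python
# Minimal sketch of the iteration in the "Algorithm" panel, assuming only NumPy.
# The residual/jacobian callables and the damping value are illustrative; they are
# not the poster's hyperboloid model.
import numpy as np


def finite_difference_hessian(grad, w, h=1e-6):
    """Forward-difference approximation of the Hessian of f from its gradient."""
    n = w.size
    g0 = grad(w)
    H = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        H[:, j] = (grad(w + e) - g0) / h
    return 0.5 * (H + H.T)  # symmetrize


def solve_least_squares(residual, jacobian, w0, hessian="lm",
                        lam=1e-3, tol=1e-8, max_iter=200):
    """Steps 1-6 of the poster's algorithm for f(w) = 0.5 * ||R(w)||^2."""
    w = np.asarray(w0, dtype=float)
    grad = lambda x: jacobian(x).T @ residual(x)      # grad f = J^T R
    for _ in range(max_iter):
        r, J = residual(w), jacobian(w)
        g = J.T @ r                                   # step 1: gradient
        if hessian == "gn":                           # step 2, option ii (Gauss-Newton)
            H = J.T @ J
        elif hessian == "lm":                         # step 2, option iii (Levenberg-Marquardt)
            H = J.T @ J + lam * np.eye(w.size)
        else:                                         # step 2, option iv (finite differences,
            H = finite_difference_hessian(grad, w)    # also a stand-in for full Newton)
        d = np.linalg.solve(H, -g)                    # step 3: solve H d = -g
        w = w + d                                     # step 4: update
        if np.linalg.norm(grad(w)) <= tol:            # step 5: convergence test
            break                                     # step 6: otherwise iterate again
    r = residual(w)
    return w, 0.5 * float(r @ r)                      # minimizer and (nonzero) f(w*)
```

For example, with a residual of the form $R(w) = C(w) - O$, a Levenberg-Marquardt run would be `solve_least_squares(residual, jacobian, w0, hessian="lm")`; the returned pair is the approximate minimizer and the value $f(w^*)$, which stays positive for a nonzero residual problem.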
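
The refinement stage pairs Levenberg-Marquardt with a Newton-type polish started from the Levenberg-Marquardt solution. The sketch below reproduces that two-stage idea with SciPy on a small synthetic exponential fit; the model, noisy data, and initial guess are invented for illustration and are not the poster's 40-atom hyperboloid problem. Newton-CG is used for the second stage; since no analytic Hessian is supplied, SciPy approximates Hessian-vector products from the gradient by finite differences, which echoes the forward-difference Newton variant compared in the poster.

```python
# Two-stage refinement in the spirit of the "Refinement Stage" panel, using SciPy.
# The exponential model and synthetic data are illustrative only.
import numpy as np
from scipy.optimize import least_squares, minimize

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 40)
y_obs = 2.0 * np.exp(-1.3 * t) + 0.05 * rng.standard_normal(t.size)  # "observed data"


def residual(w):
    # Residual = Calculated Data - Observed Data
    return w[0] * np.exp(w[1] * t) - y_obs


def f(w):
    # Nonzero-residual objective: f(w) = 0.5 * ||R(w)||^2 stays positive at w*
    r = residual(w)
    return 0.5 * float(r @ r)


def grad_f(w):
    # Gradient J^T R, with J the Jacobian of the residual
    e = np.exp(w[1] * t)
    J = np.column_stack((e, w[0] * t * e))
    return J.T @ residual(w)


# Stage 1: Levenberg-Marquardt from a rough initial guess.
stage1 = least_squares(residual, x0=[1.0, -0.5], method="lm")

# Stage 2: Newton-type polish started from the Levenberg-Marquardt solution.
stage2 = minimize(f, stage1.x, jac=grad_f, method="Newton-CG")

print("LM solution:     ", stage1.x, "f =", f(stage1.x))
print("Refined solution:", stage2.x, "f =", f(stage2.x))
```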