Optimization

NOTE: Optimization has been superseded by Optimization MT (OPTMT). All new programs should be written using OPTMT.

Optimization minimizes user-provided functions. It has many features, including a wide selection of descent algorithms, step-length methods, and "on-the-fly" algorithm switching. Default selections permit you to use Optimization with a minimum of programming effort: you provide only the function to be optimized and starting values, and Optimization does the rest, as the sketch below shows.
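
For example, minimizing the two-parameter Rosenbrock test function takes only a few lines. This is a minimal sketch assuming the module's documented optmum interface (the optmum library, the optset reset procedure, and the four-return call); consult the module reference for details.

    library optmum;
    optset;

    /* Function to be minimized: the Rosenbrock test function */
    proc fct(x);
        retp( 100*(x[2] - x[1]^2)^2 + (1 - x[1])^2 );
    endp;

    /* Starting values; all other settings are left at their defaults */
    x0 = { -1.2, 1 };

    { x, f, g, retcode } = optmum(&fct, x0);

    print "minimizer: " x;
    print "minimum value: " f;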

Features

  • More than 25 options can be set by the user to control the optimization (see the example following this list)
  • Descent algorithms include: BFGS, DFP, Newton, steepest descent, and PRCG
  • Step-length methods include: STEPBT, BRENT, and a step-halving method
  • A "switching" method may also be selected; it changes the descent algorithm during the iterations according to two criteria: the number of iterations, or failure of the function to decrease within a tolerance

Improved Algorithm

Optimization implements the numerically superior Cholesky factorization, solve, and update methods for the BFGS, DFP, and Newton algorithms. The Hessian, or its estimate, is updated rather than the inverse of the Hessian, and the descent direction is computed with a solve. This yields better accuracy and improved convergence over previous methods.
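
The distinction matters numerically: rather than forming the inverse Hessian explicitly, the descent direction d is obtained by solving H d = -g through a Cholesky factorization. The following sketch illustrates the idea with GAUSS's built-in invpd and cholsol functions; it is not the module's internal code.

    /* A small positive definite Hessian estimate and a gradient */
    H = { 4 1,
          1 3 };
    g = { 1, 2 };

    d_inverse = -invpd(H)*g;     /* explicit inverse: extra rounding error */
    d_solve   = -cholsol(g, H);  /* factor-and-solve: cheaper and more accurate */

    print d_inverse ~ d_solve;   /* the two directions agree */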

Platform: Windows, Mac, and Linux.

Requirements: GAUSS/GAUSS Light version 8.0 or higher.