# Constrained Optimization MT

**Constrained Optimization MT (COMT)** solves nonlinear programming problems subject to general constraints on the parameters (linear or nonlinear, equality or inequality), using the Sequential Quadratic Programming method in combination with several user-selectable descent methods.

**COMT's** ability to handle general nonlinear functions and nonlinear constraints, along with features such as the trust region method, allows you to solve a wide range of sophisticated optimization problems. Built on the speed and number-crunching ability of the **GAUSS** platform, **COMT** quickly computes solutions to large problems, making it ideal for large-scale Monte Carlo or bootstrap optimizations.

## Version 2.0 is easier to use than ever!

- New syntax options eliminate the need for PV and DS structures:
  - Decreasing the required code by up to 25%.
  - Decreasing runtime by up to 20%.
  - Simplifying usage.

- Optional dynamic arguments make it simple and transparent to add extra data arguments beyond model parameters to your objective function.
- Updated documentation and examples.
- Fully backwards compatible with **COMT** 1.0.

## Key Features

### Descent methods

- BFGS (Broyden, Fletcher, Goldfarb and Shanno)
- Modified BFGS
- DFP (Davidon, Fletcher and Powell)
- Newton-Raphson

### Line search methods

- Augmented trust region method
- STEPBT
- Brent’s method
- Half-step
- Strong Wolfe’s conditions

### Constraint types

- Linear equality and inequality constraints
- Nonlinear equality and inequality constraints
- Bounded parameters
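
As a sketch, the constraint types above are typically supplied through a COMT control struct. The member names below (`A`, `B`, `C`, `D`, `bounds`) are assumptions carried over from the related sqpSolveMT-style control structs, so verify them against your COMT reference before use:

```
//Sketch only: control-struct member names (A, B, C, D, bounds)
//are assumed from the sqpSolveMT family; verify in the COMT reference
struct comtControl ctl;
ctl = comtControlCreate();

//Linear equality constraints of the form A*x = B
ctl.A = { 1 1 0 0 };
ctl.B = { 2 };

//Linear inequality constraints of the form C*x >= D
ctl.C = { 0 0 1 1 };
ctl.D = { 0.5 };

//Parameter bounds: one row per parameter, lower bound | upper bound
ctl.bounds = { 0 10,
               0 10,
               0 10,
               0 10 };
```

Nonlinear constraints follow the same pattern, with the control struct carrying pointers to user-written procedures that return the constraint values.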

## Advantages

### Flexible

- Specify fixed and free parameters.
- Dynamic algorithm switching.
- Compute all, a subset, or none of the derivatives numerically.
- Easily pass data other than the model parameters as extra input arguments.

### Efficient

- Threaded and thread-safe
- Option to avoid computations that are the same for the objective function and derivatives.
- The tremendous speed of user-defined procedures in GAUSS accelerates your optimization problems.

### Trusted

- For more than 30 years, leading researchers have trusted the efficient and numerically sound code in the GAUSS optimization packages to keep them at the forefront of their fields.

## Details

| Control option | Description |
|---|---|
| Linear equality constraints | Optional, simple-to-specify linear equality constraints. |
| Linear inequality constraints | Optional, simple-to-specify linear inequality constraints. |
| Nonlinear equality constraints | Option to provide a procedure to compute nonlinear equality constraints. |
| Nonlinear inequality constraints | Option to provide a procedure to compute nonlinear inequality constraints. |
| Parameter bounds | Simple parameter bounds of the type: lower_bd ≤ x_i ≤ upper_bd. |
| Feasible test | Controls whether parameters are checked for feasibility during the line search. |
| Trust radius | Set the size of the trust radius, or turn off the trust region method. |
| Descent algorithms | BFGS, modified BFGS, DFP and Newton. |
| Algorithm switching | Specify descent algorithms to switch between, based upon the number of elapsed iterations, a minimum change in the objective function, or the line search step size. |
| Line search method | Augmented Lagrangian penalty, STEPBT (quadratic and cubic curve fit), Brent's method, half-step, or strong Wolfe conditions. |
| Active parameters | Control which parameters are active (to be estimated) and which should be fixed at their start values. |
| Gradient method | Either compute an analytical gradient, or have COMT compute a numerical gradient using the forward, central, or backward difference method. |
| Hessian method | Either compute an analytical Hessian, or have COMT compute a numerical Hessian using the forward, central, or backward difference method. |
| Gradient check | Compares the analytical gradient computed by the user-supplied function against the numerical gradient to check the analytical gradient for correctness. |
| Random seed | Starting seed value used by the random line search method to allow for repeatable code. |
| Print output | Controls whether (and how often) iteration output is printed and whether a final report is printed. |
| Gradient step | Advanced feature: controls the increment size for computing the step size for numerical first and second derivatives. |
| Random search radius | The radius of the random search, if attempted. |
| Maximum iterations | Maximum number of iterations allowed for convergence. |
| Maximum elapsed time | Maximum number of minutes allowed for convergence. |
| Maximum random search attempts | Maximum allowed number of random line search attempts. |
| Convergence tolerance | Convergence is achieved when the direction vector changes by less than this amount. |
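
Most of these options live on the COMT control struct, which is set up once and passed as the final argument to the solver. A hypothetical sketch follows; the member names (`maxIters`, `dirTol`, `trustRadius`) and the placeholder names `&myObjective` and `x0` are assumptions modeled on the related sqpSolveMT control struct, not confirmed COMT API:

```
//Sketch only: member names are assumptions based on the
//sqpSolveMT family; confirm them in the COMT reference
struct comtControl ctl;
ctl = comtControlCreate();

ctl.maxIters = 500;       //Cap the number of iterations
ctl.dirTol = 1e-6;        //Convergence tolerance on the direction vector
ctl.trustRadius = 0.1;    //Size of the trust radius

//'&myObjective' and 'x0' are placeholders for your own
//objective procedure and starting parameter vector
call comtPrt(comt(&myObjective, x0, ctl));
```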

## Examples

Below is the code to compute a simple quadratic programming problem.

```
//Make comt library available
library comt;

//Create data needed by objective procedure
Q = { 0.78 -0.02 -0.12 -0.14,
     -0.02  0.86 -0.04  0.06,
     -0.12 -0.04  0.72 -0.08,
     -0.14  0.06 -0.08  0.74 };

b = { 0.76, 0.08, 1.12, 0.68 };

//Objective procedure with 4 inputs:
//     i. x - The parameter vector
//ii-iii. Q and b - Extra data needed by the objective procedure
//    iv. ind - The indicator vector
proc qfct(x, Q, b, ind);

    //Declare 'mm' to be a modelResults
    //struct local to this procedure, 'qfct'
    struct modelResults mm;

    //If the first element of the indicator
    //vector is non-zero, compute function value
    //and assign it to the 'function' member
    //of the modelResults struct
    if ind[1];
        mm.function = 0.5*x'*Q*x - x'*b;
    endif;

    //Return modelResults struct
    retp(mm);
endp;

//Starting parameter values
x_strt = { 1, 1, 1, 1 };

//Minimize objective function
//and print results
call comtPrt(comt(&qfct, x_strt, Q, b));
```

The above code will produce the following report:

```
============================= COMT Version 2.0.1 =============================

Return code    = 0
Function value = -2.17466
Convergence    : normal convergence

Parameters   Estimates   Gradient
-----------------------------------
x[1,1]          1.5350     0.0000
x[2,1]          0.1220     0.0000
x[3,1]          1.9752     0.0000
x[4,1]          1.4130     0.0000

Number of iterations     16
Minutes to convergence   0.00005
```
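
To add simple parameter bounds to the same problem, one variant might look like the sketch below. The `bounds` member name is an assumption carried over from the related sqpSolveMT control struct; check it against the COMT reference:

```
//Sketch: 'bounds' member name assumed from the sqpSolveMT family
struct comtControl ctl;
ctl = comtControlCreate();

//Lower and upper bound for each of the 4 parameters
ctl.bounds = { 0 1.5,
               0 1.5,
               0 1.5,
               0 1.5 };

//Pass the control struct as the final argument
call comtPrt(comt(&qfct, x_strt, Q, b, ctl));
```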

**Platform:** Windows, Mac, and Linux

**Requirements:** GAUSS/GAUSS Light version 16 or higher