## Constrained Optimization MT

**COMT** solves the nonlinear programming problem subject to general constraints on the parameters (linear or nonlinear, equality or inequality) using the *Sequential Quadratic Programming* method in combination with several descent methods selectable by the user:

- *Newton-Raphson*
- *quasi-Newton (BFGS and DFP)*
- *Scaled quasi-Newton*

There are also several selectable line search methods. A *Trust Region* method is also available, which prevents saddle point solutions. Gradients can be user-provided or numerically calculated. **COMT** is fast and can handle large, time-consuming problems because it takes advantage of the speed and number-crunching capabilities of GAUSS. It is thus ideal for large-scale Monte Carlo or bootstrap simulations.

**New Features**

- Internally threaded functions
- Uses structures
- Improved algorithm
- Allows computing a subset of the derivatives analytically, and combining the calculation of the function and derivatives, thus reducing computations shared by the function and its derivatives

**Threading in COMT**

If you have a multi-core processor, you can take advantage of COMT's internally threaded functions. Threading is especially valuable when derivatives are computed numerically, where it significantly decreases computation time.

**Example:**

We ran a time trial of a covariance-structure model on a quad-core machine. As is the case for most real world problems, not all sections of the code are able to be run in parallel. Therefore, the theoretical limit for speed increase is much less than (single-threaded execution time)/(number of cores).

Even so, the execution time of our program was cut dramatically:

Single-threaded execution time: 35.42 minutes

Multi-threaded execution time: 11.79 minutes

That is a threefold (3x) speedup, cutting execution time by two-thirds!

**The DS Structure**

**COMT** uses the DS and **PV** structures that are available in the GAUSS Run-Time Library.

The DS structure is completely flexible, allowing you to pass anything you can think of into your procedure. There is a member of the structure for every GAUSS data type.

```
struct DS {
    scalar type;
    matrix dataMatrix;
    array dataArray;
    string dname;
    string array vnames;
};
```
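As a minimal sketch, a DS instance can be created with the Run-Time Library constructor `dsCreate` and filled before being passed to your procedure (the data file and variable names here are hypothetical):

```
// Pack data into a DS instance for the objective procedure
struct DS d0;
d0 = dsCreate();                        // initialize with default values

d0.dataMatrix = loadd("returns.dat");   // hypothetical data file
d0.dname = "returns";
d0.vnames = "tbills" $| "bonds" $| "stocks";   // string array of variable names
```

Because every GAUSS data type has a member, any mix of matrices, arrays, and strings your procedure needs can travel in a single DS argument.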

**The PV Structure**

The PV structure revolutionizes how you pass the parameters into the procedure. No longer do you have to struggle to get the parameter vector into matrices for calculating the function and its derivatives, trying to remember, or figure out, which parameter is where in the vector.

If your log-likelihood uses matrices or arrays, you can store them directly into the PV structure and remove them as matrices or arrays with the parameters already plugged into them. The PV structure can handle matrices and arrays in which some of their elements are fixed and some free. It remembers the fixed parameters and knows where to plug in the current values of the free parameters. It can also handle symmetric matrices in which parameters below the diagonal are repeated above the diagonal.

For example, a GARCH model's parameters might be packed under descriptive names:

- `garch` - GARCH parameters.
- `arch` - ARCH parameters.
- `omega` - Constant in variance equation.
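A sketch of packing these GARCH parameters by name and retrieving them inside the objective procedure, using the Run-Time Library functions `pvCreate`, `pvPack`, and `pvUnpack` (the starting values and the procedure body are illustrative assumptions, not COMT's own code):

```
struct PV p0;
p0 = pvCreate();

p0 = pvPack(p0, 0.1, "omega");      // constant in the variance equation
p0 = pvPack(p0, 0.1|0.1, "arch");   // ARCH parameters, packed as a 2x1 vector
p0 = pvPack(p0, 0.8, "garch");      // GARCH parameter

// Inside the log-likelihood procedure, unpack by name; the current
// free-parameter values are already plugged in
proc (1) = lpr(struct PV p, struct DS d);
    local omega, arch, garch, lnl;
    omega = pvUnpack(p, "omega");
    arch  = pvUnpack(p, "arch");
    garch = pvUnpack(p, "garch");
    lnl = 0;    // compute the GARCH log-likelihood from d.dataMatrix here
    retp(lnl);
endp;
```

No bookkeeping of where each parameter sits in a flat vector is required; the PV structure keeps track of that for you.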

There is no longer any need to use global variables. Anything the procedure needs can be passed into it through the DS structure. These applications use control structures rather than global variables. This means, in addition to thread safety, that it is straightforward to nest calls to COMT inside a call to COMT, QNewtonmt, QProgmt, or EQsolvemt.

**Example:**

A Markowitz mean/variance portfolio allocation analysis on a thousand or more securities would be an example of a large scale problem COMT could handle.

COMT also contains a special technique for semi-definite problems, and thus it will solve the Markowitz portfolio allocation problem for a thousand stocks even when the covariance matrix is computed on fewer observations than there are securities.

Because COMT handles general nonlinear functions and constraints, it can solve a more general problem than the Markowitz problem. The efficient frontier is essentially a quadratic programming problem: the Markowitz mean/variance portfolio allocation model is solved for a range of expected portfolio returns, which are then plotted against the portfolio risk measured as the standard deviation:

$$\min_{w}\; w'\Sigma w \quad \text{subject to} \quad w'\ell = 1,\qquad w'\mu = \mu_0$$

where $\ell$ is a conformable vector of ones, $\Sigma$ is the observed covariance matrix of the returns of a portfolio of securities, and $\mu$ are their observed means.

This model is solved for the optimal weights $w^{*}$ over a range of target returns $\mu_0$, and the efficient frontier is the plot of $\mu_0$ on the vertical axis against

$$\sigma_0 = \sqrt{w^{*\prime}\,\Sigma\, w^{*}}$$

on the horizontal axis. The portfolio weights in $w^{*}$ describe the optimum distribution of portfolio resources across the securities given the trade-off of risk to return one considers reasonable.

Because of COMT's ability to handle nonlinear constraints, more elaborate models may be considered. For example, this model frequently concentrates the allocation into a minority of the securities. To spread out the allocation, one could solve the problem subject to a maximum variance for the weights, i.e., subject to

$$w'w \le \kappa$$

where $\kappa$ is a constant setting a ceiling on the sum of squares of the weights.
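A sketch of how this constrained Markowitz setup might look in COMT. The objective procedure returns the portfolio variance $w'\Sigma w$, the two linear equalities are stacked into the `A`/`B` control members, and the nonlinear ceiling $w'w \le \kappa$ is supplied as an inequality procedure. The control member names follow the conventions of the MT-series applications and may differ by COMT version, and `sigma`, `mu`, `mu0`, and the value of $\kappa$ are illustrative assumptions; consult the COMT manual for the exact calling convention.

```
k  = 3;                          // number of securities (illustrative)
sigma = eye(k);                  // placeholder covariance matrix Σ
mu = 0.05|0.07|0.10;             // placeholder mean returns μ
mu0 = 0.07;                      // target portfolio return μ0

struct DS d0;
d0 = dsCreate();
d0.dataMatrix = sigma;           // pass Σ to the procedures through DS

struct PV p0;
p0 = pvCreate();
p0 = pvPack(p0, ones(k,1)/k, "w");   // start from equal weights

// Objective: portfolio variance w'Σw
proc (1) = pvar(struct PV p, struct DS d);
    local w;
    w = pvUnpack(p, "w");
    retp(w' * d.dataMatrix * w);
endp;

// Nonlinear inequality g(w) >= 0 enforcing w'w <= κ (here κ = 0.8)
proc (1) = ineqp(struct PV p, struct DS d);
    local w;
    w = pvUnpack(p, "w");
    retp(0.8 - w' * w);
endp;

struct comtControl c0;
c0 = comtControlCreate();
c0.A = ones(1,k) | mu';          // linear equalities A*w = B:
c0.B = 1 | mu0;                  //   w'l = 1 and w'μ = μ0
c0.ineqProc = &ineqp;            // nonlinear inequality constraint

// out = comt(&pvar, p0, d0, c0);   // exact signature varies by COMT version
```

Re-solving over a grid of `mu0` values and plotting the resulting standard deviations traces out the constrained efficient frontier.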

This data was taken from Harry S. Marmer and F.K. Louis Ng, "Mean-Semivariance Analysis of Option-Based Strategies: A Total Asset Mix Perspective", Financial Analysts Journal, May-June 1993.

An unconstrained analysis produced the results below:

It can be observed that the optimal portfolio weights are highly concentrated in T-bills.

Now let us constrain $w'w$ to be less than, say, 0.8. We then get:

The constraint does indeed spread out the weights across the categories; in particular, stocks receive more emphasis.

**Efficient portfolio for these analyses**

We see that, for any given portfolio return, the constrained portfolio is everywhere riskier than the unconstrained portfolio.

In summary, COMT is well-suited for a variety of financial applications from the ordinary to the highly sophisticated, and the speed of GAUSS makes large and time-consuming problems feasible.

COMT is an advanced GAUSS Application and comes as GAUSS source code.

GAUSS Applications are modules written in GAUSS for performing specific modeling and analysis tasks. They are designed to minimize or eliminate the need for user programming while maintaining the flexibility for non-standard problems.

**Platform:** Windows, Mac, and Linux.

**Requirements:** GAUSS/GAUSS Light version 10 or higher; Linux requires version 10.0.4 or higher.