When using the `cmlmt` library to estimate a very complex model with many parameters, I get the warning message `The covariance of the parameters failed to invert`.

How do I avoid this type of error?

## 1 Answer


This warning indicates a linear dependency in the Hessian that must be inverted to compute the covariance matrix of the parameters. It could be a scaling issue, in which case the Hessian is poorly calculated and might even fail to be positive definite; the solution there is to scale the data.

It could also be a lack of information in the data for one or more of the parameters. In regression models, this emerges as multicollinearity.

First, retrieve the computed Hessian:

`h = out.hessian;`

Then compute the eigenvalues:

`print eigh(h);`

If there are any negative eigenvalues, the Hessian is not positive definite and is being poorly calculated. The only solutions here are to scale the data or to improve the calculations in the log-likelihood procedure.
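The same check can be sketched outside GAUSS with NumPy; `eigvalsh` plays the role of `eigh` for a symmetric matrix. The Hessian below is hypothetical, chosen to have one negative eigenvalue:

```python
import numpy as np

# Hypothetical 3x3 Hessian with a curvature problem in one direction.
h = np.array([[4.0, 1.0, 0.5],
              [1.0, 2.0, 0.3],
              [0.5, 0.3, -0.1]])

# eigvalsh is the NumPy analogue of GAUSS's eigh for symmetric matrices.
eigs = np.linalg.eigvalsh(h)
print(eigs)

# Any negative eigenvalue means the Hessian is not positive definite,
# so the covariance of the parameters cannot be formed from it.
print(bool((eigs < 0).any()))  # True
```
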

If the smallest eigenvalue is zero, two or more parameters are linear functions of each other, either because of the data or because of the model. A method for determining which parameters are involved can be found in the last section of this paper. First, obtain the pivoted QR factorization of the Hessian with `qre`:

`{ R, E } = qre(h);`

Inspect the diagonal of `R`. The dependencies will be pivoted to the end, and the number of diagonal entries equal, or very nearly equal, to zero is the number of linear dependencies. Suppose that number is 1, and that the number of parameters, i.e., the number of rows in `R`, is `k`. Then compute

`B = inv(R[1:k-1,1:k-1]) * R[1:k-1,k];`

The `k-1` by 1 matrix `B` describes the relationship of the linearly dependent parameter to the remaining parameters. The `E` matrix is a list of the row numbers in `R`, and in `B`, of the parameters: the column of `B` is associated with the final element of `E`, and the rows of `B` are associated with elements 1 through `k-1` of `E`.

Thus, suppose the final element of `E` is 5, say, that the `B` vector has one nonzero element in its third row, and that the third element of `E` is 7. Then we conclude that parameter 5 is linearly related to parameter 7, and these two parameters cannot be estimated at the same time.

There are two basic tactics for solving this problem. First, you might delete one of these parameters from the model, or at least fix one of them to some value. Second, you could impose a constraint on them, an equality constraint perhaps, forcing them to be equal to each other. Either way, you are estimating one parameter rather than two.
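A minimal sketch of the first tactic, using the same kind of hypothetical data (in `cmlmt` itself you would fix or constrain the parameter through its constraint settings rather than edit the data): dropping the redundant regressor restores a full-rank, invertible Hessian.

```python
import numpy as np

# Hypothetical design matrix: the 4th regressor is exactly twice
# the 2nd, so the full Hessian x.T @ x is singular.
x = np.array([[1.0, 2.0, 0.5, 4.0],
              [1.0, 1.0, 1.5, 2.0],
              [1.0, 3.0, 0.7, 6.0],
              [1.0, 0.5, 2.0, 1.0],
              [1.0, 1.5, 0.2, 3.0]])
print(np.linalg.matrix_rank(x.T @ x))  # 3: rank-deficient 4x4

# Delete the redundant 4th parameter from the model.
x_red = x[:, :3]
h_red = x_red.T @ x_red
print(np.linalg.matrix_rank(h_red))  # 3: full rank

cov = np.linalg.inv(h_red)  # the inversion no longer fails
```
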

