Same results with the same optimization procedure


Hello,

I am running a minimization procedure using optmum with a maximum likelihood (ML) function in the context of a Kalman filter model. When I get the outputs, I use them to compute the covariance matrix by inverting the Hessian. Then I take the square roots of the diagonal to get the standard errors.

The problem is that if I run the code several times I get different values for the standard errors. This is a bit confusing.

I copy parts of my code below; maybe they help.

1. the optimization call:

```
_opgtol = 0.00001;
{ xout, fout, gout, cout } = optmum(&lik_fcn, PRMTR_IN);
```

2. the inversion of the hessian:

```
COV = invpd(_opfhess);
```

3. standard error:

```
SD_fnl = sqrt(diag(COV));
```


What is the condition number of the Hessian? d = log(cond(_opfhess)) is an approximate measure of the number of decimal digits lost in computing an inverse. If the Hessian in _opfhess is computed numerically, about 8 digits are already lost in computing the Hessian itself. So if d is greater than 8, you have lost all accuracy in the standard errors. That also indicates a catastrophic loss of precision in the calculations, and under that condition the Hessian calculation can vary with very small differences in rounding errors.
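A minimal sketch of this check, assuming _opfhess still holds the Hessian after the run (note that GAUSS's log is base 10, which is what makes d a count of decimal digits; the threshold 8 follows the reasoning above):

```
// Approximate number of decimal digits lost when inverting the Hessian.
// cond() returns the 2-norm condition number; log() is base 10 in GAUSS.
d = log(cond(_opfhess));
print "Approximate digits lost inverting the Hessian: " d;

// A numerical Hessian already costs roughly 8 of the ~16 double-precision
// digits, so d > 8 means the standard errors carry no remaining accuracy.
if d > 8;
    print "Warning: standard errors are numerically meaningless.";
endif;
```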


I checked the condition number of the Hessian and it is 4.4.

But in any case, I don't think a rounding problem would cause the standard errors of all the coefficients to change, or even all become equal in some cases.

Maybe in my previous post I was not very clear, but the point is that I am running the same code several times on the same data and I get different values for the standard errors of the estimated coefficients.

I believe it is a data problem. When I omit the first 16 iterations of my Kalman filter loop, this problem disappears.

However, the problem I have now is that for different data vectors (different countries but the same variable) the code does not converge.

Could you maybe give me an idea of which underlying problems there may be when convergence is not achieved?

Sorry if my question is too general.


What is happening is that when the main optimization methods in Optmum get stuck, a random search is used. The random search is what is causing the differences in the standard errors. If you set the random number seed with the rndseed command, you will get the same answer from run to run.
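A sketch of that fix applied to the call from the original post; the seed value 45612 here is arbitrary (any fixed integer works):

```
// Fix the random number seed so optmum's random search, and hence the
// resulting standard errors, are reproducible from run to run.
rndseed 45612;

_opgtol = 0.00001;
{ xout, fout, gout, cout } = optmum(&lik_fcn, PRMTR_IN);
```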

