Hi,

Does anyone know what "backsteps" means exactly when you use MaxLik for optimization? Consider the following example:

```
iteration: 49
algorithm: BFGS     step method: STEPBT
function: 7.00563   step length: 0.00995   backsteps: 23
```

Does it mean the optimization procedure cannot find a path to the minimum? I assume it means the procedure has tried 23 times without finding a step that reduces the objective function. So my question is: what exactly happens before each backstep? Does it select a random direction and try to find a step length that reduces the function?

Thank you in advance.

## 1 Answer


It doesn't mean a failure to converge, but it does suggest that the optimization is having trouble getting there. At each iteration, after a new set of parameter changes (called a "direction") is computed, a line search is conducted to find a constant step length that moves the objective function closer to convergence. The number of backsteps is, as you suspect, the number of attempts in that line search. The maximum number of attempts is 100. If the maximum is reached, the optimization continues without the line search and will usually be able to recover. But if the iterations consistently require a large number of line-search attempts, there may be some sort of problem with the optimization. 90% of the time this can be resolved by scaling the data. There may also be a loss of precision in the function calculation: it is important that the function avoid producing very large numbers alongside very small numbers.
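The line search described above can be sketched as a simple backtracking loop. This is a minimal illustration of the general technique, not MaxLik's actual implementation; the function names, the halving factor, and the acceptance test are all assumptions for the sake of the example:

```python
# Hypothetical sketch of a backtracking line search. Each failed trial
# step is one "backstep"; the counter reported in the iteration log is
# the number of times the step had to be shrunk before the objective
# improved (or the cap of 100 was hit).

def backtrack_line_search(f, x, direction, step=1.0, shrink=0.5, max_backsteps=100):
    """Shrink the step until f decreases; return (new point, step used, backsteps)."""
    f0 = f(x)
    backsteps = 0
    while backsteps < max_backsteps:
        trial = [xi + step * di for xi, di in zip(x, direction)]
        if f(trial) < f0:       # found a step length that improves the objective
            return trial, step, backsteps
        step *= shrink          # "backstep": shrink the step and try again
        backsteps += 1
    # Cap reached: continue the optimization without the line search,
    # taking the full direction (as the answer says MaxLik does).
    return [xi + di for xi, di in zip(x, direction)], 1.0, backsteps

# Example: minimize f(x) = x^2 from x = 3, moving in direction -6.
x_new, step, n = backtrack_line_search(lambda v: v[0] ** 2, [3.0], [-6.0])
```

A consistently high backstep count in this sketch would mean the full-length steps overshoot badly, which is exactly the symptom that poor data scaling produces in practice.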
