# BRENT and STEPBT in Optimization 3.1

I want to be very clear about these algorithms in a paper I'm writing. The documentation says BRENT is a variation of a golden section method described in Brent (1972); I have the 2002 reissue. Are we talking about Section 5.4? The algorithm described there is a mixture of golden section and parabolic interpolation steps. Is that the one? I wish I remembered some of the ALGOL I studied (poorly) in 1964 or 1965.
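For cross-checking, the golden-section/parabolic-interpolation hybrid from Brent's book is also available in SciPy; this is SciPy's implementation of that published algorithm, not GAUSS's BRENT, so treat any comparison as a sanity check only:

```python
from scipy.optimize import minimize_scalar

# Minimize a simple unimodal function with Brent's golden-section /
# successive-parabolic-interpolation method.
res = minimize_scalar(lambda x: (x - 2.0) ** 2 + 1.0, method="brent")
print(res.x)  # near the true minimizer x = 2
```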

STEPBT references Dennis and Schnabel. I am inferring that "BT" stands for "backtrack" from the module called LINESEARCH in Algorithm A6.3.1. Is that it?
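For reference, Algorithm A6.3.1 in Dennis and Schnabel is a backtracking line search along a descent direction. A minimal Armijo-style sketch (my own simplification, not GAUSS's exact STEPBT, which adds safeguarded polynomial interpolation) looks like:

```python
import numpy as np

def backtrack(f, x, g, p, alpha=1.0, rho=0.5, c=1e-4, max_iter=50):
    """Shrink the step length alpha until the Armijo sufficient-decrease
    condition f(x + alpha*p) <= f(x) + c*alpha*g'p holds."""
    fx, slope = f(x), np.dot(g, p)
    for _ in range(max_iter):
        if f(x + alpha * p) <= fx + c * alpha * slope:
            break
        alpha *= rho
    return alpha

# Toy example: f(x) = x'x, starting at x = (2, 2) with p = -gradient.
f = lambda x: np.dot(x, x)
x = np.array([2.0, 2.0])
g = 2.0 * x
alpha = backtrack(f, x, g, -g)
```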

I am working on ML estimation of the Generalized Beta 2 (GB2) distribution. It has only 4 parameters, but the ML problem is made difficult by ill-conditioning when p is approximately equal to q. Algorithm choice turns out to be very important.
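For context, here is my formulation of the GB2 density, in the common parameterization with scale `b` and shapes `a`, `p`, `q`, together with a negative log-likelihood suitable for a numerical optimizer (a sketch, not the exact code used in the estimation):

```python
import numpy as np
from scipy.special import betaln

def gb2_logpdf(y, a, b, p, q):
    """Log-density of GB2(a, b, p, q):
    f(y) = a*y^(a*p-1) / (b^(a*p) * B(p,q) * (1 + (y/b)^a)^(p+q)),  y > 0."""
    z = np.log(y) - np.log(b)
    return (np.log(a) + (a * p - 1.0) * np.log(y) - a * p * np.log(b)
            - betaln(p, q) - (p + q) * np.logaddexp(0.0, a * z))

def gb2_negloglik(theta, y):
    a, b, p, q = theta
    if min(a, b, p, q) <= 0.0:
        return np.inf  # keep the optimizer inside the parameter space
    return -np.sum(gb2_logpdf(y, a, b, p, q))
```

Using `logaddexp` for `log(1 + (y/b)^a)` avoids overflow in the right tail.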

Thanks,

Carter Hill


You could change `B(p,q)` to `B(q+sqrt((d + delta)^2), q)`, redefining `p = q + d`, where `delta` is fixed in your likelihood procedure to some small number, say 1e-4. Because `q + sqrt((d + delta)^2) = q + |d + delta|`, this keeps `p - q` nonnegative while letting the optimizer treat `d` as unconstrained.
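As a quick illustration of that reparameterization (in Python rather than GAUSS, with a toy objective standing in for the GB2 negative log-likelihood), optimize over `(d, q)` and map `d` back to `p` inside the objective:

```python
import numpy as np
from scipy.optimize import minimize

DELTA = 1e-4  # the fixed small number from the post

def p_from(d, q):
    # p = q + sqrt((d + DELTA)^2) = q + |d + DELTA|, so p - q >= 0 by construction
    return q + np.sqrt((d + DELTA) ** 2)

# Toy objective in (p, q) whose unconstrained minimum (p = 1, q = 2) violates p >= q.
def objective(theta):
    d, q = theta
    p = p_from(d, q)
    return (p - 1.0) ** 2 + (q - 2.0) ** 2

# Nelder-Mead tolerates the kink in |d + DELTA| at d = -DELTA.
res = minimize(objective, x0=np.array([0.5, 1.0]), method="Nelder-Mead")
d_hat, q_hat = res.x
p_hat = p_from(d_hat, q_hat)  # p_hat >= q_hat holds automatically
```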

Alternatively, SQPsolvemt or CMLmt could be used to impose `p - q` greater than a small number as an explicit constraint.
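The same idea expressed as an explicit inequality constraint, shown here with SciPy's SLSQP as a stand-in for SQPsolvemt/CMLmt (again with a toy objective in place of the GB2 likelihood):

```python
import numpy as np
from scipy.optimize import minimize

DELTA = 1e-4

# Toy objective in (p, q); the real use would be the GB2 negative log-likelihood.
objective = lambda theta: (theta[0] - 1.0) ** 2 + (theta[1] - 2.0) ** 2

res = minimize(
    objective,
    x0=np.array([2.0, 1.0]),
    method="SLSQP",
    # Enforce p - q >= DELTA directly as an inequality constraint.
    constraints=[{"type": "ineq", "fun": lambda t: t[0] - t[1] - DELTA}],
)
p_hat, q_hat = res.x
```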

I am not sure about the algorithm; I will see whom I can ask about that.

aptech


