function J = GetJ(a,b,x)
    randn('seed',a);
    phi_a = abs(sum(randn(1,1e6)));
    randn('seed',b);
    phi_b = abs(sum(randn(1,1e6)));
    J = -(phi_a*x^2 + phi_b*sqrt(1-x^2));
end
a = 121233; b = 31235;
fhnd_GetJ = @(x) GetJ(a,b,x);
The fhnd_GetJ variable now contains an anonymous function handle that takes one parameter and computes GetJ(a,b,x) for the values of a and b that were fixed when the handle was created. You can go directly to the minimization. Say we want to find the optimal value of x to within one millionth:

opt = optimset('TolX',1e-6);
optimal_x = fminbnd(fhnd_GetJ,0,1,opt);
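As a quick sanity check, the handle can be called like an ordinary function; a and b are baked into it (the test point 0.5 is arbitrary):

fhnd_GetJ(0.5)    % same value as GetJ(a,b,0.5)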
fminbnd(fun,x_min,x_max) minimizes a scalar function of a scalar argument on the interval [x_min; x_max]; here fun is the handle of the function being optimized. During the optimization, GetJ is called 12 times; you can see this by setting the 'Display' option to 'iter', which prints every iteration. On my computer the optimization takes about 10 ms on average.
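For example, a sketch of how to switch on the iteration display (both option names are standard optimset settings):

opt_verbose = optimset('TolX',1e-6,'Display','iter');
optimal_x = fminbnd(fhnd_GetJ,0,1,opt_verbose);    % prints one line per iteration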
In the GetJ function, the calculations of phi_a and phi_b are completely wasted effort: they do not change as x varies and depend only on the given values of a and b. How can we save this work? The variant that most often comes to mind is to move the preliminary calculations out of the objective function. We make such a function:

function J = GetJ2(phi_a,phi_b,x)
    J = -(phi_a*x^2 + phi_b*sqrt(1-x^2));
end
randn('seed',a); phi_a = abs(sum(randn(1,1e6)));
randn('seed',b); phi_b = abs(sum(randn(1,1e6)));
fhnd_GetJ2 = @(x) GetJ2(phi_a,phi_b,x);
optimal_x = fminbnd(fhnd_GetJ2,0,1,opt);
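To check the speed-up yourself, a rough timing sketch with tic/toc (the exact numbers will vary from machine to machine):

tic; optimal_x = fminbnd(fhnd_GetJ,0,1,opt);  t_slow = toc;   % phi_a, phi_b recomputed on every call
tic; optimal_x = fminbnd(fhnd_GetJ2,0,1,opt); t_fast = toc;   % phi_a, phi_b computed once, up front
fprintf('GetJ: %.1f ms, GetJ2: %.1f ms\n', 1e3*t_slow, 1e3*t_fast);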
This works, but the calculation of phi_a and phi_b has no independent value: it is not needed apart from the calculation of J. If somewhere else you again need J(a,b,x), no longer for optimization but just as a one-off value, then instead of simply calling GetJ you have to drag the calculation of phi_a and phi_b along with you, or keep separate functions, one for optimization and one for ordinary calculations. Just not very beautiful.
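For instance, a hypothetical one-off evaluation elsewhere in the program would look like this with GetJ2 (the point 0.3 is arbitrary); the preliminary calculations have to follow the call around:

% one-off evaluation of J(a,b,0.3) somewhere else in the code
randn('seed',a); phi_a = abs(sum(randn(1,1e6)));
randn('seed',b); phi_b = abs(sum(randn(1,1e6)));
J_once = GetJ2(phi_a,phi_b,0.3);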
MATLAB allows functions to be nested inside one another. Let's rewrite the GetJ function this way:

function J = GetJ3(a,b,x)
    randn('seed',a);
    phi_a = abs(sum(randn(1,1e6)));
    randn('seed',b);
    phi_b = abs(sum(randn(1,1e6)));
    J = nf_GetJ(x);
    function out_val = nf_GetJ(x)
        out_val = -(phi_a*x^2 + phi_b*sqrt(1-x^2));
    end
end
The nested function nf_GetJ sees all the variables in the scope of the parent function, and it is still clear what the code does and how. So far we have gained nothing in speed: optimization still takes the same 10 ms.
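As an aside, a minimal standalone sketch of this scoping rule (demo_nested and nf_triple are illustrative names, not from the article):

function demo_nested
    k = 3;                    % lives in the parent function's workspace
    disp(nf_triple(2));       % prints 6
    function y = nf_triple(x)
        y = k*x;              % k is visible here without being passed in
    end
end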
But the function can return a handle to its own nested function. And when you call through this handle, the parent function body is not executed again, yet its scope, with the already computed parameters, is preserved! We write:

function fhnd_J = GetJ4(a,b)
    randn('seed',a);
    phi_a = abs(sum(randn(1,1e6)));
    randn('seed',b);
    phi_b = abs(sum(randn(1,1e6)));
    fhnd_J = @(x) nf_GetJ(x);
    function out_val = nf_GetJ(x)
        out_val = -(phi_a*x^2 + phi_b*sqrt(1-x^2));
    end
end
GetJ4 returns the handle fhnd_J, which lets us calculate J for the given parameters a and b without recomputing phi_a and phi_b, using instead the values computed when the handle was created. Our optimization now looks like this:

fhnd_GetJ4 = GetJ4(a,b);
optimal_x = fminbnd(fhnd_GetJ4,0,1,opt);
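The same handle can now serve for plain one-off evaluations as well, again with no recomputation of phi_a and phi_b (0.5 is an arbitrary test point):

J_once = fhnd_GetJ4(0.5);    % uses the phi_a, phi_b captured when GetJ4 ran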
We get the same speed gain as with GetJ2, but we have preserved the integrity of the function and the convenience of its use.

Source: https://habr.com/ru/post/199300/