1. Choose an optimization solver.
2. Create an objective function, typically the function you want to minimize.
3. Create constraints, if any.
4. Set options, or use the default options.
5. Call the appropriate solver.

For a basic nonlinear optimization example, see Solve a Constrained Nonlinear Problem.
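A minimal sketch of these five steps with an unconstrained solver (the quadratic objective and starting point here are hypothetical, chosen only to illustrate the workflow):

% Step 1: choose a solver -- fminunc, since this sketch has no constraints
% Step 2: create the objective function (hypothetical quadratic)
objective = @(x) (x(1) - 3)^2 + 2*(x(2) + 1)^2;
% Step 3: no constraints for this sketch
% Step 4: set options, or omit them to use the defaults
opts = optimoptions('fminunc','Display','final');
% Step 5: call the solver
[x,fval] = fminunc(objective,[0 0],opts)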
Nonlinear constraints take the form $c(x) \leq 0$ for inequalities and $ceq(x) = 0$ for equalities.
This example includes only an inequality constraint, so the constraint function must return an empty array [] for the equality constraint ceq.
function [c,ceq] = unitdisk(x)
c = x(1)^2 + x(2)^2 - 1;
ceq = [];

Note: Rosenbrock's function is a standard test function in optimization. It has a unique minimum value of 0, attained at the point [1,1]. Finding the minimum is a challenge for some algorithms because the function has a shallow minimum inside a deeply curved valley. The solution for this problem is not at the point [1,1], because that point does not satisfy the constraint.
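The objective passed to fmincon below is Rosenbrock's function; its standard definition as an anonymous function is shown here for completeness (the variable name rosenbrock matches the call that follows):

rosenbrock = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % minimum value 0 at [1,1]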
Set options to display each iteration and to use the interior-point algorithm:

options = optimoptions(@fmincon,...
    'Display','iter','Algorithm','interior-point');
Run the solver:

[x,fval] = fmincon(rosenbrock,[0 0],...
    [],[],[],[],[],[],@unitdisk,options)

The six sets of empty brackets are placeholders for the optional linear constraints and bounds (A, b, Aeq, beq, lb, ub), which this example does not use. See the fmincon function reference page for the full syntax.
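For reference, a sketch of the same call with each placeholder annotated (the comments are added here and are not part of the original example):

% fmincon(fun, x0, A, b, Aeq, beq, lb, ub, nonlcon, options)
[x,fval] = fmincon(rosenbrock, [0 0], ...
    [], ...        % A:   linear inequality matrix, A*x <= b
    [], ...        % b:   linear inequality vector
    [], ...        % Aeq: linear equality matrix, Aeq*x = beq
    [], ...        % beq: linear equality vector
    [], ...        % lb:  lower bounds on x
    [], ...        % ub:  upper bounds on x
    @unitdisk, options);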
MATLAB outputs a table of iterations and the results of the optimization.
Local minimum found that satisfies the constraints.
Optimization completed because the objective function is non-decreasing in
feasible directions, to within the selected value of the function tolerance,
and constraints are satisfied to within the selected value of the constraint tolerance.
x =

    0.7864    0.6177

fval =

    0.0457
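As a quick check (not part of the original example), evaluate the constraint function at the returned point; c comes back close to zero, confirming that the solution lies on the boundary of the unit disk:

[c,ceq] = unitdisk(x)   % c is essentially zero, so the constraint is active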
This example shows how to use two nonlinear optimization solvers, fminunc and fmincon, and how to set options. The workflow is much the same as in the Rosenbrock example above, but this example covers additional ground.
All the principles outlined in this example apply to the other nonlinear solvers, such as fgoalattain, fminimax, lsqnonlin, lsqcurvefit, and fsolve.
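As an illustration of the shared workflow (not part of the original example), the same optimoptions pattern applies to lsqnonlin, with Rosenbrock's function rewritten in least-squares form:

% Rosenbrock as a residual vector: the sum of squares of resid(x)
% equals 100*(x(2)-x(1)^2)^2 + (1-x(1))^2
resid = @(x) [10*(x(2) - x(1)^2); 1 - x(1)];
opts = optimoptions('lsqnonlin','Display','iter');
[x,resnorm] = lsqnonlin(resid,[-1.9 2],[],[],opts);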
Define the objective as a function of two variables and plot it:

f = @(x,y) x.*exp(-x.^2-y.^2)+(x.^2+y.^2)/20;
fsurf(f,[-2,2],'ShowContours','on')

The solvers minimize over a single vector argument, so define the function for the optimization accordingly:
fun = @(x) f(x(1),x(2));
x0 = [-.5; 0];
options = optimoptions('fminunc','Algorithm','quasi-newton');
options.Display = 'iter';
[x, fval, exitflag, output] = fminunc(fun,x0,options);
You can compute the gradient of f symbolically using Symbolic Math Toolbox:

>> syms x y
>> f(x,y)

ans =

x*exp(- x^2 - y^2) + x^2/20 + y^2/20

>> gradient(f(x,y),[x,y])

ans =

x/10 + exp(- x^2 - y^2) - 2*x^2*exp(- x^2 - y^2)
y/10 - 2*x*y*exp(- x^2 - y^2)
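To make use of this gradient, supply an objective that returns both the function value and the gradient, and tell the solver that a gradient is available. A minimal sketch (the deal-based handle and the option settings are illustrative, not from the original example; 'SpecifyObjectiveGradient' requires R2016a or later, with 'GradObj' as the older equivalent):

% Objective returning [value, gradient]; the gradient is copied from the
% symbolic result above
fungrad = @(x) deal( ...
    x(1)*exp(-x(1)^2-x(2)^2) + (x(1)^2+x(2)^2)/20, ...
    [x(1)/10 + exp(-x(1)^2-x(2)^2) - 2*x(1)^2*exp(-x(1)^2-x(2)^2); ...
     x(2)/10 - 2*x(1)*x(2)*exp(-x(1)^2-x(2)^2)]);
options = optimoptions('fminunc', ...
    'Algorithm','trust-region', ...            % trust-region uses the gradient
    'SpecifyObjectiveGradient',true);
[x,fval] = fminunc(fungrad,[-.5;0],options);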