
Lecture plan

  1. Parallel computing basics
  2. Optimization

Optimization Toolbox => Tutorials => Solver based:
User's Guide: https://se.mathworks.com/help/releases/R2019a/pdf_doc/optim/optim_tb.pdf
 
  1. Choose an optimization solver.
  2. Create an objective function, typically the function you want to minimize.
  3. Create constraints, if any.
  4. Set options, or use the default options.
  5. Call the appropriate solver.

For a basic nonlinear optimization example, see Solve a Constrained Nonlinear Problem; a minimal sketch of the workflow is given below.
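
As a minimal sketch of these five steps, assuming fminunc and a simple illustrative quadratic objective (not from the tutorial):

   % 1. Solver: fminunc (unconstrained). 2. Objective to minimize:
   fun = @(x) (x(1)-3)^2 + (x(2)+1)^2;
   % 3. No constraints apply for fminunc. 4. Options: defaults plus iterative display:
   options = optimoptions('fminunc','Display','iter');
   % 5. Call the solver from a starting point:
   [x,fval] = fminunc(fun,[0 0],options)   % expect x ~ [3 -1], fval ~ 0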

Objective function

Rosenbrock's banana function: $$r(x,y)=100(y - x^2)^2 + (1-x)^2$$

   rosenbrock = @(x) 100*(x(:,2) - x(:,1).^2).^2 + (1 - x(:,1)).^2;
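
A quick sanity check of the handle: evaluating at the unconstrained minimum should return 0.

   rosenbrock([1 1])   % 100*(1 - 1^2)^2 + (1 - 1)^2 = 0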

Constraints: nonlinear inequalities $c(x) \leq 0$ and nonlinear equalities $c_{eq}(x) = 0$.

This example includes only an inequality constraint, so you must pass an empty array [] as the equality constraint function ceq.

function [c,ceq] = unitdisk(x)
% Nonlinear inequality c <= 0: restrict x to the closed unit disk
c = x(1)^2 + x(2)^2 - 1;
ceq = [];   % no equality constraints
end
Note: Rosenbrock's function is a standard test function in optimization. It has a unique minimum value of 0 attained at the point [1,1]. Finding the minimum is a challenge for some algorithms because the function has a shallow minimum inside a deeply curved valley. The solution for this problem is not at the point [1,1] because that point does not satisfy the constraint.
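
To see why [1,1] is infeasible here, evaluate the constraint there (a sanity check, not from the tutorial):

   [c,ceq] = unitdisk([1 1])   % c = 1 > 0, so [1,1] violates c(x) <= 0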

Run the optimization

  1. Optimization app (not for us)
  2. Command line (yes)

Minimize Rosenbrock's Function at the Command Line

  1. Create options that select iterative display and the interior-point algorithm.
         options = optimoptions(@fmincon,...
        'Display','iter','Algorithm','interior-point');
        
  2. Run the fmincon solver with the options structure, reporting both the location x of the minimizer and the value fval attained by the objective function.
        [x,fval] = fmincon(rosenbrock,[0 0],...
        [],[],[],[],[],[],@unitdisk,options)
        
    The six sets of empty brackets are the unused optional arguments A, b, Aeq, beq, lb, and ub (linear constraints and bounds). See the fmincon function reference page for the syntax; an equivalent spelled-out call follows.
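
    For readability, the unused arguments can be named explicitly; this call is equivalent to the one above, and the variable names match fmincon's documented signature:

        A = []; b = []; Aeq = []; beq = [];   % no linear constraints
        lb = []; ub = [];                     % no bounds
        [x,fval] = fmincon(rosenbrock,[0 0],A,b,Aeq,beq,lb,ub,@unitdisk,options)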

    MATLAB outputs a table of iterations and the results of the optimization.
    Local minimum found that satisfies the constraints.
    Optimization completed because the objective function is non-decreasing in feasible directions, to within the selected value of the function tolerance, and constraints are satisfied to within the selected value of the constraint tolerance.

    x =
        0.7864    0.6177
    fval =
        0.0457
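
    As a quick check (an observation, not part of the tutorial output), the minimizer lies on the boundary of the unit disk, so the inequality constraint is active:

        norm(x)^2   % approximately 1, i.e. x(1)^2 + x(2)^2 ~ 0.9999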
    

Tutorial for the Optimization Toolbox

Optimization Toolbox => Tutorials => Solver based
Open live script

This example shows how to use two nonlinear optimization solvers, fminunc and fmincon, and how to set options. (Much the same as the Rosenbrock example above, but it covers more.)

All the principles outlined in this example apply to the other nonlinear solvers, such as fgoalattain, fminimax, lsqnonlin, lsqcurvefit, and fsolve.

Unconstrained Optimization Example

Minimize: $$ x e^{-(x^2 + y^2)} + (x^2+y^2)/20 $$
   f = @(x,y) x.*exp(-x.^2-y.^2)+(x.^2+y.^2)/20;
   fsurf(f,[-2,2],'ShowContours','on')
  
Function definition for the optimizer (fminunc expects a function of a single vector argument):
 
   fun = @(x) f(x(1),x(2));
   x0 = [-.5; 0];
   options = optimoptions('fminunc','Algorithm','quasi-newton');
   options.Display = 'iter';
   [x, fval, exitflag, output] = fminunc(fun,x0,options);
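
Since the trailing semicolon suppresses output, the results can be inspected afterwards; for fminunc, exitflag 1 means the gradient magnitude fell below the optimality tolerance:

   x          % local minimizer found from x0
   fval       % objective value at x
   exitflag   % 1: first-order optimality satisfied to tolerance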
  
>> syms x y
>> f(x,y) 
ans = 
x*exp(- x^2 - y^2) + x^2/20 + y^2/20 
>> gradient(f(x,y),[x,y]) 
ans = 
 x/10 + exp(- x^2 - y^2) - 2*x^2*exp(- x^2 - y^2)
                    y/10 - 2*x*y*exp(- x^2 - y^2)
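
The symbolic gradient can be supplied to fminunc so it need not estimate derivatives numerically. A minimal sketch, assuming a helper file funWithGrad.m (the file name is illustrative; 'SpecifyObjectiveGradient' is the fminunc option that tells the solver the objective returns both value and gradient, and the trust-region algorithm requires it):

   % funWithGrad.m -- objective value plus the gradient computed above
   function [fval,grad] = funWithGrad(x)
   fval = x(1)*exp(-x(1)^2-x(2)^2) + (x(1)^2+x(2)^2)/20;
   grad = [x(1)/10 + exp(-x(1)^2-x(2)^2) - 2*x(1)^2*exp(-x(1)^2-x(2)^2);
           x(2)/10 - 2*x(1)*x(2)*exp(-x(1)^2-x(2)^2)];
   end

   options = optimoptions('fminunc','Algorithm','trust-region', ...
       'SpecifyObjectiveGradient',true);
   [x,fval] = fminunc(@funWithGrad,[-.5;0],options);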
  

Parallel tutorial: task functions

SKIP THIS (the example file myparalleltutorialtaskfunctions.m is no longer shipped in the 2019 release):

%% Writing Task Functions
% In this example, we look at two common cases when we might want to write a
% wrapper function for the Parallel Computing Toolbox(TM). Those wrapper
% functions will be our task functions and will allow us to use the toolbox in
% an efficient manner. The particular cases are:
%
% * We want one task to consist of calling a nonvectorized function
%   multiple times.
% * We want to reduce the amount of data returned by a task.
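
Even so, the idea behind task functions is easy to sketch. A minimal illustration, assuming a hypothetical wrapper named runManyAndReduce (both the name and the min-reduction are illustrative choices):

   function out = runManyAndReduce(fun, xvals)
   % Task function: call a nonvectorized FUN once per row of XVALS and
   % return only a scalar summary, reducing data sent back from the worker.
   n = size(xvals,1);
   y = zeros(n,1);
   for k = 1:n
       y(k) = fun(xvals(k,:));
   end
   out = min(y);   % reduced result instead of the full vector
   end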

Minimizing an Expensive Optimization Problem Using Parallel Computing Toolbox
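
As a preview of that example's mechanism, a minimal sketch reusing rosenbrock and unitdisk from above; 'UseParallel' makes fmincon estimate its finite-difference gradients in parallel (this assumes Parallel Computing Toolbox and a worker pool):

   parpool;   % start a pool of workers if one is not already running
   options = optimoptions('fmincon','UseParallel',true, ...
       'Display','iter','Algorithm','interior-point');
   [x,fval] = fmincon(rosenbrock,[0 0],[],[],[],[],[],[],@unitdisk,options);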