Gradient-based minimization of an inexpensive scalar function
- Driver: ScipyMinimizer
Download script: scipy_minimization.m
The target of the study is to minimize a scalar function. The function is assumed to be inexpensive to evaluate (i.e. an evaluation takes less than a second) and to have known derivatives. In this case, a global optimization can be performed by running gradient-based local optimizations from different initial points. We start independent minimizations from six initial points (num_initial=6) and allow two parallel evaluations of the objective function (num_parallel=2).
As an example, the 2D Rastrigin function on a circular domain is minimized,
\[\begin{aligned}
&\text{min.} && f(x_1,x_2) = 2\cdot 10 + \sum_{i=1,2} \left(x_i^2 - 10\cos(2\pi x_i)\right)\\
&\text{s.t.} && \sqrt{x_1^2 + x_2^2} \leq 1.5.
\end{aligned}\]
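Conceptually, the multistart strategy described above can be sketched directly with SciPy (a standalone Python illustration of the same problem, independent of the jcmoptimizer client; function names and the random seed are chosen for illustration only):

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, minimize

def rastrigin(x):
    """2D Rastrigin objective from the problem statement."""
    return 20.0 + np.sum(x**2 - 10.0*np.cos(2*np.pi*x))

def rastrigin_grad(x):
    """Analytic gradient: 2*x_i + 20*pi*sin(2*pi*x_i)."""
    return 2*x + 20*np.pi*np.sin(2*np.pi*x)

# Circular domain: sqrt(x1^2 + x2^2) <= 1.5
circle = NonlinearConstraint(lambda x: np.hypot(x[0], x[1]), -np.inf, 1.5)

rng = np.random.default_rng(seed=1)
results = []
for _ in range(6):  # num_initial = 6 independent local minimizations
    x0 = rng.uniform(-1.5, 1.5, size=2)
    res = minimize(rastrigin, x0, jac=rastrigin_grad,
                   method='SLSQP', constraints=[circle])
    results.append(res)

# The best local optimum found serves as the global estimate
best = min(results, key=lambda r: r.fun)
print(best.x, best.fun)
```

Each SLSQP run converges to a nearby local minimum of the multimodal Rastrigin function; taking the best of the six runs approximates the global minimum at the origin.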
server = jcmoptimizer.Server();
client = jcmoptimizer.Client('host', server.host);

% Definition of the search domain
design_space = { ...
    struct('name', 'x1', 'type', 'continuous', 'domain', [-1.5,1.5]), ...
    struct('name', 'x2', 'type', 'continuous', 'domain', [-1.5,1.5]) ...
};

% Definition of a fixed environment parameter
environment = {...
    struct('name', 'radius', 'type', 'fixed', 'domain', 1.5) ...
};

% Definition of a constraint on the search domain
constraints = {...
    struct('name', 'circle', 'expression', 'sqrt(x1^2 + x2^2) <= radius')...
};

% Creation of the study object with study_id 'scipy_minimization'
study = client.create_study( ...
    'design_space', design_space, ...
    'environment', environment, ...
    'constraints', constraints,...
    'driver','ScipyMinimizer',...
    'study_name','Gradient-based minimization of an inexpensive scalar function',...
    'study_id', 'scipy_minimization');

study.configure('max_iter', 30, 'num_initial', 6, 'num_parallel', 2, ...
    'jac', true, 'method', 'SLSQP');

% Evaluation of the black-box function for specified design parameters
function observation = evaluate(study, sample)

    observation = study.new_observation();
    % Objective: 2*10 + sum_i (x_i^2 - 10*cos(2*pi*x_i))
    observation.add(10*2 ...
        + (sample.x1^2 - 10*cos(2*pi*sample.x1)) ...
        + (sample.x2^2 - 10*cos(2*pi*sample.x2)) ...
    );
    % Analytic partial derivatives: 2*x_i + 20*pi*sin(2*pi*x_i)
    observation.add(2*sample.x1 + 20*pi*sin(2*pi*sample.x1), ...
        'derivative', 'x1');
    observation.add(2*sample.x2 + 20*pi*sin(2*pi*sample.x2), ...
        'derivative', 'x2');

end

% Run the minimization
study.set_evaluator(@evaluate);
study.run();

best = study.driver.best_sample;
fprintf('Best sample at x1=%0.3e, x2=%0.3e\n', best.x1, best.x2);
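The analytic derivatives supplied in the evaluate function can be cross-checked against central finite differences. A minimal standalone check (Python/NumPy, not part of the example script; the test point is arbitrary):

```python
import numpy as np

def f(x):
    # Objective: 20 + sum_i (x_i^2 - 10*cos(2*pi*x_i))
    return 20.0 + np.sum(x**2 - 10.0*np.cos(2*np.pi*x))

def grad(x):
    # d/dx_i [x_i^2 - 10*cos(2*pi*x_i)] = 2*x_i + 20*pi*sin(2*pi*x_i)
    return 2*x + 20*np.pi*np.sin(2*np.pi*x)

x = np.array([0.3, -0.7])
eps = 1e-6
# Central finite differences along each coordinate
fd = np.array([
    (f(x + eps*np.eye(2)[i]) - f(x - eps*np.eye(2)[i])) / (2*eps)
    for i in range(2)
])
print(np.abs(fd - grad(x)).max())
```

The maximum deviation between analytic and numerical gradients should be on the order of the finite-difference truncation error.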