Gradient-based minimization of an inexpensive scalar function
- Driver: ScipyMinimizer
- Download script:
The goal of the study is to minimize a scalar function. The function is assumed to be inexpensive to evaluate (i.e. each evaluation takes less than a second) and to have known derivatives. In this case, a global optimization can be performed by a set of gradient-based local minimizations started at different initial points. We start independent minimizations from six initial points (num_initial=6) and allow two parallel evaluations of the objective function (num_parallel=2).
As an example, the 2D Rastrigin function on a circular domain is minimized,
\[
\begin{aligned}
&\text{min.} && f(x_1,x_2) = 2\cdot 10 + \sum_{i=1,2} \left(x_i^2 - 10\cos(2\pi x_i)\right)\\
&\text{s.t.} && \sqrt{x_1^2 + x_2^2} \leq 1.5.
\end{aligned}
\]
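Before turning to the optimizer client, the objective and its analytic gradient can be cross-checked with a single local SLSQP run in plain SciPy. The following is a minimal sketch and not part of the example script; the starting point is an arbitrary choice, and the circular constraint is rewritten in the g(x) >= 0 form that SLSQP expects.

import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    # f(x1, x2) = 2*10 + sum_i (x_i^2 - 10*cos(2*pi*x_i))
    return 2*10 + np.sum(x**2 - 10*np.cos(2*np.pi*x))

def rastrigin_grad(x):
    # df/dx_i = 2*x_i + 20*pi*sin(2*pi*x_i)
    return 2*x + 20*np.pi*np.sin(2*np.pi*x)

# Circular domain sqrt(x1^2 + x2^2) <= 1.5, expressed as g(x) >= 0
circle = {'type': 'ineq', 'fun': lambda x: 1.5 - np.sqrt(x[0]**2 + x[1]**2)}

# One local minimization from an arbitrary (hypothetical) starting point
result = minimize(rastrigin, x0=np.array([1.0, -0.8]), jac=rastrigin_grad,
                  method='SLSQP', constraints=[circle])
print(result.x, result.fun)

The complete example script using the optimizer client follows.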
import sys, os
import numpy as np
import time

from jcmoptimizer import Server, Client, Study, Observation
server = Server()
client = Client(host=server.host)

# Definition of the search domain
design_space = [
    {'name': 'x1', 'type': 'continuous', 'domain': (-1.5, 1.5)},
    {'name': 'x2', 'type': 'continuous', 'domain': (-1.5, 1.5)},
]

# Definition of a fixed environment parameter
environment = [
    {'name': 'radius', 'type': 'fixed', 'domain': 1.5},
]

# Definition of a constraint on the search domain
constraints = [
    {'name': 'circle', 'expression': 'sqrt(x1^2 + x2^2) <= radius'}
]

# Creation of the study object with study_id 'scipy_minimization'
study = client.create_study(
    design_space=design_space,
    environment=environment,
    constraints=constraints,
    driver="ScipyMinimizer",
    study_name="Gradient-based minimization of an inexpensive scalar function",
    study_id="scipy_minimization"
)

# Configure the study: 6 initial points, 2 parallel objective evaluations,
# SLSQP local searches using analytic gradients (jac=True)
study.configure(max_iter=80, num_parallel=2, num_initial=6, jac=True, method="SLSQP")

# Evaluation of the black-box function for specified design parameters
def evaluate(study: Study, x1: float, x2: float, radius: float) -> Observation:

    observation = study.new_observation()
    # Rastrigin objective value
    observation.add(
        10*2
        + (x1**2 - 10*np.cos(2*np.pi*x1))
        + (x2**2 - 10*np.cos(2*np.pi*x2))
    )
    # Partial derivatives df/dx_i = 2*x_i + 20*pi*sin(2*pi*x_i)
    observation.add(2*x1 + 20*np.pi*np.sin(2*np.pi*x1), derivative="x1")
    observation.add(2*x2 + 20*np.pi*np.sin(2*np.pi*x2), derivative="x2")
    return observation

# Run the minimization
study.set_evaluator(evaluate)
study.run()
best = study.driver.best_sample
print(f"Best sample at: x1={best['x1']:.3f}, x2={best['x2']:.3f}")