Gradient-based minimization of a non-expensive scalar function
- Driver: ScipyMinimizer
The goal of the study is to minimize a scalar function. The function is assumed to be inexpensive to evaluate (i.e. evaluation time shorter than a second) and to have known derivatives. In this case a global optimization can be performed by running a set of gradient-based local optimizations starting at different initial points. We start independent minimizations from six initial points (num_initial=6) and allow for two parallel evaluations of the objective function (num_parallel=2).
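The multi-start strategy can be sketched with plain SciPy, independent of the optimization server (this is an illustrative sketch, not the driver's implementation; the Rastrigin objective and its analytic gradient are written out explicitly, and the circular constraint is expressed in the squared form `radius² − x·x ≥ 0` to avoid the non-differentiable square root at the origin):

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    """2D Rastrigin function; global minimum f(0, 0) = 0."""
    return 2*10 + np.sum(x**2 - 10*np.cos(2*np.pi*x))

def rastrigin_grad(x):
    """Analytic gradient: d/dx_i [x_i^2 - 10 cos(2 pi x_i)] = 2 x_i + 20 pi sin(2 pi x_i)."""
    return 2*x + 20*np.pi*np.sin(2*np.pi*x)

radius = 1.5
# Squared form of the circular constraint avoids the sqrt singularity at x = 0
cons = [{'type': 'ineq', 'fun': lambda x: radius**2 - x @ x}]

rng = np.random.default_rng(0)
results = []
for _ in range(6):  # six initial points, as with num_initial=6
    x0 = rng.uniform(-radius, radius, size=2)  # random start in the bounding box
    res = minimize(rastrigin, x0, jac=rastrigin_grad,
                   method="SLSQP", constraints=cons)
    results.append(res)

best = min(results, key=lambda r: r.fun)
print(f"best f={best.fun:.4f} at x1={best.x[0]:.3f}, x2={best.x[1]:.3f}")
```

Each local SLSQP run converges to a nearby local minimum of the Rastrigin function; taking the best of the six runs approximates the global search that the driver performs (the six starts run sequentially here, without the parallel evaluation the server provides).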
As an example, the 2D Rastrigin function on a circular domain is minimized,
\[ \begin{aligned}&\text{min.} && f(x_1,x_2) = 2\cdot10 + \sum_{i=1,2} \left(x_i^2 - 10\cos(2\pi x_i)\right)\\&\text{s.t.} && \sqrt{x_1^2 + x_2^2} \leq 1.5.\end{aligned} \]
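As a quick sanity check of the formula (a standalone NumPy sketch, not part of the example script), the global minimum lies at the origin, where both the function value and its gradient vanish:

```python
import numpy as np

def f(x1, x2):
    # 2D Rastrigin function as defined above
    return 2*10 + (x1**2 - 10*np.cos(2*np.pi*x1)) + (x2**2 - 10*np.cos(2*np.pi*x2))

def grad(x1, x2):
    # d/dx_i [x_i^2 - 10 cos(2 pi x_i)] = 2 x_i + 20 pi sin(2 pi x_i)
    return np.array([2*x1 + 20*np.pi*np.sin(2*np.pi*x1),
                     2*x2 + 20*np.pi*np.sin(2*np.pi*x2)])

print(f(0.0, 0.0))     # 0.0 at the global minimum
print(grad(0.0, 0.0))  # [0. 0.]
```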
import numpy as np

from jcmoptimizer import Client, Study, Observation
client = Client()

# Definition of the search domain
design_space = [
    {'name': 'x1', 'type': 'continuous', 'domain': (-1.5, 1.5)},
    {'name': 'x2', 'type': 'continuous', 'domain': (-1.5, 1.5)},
]

# Definition of a fixed environment parameter
environment = [
    {'name': 'radius', 'type': 'fixed', 'domain': 1.5},
]

# Definition of a constraint on the search domain
constraints = [
    {'name': 'circle', 'expression': 'sqrt(x1^2 + x2^2) <= radius'}
]

# Creation of the study object with study_id 'scipy_minimization'
study = client.create_study(
    design_space=design_space,
    environment=environment,
    constraints=constraints,
    driver="ScipyMinimizer",
    study_name="Gradient-based minimization of a non-expensive scalar function",
    study_id="scipy_minimization"
)

# Configure the study: up to 80 iterations, two parallel evaluations,
# six independent initial points, analytic Jacobian, SLSQP as local method
study.configure(max_iter=80, num_parallel=2, num_initial=6, jac=True, method="SLSQP")

# Evaluation of the black-box function for specified design parameters
def evaluate(study: Study, x1: float, x2: float, radius: float) -> Observation:

    observation = study.new_observation()
    # Rastrigin function value
    observation.add(10*2
        + (x1**2 - 10*np.cos(2*np.pi*x1))
        + (x2**2 - 10*np.cos(2*np.pi*x2))
    )
    # Partial derivatives: d/dx_i [x_i^2 - 10 cos(2 pi x_i)] = 2 x_i + 20 pi sin(2 pi x_i)
    observation.add(2*x1 + 20*np.pi*np.sin(2*np.pi*x1),
                    derivative="x1")
    observation.add(2*x2 + 20*np.pi*np.sin(2*np.pi*x2),
                    derivative="x2")
    return observation

# Run the minimization
study.set_evaluator(evaluate)
study.run()
best = study.driver.best_sample
print(f"Best sample at: x1={best['x1']:.3f}, x2={best['x2']:.3f}")

client.shutdown_server()
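Since the driver is supplied analytic derivatives (jac=True), it is good practice to verify them against finite differences before a run. A minimal sketch using `scipy.optimize.check_grad` on a standalone copy of the objective (not the `evaluate` function above, which needs a running study):

```python
import numpy as np
from scipy.optimize import check_grad

def f(x):
    # 2D Rastrigin function
    return 2*10 + np.sum(x**2 - 10*np.cos(2*np.pi*x))

def grad(x):
    # Analytic gradient: 2 x_i + 20 pi sin(2 pi x_i)
    return 2*x + 20*np.pi*np.sin(2*np.pi*x)

rng = np.random.default_rng(1)
for _ in range(5):
    x = rng.uniform(-1.5, 1.5, size=2)
    # check_grad returns the norm of the difference between the analytic
    # gradient and a forward finite-difference approximation
    err = check_grad(f, grad, x)
    assert err < 1e-4, f"gradient mismatch at {x}: {err}"
print("analytic gradient matches finite differences")
```

A mistake in the returned derivatives (e.g. a missing prefactor) typically makes the SLSQP line search stall or converge to spurious points, so this check is cheap insurance.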