Global minimization of an inexpensive scalar function
- Driver: DifferentialEvolution
The goal of this study is to minimize a scalar function. The function is assumed to be inexpensive to evaluate (i.e. a single evaluation takes less than a second) and to have no known derivatives. In this case, a heuristic global optimization method is advisable. Differential evolution (DE) is an evolutionary optimization algorithm inspired by the mutation, crossover, and selection processes occurring in nature.
As an example, the 2D Rastrigin function on a circular domain is minimized,
\[
\begin{aligned}
\text{min.}\quad & f(x_1, x_2) = 2\cdot 10 + \sum_{i=1,2} \left(x_i^2 - 10\cos(2\pi x_i)\right)\\
\text{s.t.}\quad & \sqrt{x_1^2 + x_2^2} \leq 1.5.
\end{aligned}
\]
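Before walking through the jcmoptimizer script, the following minimal sketch illustrates the DE approach on the same constrained problem using SciPy's general-purpose differential_evolution routine. SciPy is used here purely for illustration and is not part of the jcmoptimizer API; the objective and constraint simply mirror the problem statement above.

import numpy as np
from scipy.optimize import NonlinearConstraint, differential_evolution

def rastrigin(x: np.ndarray) -> float:
    # 2D Rastrigin function as defined above
    return 2*10 + float(np.sum(x**2 - 10*np.cos(2*np.pi*x)))

# Circular domain sqrt(x1^2 + x2^2) <= 1.5, expressed as a nonlinear constraint
circle = NonlinearConstraint(lambda x: np.hypot(x[0], x[1]), 0.0, 1.5)

result = differential_evolution(
    rastrigin,
    bounds=[(-1.5, 1.5), (-1.5, 1.5)],
    constraints=circle,
)
print(result.x, result.fun)  # should approach the global minimum f(0, 0) = 0

The remainder of this example performs the same minimization through the jcmoptimizer client and the DifferentialEvolution driver: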
import sys, os
import numpy as np
import time

from jcmoptimizer import Server, Client, Study, Observation
server = Server()
client = Client(host=server.host)

# Definition of the search domain
design_space = [
    {'name': 'x1', 'type': 'continuous', 'domain': (-1.5, 1.5)},
    {'name': 'x2', 'type': 'continuous', 'domain': (-1.5, 1.5)},
]

# Definition of fixed environment parameter
environment = [
    {'name': 'radius', 'type': 'fixed', 'domain': 1.5},
]

# Definition of a constraint on the search domain
constraints = [
    {'name': 'circle', 'expression': 'sqrt(x1^2 + x2^2) <= radius'}
]

# Creation of the study object with study_id 'differential_evolution'
study = client.create_study(
    design_space=design_space,
    environment=environment,
    constraints=constraints,
    driver="DifferentialEvolution",
    study_name="Global minimization of an inexpensive scalar function",
    study_id="differential_evolution"
)

# Configure study parameters
study.configure(max_iter=80, num_parallel=2)

# Evaluation of the black-box function for specified design parameters
def evaluate(study: Study, x1: float, x2: float, radius: float) -> Observation:
    observation = study.new_observation()
    observation.add(
        10*2
        + (x1**2 - 10*np.cos(2*np.pi*x1))
        + (x2**2 - 10*np.cos(2*np.pi*x2))
    )
    return observation

# Run the minimization
study.set_evaluator(evaluate)
study.run()
best = study.driver.best_sample
print(f"Best sample at: x1={best['x1']:.3f}, x2={best['x2']:.3f}")
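As an optional sanity check (not part of the original script), one can re-evaluate the objective at the reported best sample. The Rastrigin function has its global minimum f(0, 0) = 0, which lies inside the circular domain, so the printed value should be close to zero.

# Optional sanity check: objective value at the best sample found
x1, x2 = best['x1'], best['x2']
value = 10*2 + (x1**2 - 10*np.cos(2*np.pi*x1)) + (x2**2 - 10*np.cos(2*np.pi*x2))
print(f"Objective at best sample: {value:.3e}")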