Global minimization of an inexpensive scalar function with discrete parameters
The goal of this study is to minimize a scalar function that is inexpensive to evaluate (i.e. evaluation time shorter than a second) and whose derivatives are not known. In this case a heuristic global optimization method is advisable. CMA-ES (covariance matrix adaptation evolution strategy) is an evolutionary optimization algorithm that draws its population members from a multivariate Gaussian distribution.
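To give an idea of how such population members are generated, the following minimal sketch draws one generation of candidates from a multivariate Gaussian with NumPy. The mean m, step size sigma, and covariance C are illustrative placeholders, not the internal state of the CMAES driver used below.

import numpy as np

# Illustrative CMA-ES sampling step (not the jcmoptimizer internals):
# candidates are drawn from N(m, sigma^2 * C) around the current mean m.
rng = np.random.default_rng(0)
m = np.zeros(2)                   # current mean of the search distribution (assumed)
sigma = 0.5                       # global step size (assumed)
C = np.array([[1.0, 0.3],         # covariance matrix, adapted between generations
              [0.3, 1.0]])
population = rng.multivariate_normal(mean=m, cov=sigma**2 * C, size=6)
print(population.shape)           # (6, 2): six candidate points in two dimensions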
As an example, a 3D Rastrigin-like function is minimized on a mixed domain with a circular constraint on the continuous parameters and a discrete third parameter:
\[ \begin{aligned} &\text{min.} && f(x_1,x_2,x_3) = 3\cdot 10 + \sum_{i=1}^{3} \left(x_i^2 - 10\cos(2\pi x_i)\right)\\ &\text{s.t.} && \sqrt{x_1^2 + x_2^2} \leq 1.5 \;\wedge\; x_3 \in \{-1,0,1\}. \end{aligned} \]
import sys, os
import numpy as np
import time

from jcmoptimizer import Server, Client, Study, Observation
server = Server()
client = Client(host=server.host)

# Definition of the search domain
design_space = [
    {'name': 'x1', 'type': 'continuous', 'domain': (-1.5, 1.5)},
    {'name': 'x2', 'type': 'continuous', 'domain': (-1.5, 1.5)},
    {'name': 'x3', 'type': 'discrete', 'domain': (-1.0, 0.0, 1.0)},
]

# Definition of a fixed environment parameter
environment = [
    {'name': 'radius', 'type': 'fixed', 'domain': 1.5},
]

# Definition of a constraint on the search domain
constraints = [
    {'name': 'circle', 'expression': 'sqrt(x1^2 + x2^2) <= radius'}
]

# Creation of the study object with study_id 'cma_es'
study = client.create_study(
    design_space=design_space,
    environment=environment,
    constraints=constraints,
    driver="CMAES",
    study_name="Global minimization of an inexpensive scalar function with discrete parameters",
    study_id="cma_es"
)
# Configure the study: at most 80 iterations with 2 parallel evaluations
study.configure(max_iter=80, num_parallel=2)

# Evaluation of the black-box function for specified design parameters
def evaluate(study: Study, x1: float, x2: float, x3: float, radius: float) -> Observation:
    observation = study.new_observation()
    observation.add(10*3
        + (x1**2 - 10*np.cos(2*np.pi*x1))
        + (x2**2 - 10*np.cos(2*np.pi*x2))
        + (x3**2 - 10*np.cos(2*np.pi*x3))
    )
    return observation

# Run the minimization
study.set_evaluator(evaluate)
study.run()

best = study.driver.best_sample
print(f"Best sample at: x1={best['x1']:.3f}, x2={best['x2']:.3f}, x3={best['x3']:.1f}")
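As a quick sanity check of the objective itself, independent of the optimizer and of the jcmoptimizer API, the Rastrigin-like function above attains its global minimum of 0 at x1 = x2 = x3 = 0, a point that satisfies both the circular constraint and the discrete domain of x3. A plain NumPy version of the objective confirms this:

import numpy as np

def rastrigin3(x1: float, x2: float, x3: float) -> float:
    # Same objective as in evaluate() above, written as a standalone function
    return 10*3 + sum(xi**2 - 10*np.cos(2*np.pi*xi) for xi in (x1, x2, x3))

print(rastrigin3(0.0, 0.0, 0.0))  # 0.0 -- the value the study should approach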