ScipyMinimizer
Contents
- Purpose:
The purpose of the driver.
- Tutorials:
Tutorials demonstrating the application of this driver.
- Driver Interface:
Driver-specific methods of the Python interface.
- Configuration:
Configuration of the driver.
Purpose
The driver minimizes a scalar function of one or more variables
based on the scipy function scipy.optimize.minimize.
It is suited for the local minimization of inexpensive functions.
To search for the global minimum, the driver
can run local minimizations starting from multiple initial points
in parallel. A global minimization with this driver is advisable if the gradient
of the inexpensive scalar function is known.
Available optimization methods are Nelder-Mead, L-BFGS-B, and SLSQP, which are
described in more detail in the reference of scipy.optimize.minimize.
Each method has its own advantages:
- Nelder-Mead:
A robust method that does not require gradient information.
- L-BFGS-B:
A robust method, especially suited for problems with known gradient.
- SLSQP:
A derivative-based method especially suited for problems with known gradient and inequality constraints defined when creating the study (see create_study()).
Alternative drivers for the global minimization of inexpensive functions are DifferentialEvolution and CMAES. The ScipyMinimizer is recommended if:
- Gradient information is available.
- The objective function has a small number of well-separated local minima. In this case, starting from multiple initial points has a good chance of finding all local minima, including the global minimum.
For expensive objectives with evaluation times larger than a few seconds, the ActiveLearning driver is recommended.
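The multi-start strategy described above can be sketched with plain scipy. The following is a simplified illustration of the idea (a toy objective and hand-rolled restart loop, not the driver's internal implementation):

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective with several local minima, standing in for an
# inexpensive user-defined function.
def f(x):
    return np.sin(3 * x[0]) + 0.1 * x[0] ** 2

bounds = [(-4.0, 4.0)]

# Run independent local minimizations from several initial points
# spread over the domain and keep the best result.
starts = np.linspace(-4.0, 4.0, 8)
results = [
    minimize(f, x0=[x0], method="L-BFGS-B", bounds=bounds)
    for x0 in starts
]
best = min(results, key=lambda r: r.fun)
print(best.x, best.fun)  # global minimum near x = -0.51
```

Each local run is cheap, so covering the domain with multiple starting points is an effective way to locate the global minimum when the local minima are well separated.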
Tutorials
Driver Interface
The driver instance can be obtained by Study.driver.
- class jcmoptimizer.ScipyMinimizer(host, study_id, session)[source]
This class provides methods for retrieving information about the result of the minimization using scipy.optimize.minimize.
- property best_sample: dict[str, float | int | str]
Best sample with minimal objective value found during the minimization. Example:
for key, value in driver.best_sample.items():
    print(f"{key} = {value}")
- describe()
Get description of all modules and their parameters that are used by the driver. Example:
description = driver.describe()
print(description["members"]["surrogates"]["0"])
- Return type:
dict[str, Any]
- Returns: A nested dictionary with descriptions of submodules, each consisting
of a name and a descriptive text. If an entry describes a module, it has an additional
"members" entry with dictionaries describing its submodules and parameters.
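Assuming the nested structure described above (a name, a descriptive text, and an optional "members" entry), such a description can be walked recursively to list the dot-separated paths of all entries. The dictionary below is a made-up stand-in, not actual driver output:

```python
def walk(description, prefix=""):
    """Collect dot-separated paths of all entries in a nested description."""
    name = prefix + description.get("name", "?")
    paths = [name]
    # Recurse into submodules listed under the "members" entry, if any.
    for member in description.get("members", {}).values():
        paths.extend(walk(member, name + "."))
    return paths

# Hypothetical description dictionary mimicking the documented structure.
example = {
    "name": "driver",
    "members": {
        "optimizer": {
            "name": "optimizer",
            "members": {"method": {"name": "method"}},
        },
        "best_sample": {"name": "best_sample"},
    },
}
print(walk(example))
```

The resulting paths have the same dot-separated form accepted by get_state() and historic_parameter_values().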
- get_state(path=None)
Get state of the driver. Example:
best_sample = driver.get_state(path="best_sample")
- Parameters:
path (Optional[str]) – A dot-separated path to a submodule or parameter. If None, the full state is returned.
- Return type:
dict[str, Any]
Returns: If path is None, a dictionary with information about the driver state.
Note
A description of the meaning of each entry in the state can be retrieved by
describe()
.
- historic_parameter_values(path)
Get the values of an internal parameter for each iteration of the study. Example:
min_objective_values = driver.historic_parameter_values(
    path="acquisition_function.min_objective")
- Parameters:
path (str) – A dot-separated path to the parameter.
- Return type:
list[Any]
Note
A description of the meaning of each parameter can be retrieved by
describe()
.
- property min_objective: float
Minimal objective value found during the minimization. Example:
min_objective = driver.min_objective
Configuration
The configuration parameters can be set by calling, e.g.
study.configure(example_parameter1 = [1,2,3], example_parameter2 = True)
The driver allows running num_initial minimizations starting from multiple initial samples in parallel. Only continuous design parameters are minimized, while discrete and categorical parameters are fixed to the values of the initial samples.
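For instance, a parallel multi-start minimization with user-supplied gradients could be configured as follows (the parameter values are illustrative, not recommendations):

```python
# Illustrative configuration: four parallel L-BFGS-B minimizations
# starting from Sobol-distributed initial samples, using user-supplied
# gradients and at most 200 evaluations of the studied system.
study.configure(
    method="L-BFGS-B",
    num_initial=4,
    num_parallel=4,
    jac=True,
    sobol_sequence=True,
    max_iter=200,
)
```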
max_iter (int)
Maximum number of evaluations of the studied system.
Default: Infinite number of evaluations.
max_time (float)
Maximum run time of the study in seconds. The time is counted from the moment the parameter is set or reset.
Default:
inf
num_parallel (int)
Number of parallel evaluations of the studied system.
Default:
1
min_val (float)
The minimization is stopped when the observed objective value is below the specified minimum value.
Default:
-inf
min_step (float)
The minimization is stopped when the Euclidean distance between consecutive sampling points in the design parameter space is below the specified value.
Default:
0.0
num_initial (int)
Number of independent initial optimizers.
Default:
1
max_num_optimizers (int)
If an optimizer has converged, it is restarted at another position. If
max_num_optimizers have converged, the optimization is stopped.
Default: Infinite number of optimizers.
initial_samples (list[list])
List of initial samples taken as starting points. If
num_initial > len(initial_samples), the remaining starting points are chosen randomly.
Default: []
sobol_sequence (bool)
If true, all initial samples are taken from a Sobol sequence. This typically improves the coverage of the parameter space.
Default:
True
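The effect of a Sobol sequence can be seen with scipy's quasi-Monte Carlo module; the snippet below illustrates the concept of drawing well-spread initial samples over a design space (the bounds are made up for illustration, and this is not the driver's internal code):

```python
from scipy.stats import qmc

# Draw 8 quasi-random initial samples in a 2D design space
# with bounds [0, 1] x [0, 2].
sampler = qmc.Sobol(d=2, scramble=False)
unit_samples = sampler.random(n=8)  # points in the unit square
samples = qmc.scale(unit_samples, l_bounds=[0.0, 0.0], u_bounds=[1.0, 2.0])
print(samples)
```

Compared to independent uniform draws, the Sobol points avoid clusters and gaps, which is why they typically improve the coverage of the parameter space.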
method (str)
The name of the optimization method.
Default:
'L-BFGS-B'
Choices: 'Nelder-Mead', 'L-BFGS-B', 'SLSQP'.
jac (bool)
If true, the gradient is used for optimization. This option is ignored for Nelder-Mead, which does not use gradient information.
Default:
False
Example
If set to true, the full gradient must be added to the observations. That is, for each continuous parameter one has to call:
observation.add(value=deriv_value, derivative='param_name')
step_size (float)
Step size used for the numerical approximation of the gradient. This is used by the gradient-based methods L-BFGS-B and SLSQP if no gradient information is available (jac=False).
Default: 0.0001
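Such a numerical approximation works roughly like the forward-difference sketch below, where h plays the role of step_size (scipy's actual implementation may differ in detail):

```python
import numpy as np

def approx_gradient(f, x, h=1e-4):
    """Forward-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h  # perturb one coordinate by the step size
        grad[i] = (f(x + step) - f(x)) / h
    return grad

g = approx_gradient(lambda x: x[0] ** 2 + 3 * x[1], [1.0, 2.0])
print(g)  # close to the exact gradient [2, 3]
```

The approximation error shrinks with h, but a too-small step size amplifies numerical noise in the objective, so the default of 1e-4 is a compromise.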
ftol (float)
Precision goal for the value of the objective function in the stopping criterion.
Default:
2.2e-09
simplex_size (float)
Size of the initial simplex in proportion to the domain size. Applies only to the Nelder-Mead method (method='Nelder-Mead').
Default: 0.1