.. _ScipyMinimizer:

================================================
ScipyMinimizer
================================================

Contents
=========

:`Purpose`_: The purpose of the driver.
:`Tutorials`_: Tutorials demonstrating the application of this driver.
:`Driver Interface`_: Driver-specific methods of the Python interface.
:`Configuration`_: Configuration of the driver.

Purpose
=======

The driver minimizes a scalar function of one or more variables based on the
`scipy` function ``scipy.optimize.minimize``. It is suited for the local
minimization of inexpensive functions. To search for the global minimum, the
driver can run local minimizations starting from multiple initial points in
parallel. Using this driver for global minimization is advisable if the
gradient of the inexpensive objective function is known.

Available optimization methods are Nelder-Mead, L-BFGS-B, and SLSQP, which are
described in more detail in the
`reference <https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html>`_
of ``scipy.optimize.minimize``. Each method has its own advantages:

:Nelder-Mead: A robust method that does not require gradient information.
:L-BFGS-B: A robust method, especially suited for problems with known
  gradient.
:SLSQP: A derivative-based method, especially suited for problems with known
  gradient and inequality constraints defined when creating the study
  (see :func:`~jcmoptimizer.Client.create_study`).

Alternative drivers for the global minimization of **inexpensive** functions
are :ref:`DifferentialEvolution` and :ref:`CMAES`. The ScipyMinimizer is
recommended if:

* Gradient information is available.
* The objective function has a small number of well-separated local minima.
  In this case, starting from multiple initial points has a good chance of
  finding all local minima, including the global minimum.

For **expensive** objectives with evaluation times larger than a few seconds,
the :ref:`ActiveLearning` driver is recommended.

Tutorials
=========

.. toctree::

   ../tutorials/scipy_minimization
   ../tutorials/benchmark

Driver Interface
================

The driver instance can be obtained by :attr:`.Study.driver`.

.. currentmodule:: jcmoptimizer
.. autoclass:: ScipyMinimizer
   :members:
   :inherited-members:

Configuration
=============

The configuration parameters can be set by calling, e.g.

.. code-block:: python

    study.configure(example_parameter1=[1, 2, 3], example_parameter2=True)

The driver can run :ref:`num_initial <ScipyMinimizer.num_initial>`
minimizations starting from multiple
:ref:`initial samples <ScipyMinimizer.initial_samples>` in parallel. Only
continuous design parameters are minimized, while discrete and categorical
parameters are fixed to the values of the initial samples.
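For instance, a gradient-based multi-start minimization could be configured as
follows. This is only a sketch using the parameters documented below; the
values are illustrative and should be adapted to the problem at hand.

.. code-block:: python

    # Illustrative configuration: four L-BFGS-B optimizers run in
    # parallel, each consuming gradient observations, with an overall
    # budget of 200 evaluations.
    study.configure(
        method="L-BFGS-B",  # gradient-based local minimizer
        jac=True,           # gradients are supplied with the observations
        num_initial=4,      # four independent initial points
        num_parallel=4,     # evaluate all four optimizers in parallel
        max_iter=200,       # overall evaluation budget
    )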
.. _ScipyMinimizer.max_iter:

max_iter (int)
""""""""""""""
Maximum number of evaluations of the studied system.

Default: Infinite number of evaluations.

.. _ScipyMinimizer.max_time:

max_time (float)
""""""""""""""""
Maximum run time of the study in seconds. The time is counted from the moment
the parameter is set or reset.

Default: ``inf``

.. _ScipyMinimizer.num_parallel:

num_parallel (int)
""""""""""""""""""
Number of parallel evaluations of the studied system.

Default: ``1``

.. _ScipyMinimizer.min_val:

min_val (float)
"""""""""""""""
The minimization is stopped when the observed objective value is below the
specified minimum value.

Default: ``-inf``

.. _ScipyMinimizer.min_step:

min_step (float)
""""""""""""""""
The minimization is stopped when the Euclidean distance between consecutive
sampling points in the design parameter space is below the specified value.

Default: ``0.0``

.. _ScipyMinimizer.num_initial:

num_initial (int)
"""""""""""""""""
Number of independent initial optimizers.

Default: ``1``

.. _ScipyMinimizer.max_num_optimizers:

max_num_optimizers (int)
""""""""""""""""""""""""
If an optimizer has converged, it is restarted at another position. If
``max_num_optimizers`` have converged, the optimization is stopped.

Default: Infinite number of optimizers.

.. _ScipyMinimizer.initial_samples:

initial_samples (list[list])
""""""""""""""""""""""""""""
List of initial samples, each with the dimension of the design space. The role
of the initial samples is twofold. First, they are used as initial guesses for
the local minimizations. Second, if the design space contains discrete or
categorical parameters, their fixed values can be specified for each optimizer
by the corresponding initial sample. If ``num_initial > len(initial_samples)``,
the remaining initial points are chosen randomly.

Default: ``[]``

.. _ScipyMinimizer.sobol_sequence:

sobol_sequence (bool)
"""""""""""""""""""""
If true, all initial samples are taken from a Sobol sequence. This typically
improves the coverage of the parameter space.

Default: ``True``

.. _ScipyMinimizer.method:

method (str)
""""""""""""
The name of the optimization method.

Default: ``'L-BFGS-B'``

Choices: ``'Nelder-Mead'``, ``'L-BFGS-B'``, ``'SLSQP'``.

.. _ScipyMinimizer.jac:

jac (bool)
""""""""""
If true, the gradient is used for optimization. This option is ignored for
Nelder-Mead, which does not use gradient information.

Default: ``False``

.. admonition:: Example

   If set to true, the full gradient must be added to the observations. That
   is, for each continuous parameter one has to call:

   .. code-block:: python

      observation.add(value=deriv_value, derivative='param_name')

.. _ScipyMinimizer.step_size:

step_size (float)
"""""""""""""""""
Step size used for the numerical approximation of the gradient. It is used by
the gradient-based methods L-BFGS-B and SLSQP if no gradient information is
available (``jac=False``).

Default: ``0.0001``

.. _ScipyMinimizer.ftol:

ftol (float)
""""""""""""
Precision goal for the value of the objective function in the stopping
criterion.

Default: ``2.2e-09``

.. _ScipyMinimizer.simplex_size:

simplex_size (float)
""""""""""""""""""""
Size of the initial simplex in proportion to the domain size. Applies only to
the Nelder-Mead method (``method='Nelder-Mead'``).

Default: ``0.1``
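As a closing illustration, several of the options above can be combined to set
up a multi-start search. The sketch below assumes a hypothetical design space
with two continuous parameters followed by one categorical parameter; all
values are illustrative.

.. code-block:: python

    # Illustrative sketch: three independent Nelder-Mead optimizers.
    # The categorical entry of each initial sample stays fixed during
    # the corresponding local minimization.
    study.configure(
        method="Nelder-Mead",
        num_initial=3,
        initial_samples=[
            [0.1, -0.5, "type_a"],  # start of the first optimizer
            [0.8, 0.2, "type_b"],   # start of the second optimizer
        ],                          # the third start is chosen randomly
        sobol_sequence=True,        # random starts come from a Sobol sequence
        min_step=1e-6,              # stop once consecutive steps become tiny
    )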