Release Notes

3.0.2 [2025-01-29]

This is a patch release with optimizations and bug fixes.

Optimizations

  • Client(): The Client constructor can now be used without a server host or server ID. In this case, it checks whether a running cloud server is available to connect to; if not, a new server is started automatically.
  • Client.create_study(): Two new arguments, extend_on_request and extend_on_suggestion, automatically extend the runtime of a cloud-based server on each new request and each newly retrieved suggestion. This makes it possible to select a shutdown time in the near future and to extend the server runtime only when required (see the sketch after this list).
  • Configuration: The configuration file (~/.jcmoptimizer.conf.yml) allows configuring a default server runtime when starting a server via the Python or Matlab interface, as well as default values for extend_on_request and extend_on_suggestion.
  • Surrogates: A check was added that neither design nor environment values contain NaNs.
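
A minimal sketch of how these options combine (other create_study arguments such as design_space and driver are elided; the study name is a placeholder):

from jcmoptimizer import Client

# Without a server host or server ID, the client connects to a running
# cloud server or starts a new one automatically.
client = Client()

study = client.create_study(
    # ... design_space, driver, study_id, etc. ...
    study_name="My study",
    extend_on_request=True,     # extend the server runtime on each request
    extend_on_suggestion=True,  # extend on each retrieved suggestion
)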

Bug Fixes

  • Python interface: Due to the use of a modern union type annotation (state: str | None), the client was incompatible with Python 3.9. Union[] and Optional[] from the typing module are now used everywhere (see the example after this list).
  • Matlab interface: It was not possible to start a local server using the Server() constructor. The error was fixed.
  • Neural Network Ensemble: Fixed an issue with the weight prior when using sinusoidal activations that prevented a good hyperparameter optimization.
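
For illustration, the two annotation styles are equivalent; only the typing-based form works on Python 3.9 (the function shown is purely illustrative):

from typing import Optional

# Python 3.10+ syntax, incompatible with Python 3.9:
#     def set_state(state: str | None = None) -> None: ...

# Equivalent annotation that also works on Python 3.9:
def set_state(state: Optional[str] = None) -> None:
    ...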

3.0.0 [2025-12-27]

This is a major upgrade of JCMoptimizer, which now supports cloud-based operation. To support both local and cloud-based operation, the installation process has been extended. Please refer to the Python or Matlab installation guides.

New Features

Cloud Interface

The cloud-based operation adds a cloud interface. Here, you can

  • Add and edit projects. Each project can contain multiple studies.
  • Rename, copy, move, delete, download, and upload studies.
  • Add and delete API access tokens.
  • Start and shut down cloud-based JCMoptimizer servers. This can also be done directly from the Python and Matlab clients.

Python/Matlab clients

  • The Server() constructor has been extended to support both a local and cloud-based operation (see the documentation of the Server interface).
  • The Client() constructor has been extended to support both a local and cloud-based operation (see the documentation of the Client interface).
  • When creating a new study, a project ID and project name can be specified with client.create_study(..., project_name='My project', project_id='project_01') (see the sketch after this list).
  • The study data can be cleared by calling client.create_study(..., clear_storage=True).
  • The Python client jcmoptimizer is published on PyPI and can be installed by running, e.g., pip install jcmoptimizer.
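
Given an existing client instance, a minimal sketch combining these options (other create_study arguments such as design_space and driver are elided; project and study names are placeholders):

study = client.create_study(
    # ... design_space, driver, study_id, etc. ...
    study_name="Standard Bayesian optimization",
    project_name="My project",   # assign the study to a project
    project_id="project_01",
    clear_storage=True,          # clear any previously stored study data
)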

Minimal Required Update Steps

  • Clients: When creating a new study via client.create_study(...), the name argument must be renamed to study_name. This renaming avoids confusion with the new argument project_name (see the sketch below).
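
For example (the study name is a placeholder, other arguments are elided):

# Up to v2.x:
study = client.create_study(..., name="Standard Bayesian optimization")

# From v3.0.0 on:
study = client.create_study(..., study_name="Standard Bayesian optimization")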

Optimizations

  • Neural Network Ensemble: Rational activation functions have been integrated into the neural network ensemble.

Bug Fixes

  • study.add_many() led to uninformative errors if the design space data points had the wrong dimension. An error message reporting the wrong dimensionality is now shown.
  • Sobol' sampling in high dimensions led to an error. Random samples are now generated as a fallback.

2.1.3 [2025-09-23]

This is a patch release with optimizations and bug fixes.

Optimizations

  • Neural Network Ensemble: Internal benchmarks show that the hyperparameter optimization of neural network ensembles is usually not beneficial for networks with three or more hidden layers and more than around 600 data points due to numerical instabilities. The user is now warned if a hyperparameter optimization is performed under these circumstances.

Bug Fixes

  • Gaussian Process: If single-output or multi-output Gaussian processes were trained on data that was constant in one or more outputs, the prediction failed due to a normalization with respect to the range of the function values. This edge case is now treated explicitly, avoiding prediction failures.
  • Matlab client: The startup of the server (server = jcmoptimizer.Server()) failed when the server displayed a warning about a soon-expiring license. This case is now handled correctly.

2.1.2 [2025-09-09]

This is a patch release with optimizations and bug fixes.

Optimizations

  • Installer: The installer uses the package manager uv to speed up the installation process.

Bug Fixes

  • Neural Network Ensemble: Due to a cyclic reference, the memory of neural network ensembles could not be completely released when a study was deleted. This could lead to a large increase of memory consumption over time. The cyclic reference was removed leading to significantly smaller memory footprints.

2.1.1 [2025-07-11]

This is a patch release with optimizations and bug fixes.

Optimizations

  • Study.add_many(): The time required to register many data points at once is significantly reduced by caching.
  • Dashboard: The version of the bokeh package has been updated. This adds a context menu for interacting with figures.
  • Mean value: The property parameter of the mean function of a surrogate was given in units of the normalized and potentially warped training values. It has been replaced by the avg property parameter of the surrogate, which is neither normalized nor warped.

Bug Fixes

  • Hyperparameter optimization: Automatic hyperparameter optimization was run in the background in an independent thread. In rare cases, this could lead to race conditions. The optimization is now run asynchronously in the main thread.
  • ActiveLearning Driver: In some cases, the trained neural network was influenced by an update of the minimum and maximum of the training data, which are used to normalize the training data. This could lead to prediction errors. The minimum and maximum used for the normalization are now frozen after training.

2.1.0 [2025-06-05]

This is a minor update release. It introduces features that enable the training of neural network surrogates to be more robust. Furthermore, the state of a surrogate (e.g. the trained weights of a network) is now saved between runs of a study.

New Features

  • Neural Network Ensemble: This release introduces hyperparameters that scale the weights of the neural network ensemble before training. For networks with one hidden layer, these hyperparameters correspond to the length scales of a Gaussian process. For networks with more than one hidden layer, these hyperparameters are still referred to as length scales, even though there is no longer a strict mathematical equivalence. Tuning the hyperparameters improves network training and prevents over- or underfitting to some extent. Furthermore, it is now often possible to use networks with one or two hidden layers to learn data that would otherwise require deeper neural networks. This also improves the training speed.
  • Surrogates: Training the surrogates on many data points is computationally demanding. Previously, if a study was restarted at a later point, the surrogates had to be retrained: for Gaussian processes this involves a Cholesky decomposition of the covariance matrix, and for neural network ensembles the weights have to be trained using gradient descent. To avoid this retraining step, the state of the surrogates is now stored between runs of the study in the file {save_dir}/{study_id}.jcmb.

Optimizations

  • Surrogates: In many cases, training multi-output surrogates resulted in an ill-conditioned hyperparameter matrix for the covariance of the outputs. A new tolerance parameter, tolerance_hypercov, has been introduced to stabilize the hyperparameter covariance matrix by adding a small constant to its diagonal.
  • Neural Network Ensemble: To reduce the training effort for derivative observations, the average of the neural network ensemble is trained to match observed gradients. This causes a loss of accuracy, because each ensemble member's gradient may differ from the observation. Setting train_mean_grad=False now enables each ensemble member to be trained to match the exact gradient, at the cost of much longer computation times for large ensembles (see the sketch after this list).
  • Client-server communication: The maximum size of training data that can be transmitted to the server, for example using the study.add_many() function, has now been increased to 1024 MB. Previously, the limit was 100 MB.
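
A minimal configuration sketch; passing these options via study.configure() is an assumption made here for illustration, and the numeric value is a placeholder, so please consult the surrogate and driver documentation for the exact parameter location:

# Hypothetical placement of the new parameters (only the parameter names
# tolerance_hypercov and train_mean_grad are taken from this release):
study.configure(
    tolerance_hypercov=1e-6,  # small constant added to the diagonal of the
                              # hyperparameter covariance matrix
    train_mean_grad=False,    # train each ensemble member on exact gradients
)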

Bug Fixes

  • ActiveLearning Driver: The state of the driver hyperparameters was not saved correctly after the user initiated a hyperparameter optimization via the study.driver.optimize_hyperparameters() function.
  • MCMC sampling: In some cases, out-of-domain samples were not assigned a negative infinite log likelihood because they were mapped back to the domain. This could affect the quality of the MCMC sampling.
  • Matlab interface: study.get_observation_data() did not work correctly for vectorial observations.

2.0.5 [2025-03-07]

This is a patch release that fixes some issues of the installer. The fix introduced in v2.0.2 to use certificate authority files from the default system store led to problems in some environments.

2.0.4 [2025-02-28]

This is a patch release that fixes some issues of the ActiveLearning driver and the installer.

Optimizations

  • ActiveLearning driver: The Fit variable now introduces prior uncertainties for the fit parameters, improving the numerical stability of the fit result.

Bug Fixes

  • ActiveLearning driver: The LinearCombination variable did not work for a combination of single-input posteriors.
  • ActiveLearning driver: Variables that are not used in any objective could have a wrong number of Monte-Carlo samples.
  • ActiveLearning driver: study.driver.get_minima() did not work for 1D minimization problems.
  • The license check did not work for some JCMlicenseServer.cfg files.

2.0.3 [2025-01-30]

This is a patch release that fixes some issues of the ScipyMinimizer and the BayesianReconstruction driver.

Optimizations

  • The traceback of failed sample computations is now always written to the log, even if a second attempt is performed.

Bug Fixes

  • ScipyMinimizer driver with method Nelder-Mead: When continuing a previous study, a new simplex was determined at the latest sample. The simplex is now reconstructed from the previous study run, leading to a better continuation.
  • BayesianReconstruction driver: The sampling strategy was not taking prior information fully into account. This has been fixed.

2.0.2 [2025-01-15]

This is a patch release that fixes some issues with hyperparameter optimization, state retrieval, and installation in environments with proxy servers.

Bug Fixes

  • Hyperparameter optimization is a computationally expensive process. For very large datasets, this could leave the server unresponsive for long periods, so that it appeared to be down. Hyperparameter optimization now hands control back to the main thread of the server more often, keeping it responsive while the optimization is running.
  • Fixed the behavior that a call to study.get_state(path) with a partial path (e.g. path='suggestion') failed (see the example after this list).
  • The installer uses the certificate authority files from the default system store instead of the bundled ones.
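
For example, partial state retrieval now works as expected (the variable name is only illustrative):

# Retrieve only the 'suggestion' part of the study state
suggestion_state = study.get_state(path='suggestion')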

2.0.1 [2024-12-31]

This patch release improves the compatibility of the installer and the installation for different environments.

Bug Fixes

  • The installer did not work if conda channels were set via a .condarc file. It now forces the use of the conda-forge channel to install Python packages.
  • The installer had issues with SSL connections. It now uses the more up-to-date requests package to download micromamba.
  • Libraries are compiled against glibc 2.29 for better compatibility with older Linux distributions.

2.0.0 [2024-11-15]

This is the first release of the standalone JCMoptimizer. It is a major upgrade of the optimizer previously shipped with JCMsuite, based on a new implementation. Scripts using the old JCMsuite optimizer have to be adapted.

Excerpt of New Features

  • New driver ActiveLearning. The driver defines a general active learning process that can be configured for many different purposes, such as optimization, multi-objective optimization, integration, or learning the behavior of one or more expensive black-box functions.
  • New driver BayesianReconstruction. This is an extension of the BayesianLeastSquares driver used to fit model parameters. The new driver makes it possible to maximize the full posterior probability, including an error model and a prior distribution for all parameters.
  • New surrogate model Neural Network Ensemble that uses an ensemble of neural networks to predict posterior distributions.
  • Surrogates can be trained not only on design parameters but also on externally defined environment parameters.
  • The output of surrogates can be transformed by means of variables, which can also be concatenated.
  • Besides a minimization objective, it is possible to define outcome constraint objectives that try to constrain the value of a variable within a one-sided or two-sided interval.

Minimal Required Update Steps

To create a client instance, use the interface of the new optimizer. For example, in Python use

from jcmoptimizer import Server, Client, Study, Observation
server = Server()
client = Client(server.host)

The arguments for creating a new Study instance have changed. For example, the domain argument is now called design_space in order to distinguish it from parameters belonging to the environment:

# Definition of the search domain
design_space = [
    {'name': 'x1', 'type': 'continuous', 'domain': (-1.5,1.5)}, 
    {'name': 'x2', 'type': 'continuous', 'domain': (-1.5,1.5)},
]

# Definition of a fixed environment parameter
environment = [
    {'name': 'radius', 'type': 'fixed', 'domain': 1.5},
]

# Definition of a constraint on the search domain
constraints = [
    {'name': 'circle', 'expression': 'sqrt(x1^2 + x2^2) <= radius'}
]

# Creation of the study object with study_id 'vanilla_bayesian_optimization'
study = client.create_study(
    design_space=design_space,
    environment=environment,
    constraints=constraints,
    driver="BayesianOptimization",
    name="Standard Bayesian optimization",
    study_id="vanilla_bayesian_optimization"
)

The objective function is replaced by an evaluator function in order to clarify that the output does not need to be the objective value itself; it can also consist of physical outputs that are mapped to the final objective.

import time
import numpy as np

# Evaluation of the black-box function for specified design parameters
def evaluate(study: Study, x1: float, x2: float, radius: float) -> Observation:
    time.sleep(2)  # make the objective expensive
    observation = study.new_observation()
    # 2D Rastrigin function (the environment parameter radius is not used here)
    observation.add(10*2
        + (x1**2 - 10*np.cos(2*np.pi*x1))
        + (x2**2 - 10*np.cos(2*np.pi*x2))
    )
    return observation

# Run the minimization
study.set_evaluator(evaluate)
study.run()

  • The method study.set_parameters() is replaced by the method study.configure().

  • Many methods of the Study instance are replaced by other methods with different arguments and outputs, for example (see also the call sketch after this list):

  • study.info(): The new method study.get_state() outputs the state of the Study as a nested dictionary.
  • study.driver_info(): The new method study.driver.get_state() outputs the state of the study driver as a nested dictionary.
  • study.get_data_table(): The new method study.get_observation_data() outputs all data of observations added to the study.
  • study.get_minima(): The method is now part of the driver interface, study.driver.get_minima().
  • study.get_statistics(): The method is now part of the driver interface, study.driver.get_statistics().
  • study.optimize_hyperparameters(): The method is now part of the driver interface, study.driver.optimize_hyperparameters().
  • study.predict(): The method is now part of the driver interface, study.driver.predict().
  • study.run_mcmc(): The method is now part of the driver interface, study.driver.run_mcmc().
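
A short call sketch of the renamed methods (return values are nested dictionaries or observation data, as described above):

state = study.get_state()                # formerly study.info()
driver_state = study.driver.get_state()  # formerly study.driver_info()
data = study.get_observation_data()      # formerly study.get_data_table()
minima = study.driver.get_minima()       # formerly study.get_minima()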

Please consult the documentation for the description of the arguments and outputs of the new methods.