Make n_optmod option available in Python #161

Merged · 4 commits · May 16, 2024
37 changes: 34 additions & 3 deletions README.md
@@ -15,14 +15,45 @@ Rust toolbox for Efficient Global Optimization algorithms inspired from [SMT](ht
## The Python module

Thanks to the [PyO3 project](https://pyo3.rs), Rust is well suited for building Python extensions.
You can install the Python package using:

### Installation

```bash
pip install egobox
```

See the [tutorial notebooks](https://github.com/relf/egobox/tree/master/doc/README.md) for usage of the optimizer
and mixture of Gaussian processes surrogate model.
### Egor optimizer

```python
import numpy as np
import egobox as egx

# Objective function
def f_obj(x: np.ndarray) -> np.ndarray:
return (x - 3.5) * np.sin((x - 3.5) / (np.pi))

# Minimize f_opt in [0, 25]
res = egx.Egor(egx.to_specs([[0.0, 25.0]]), seed=42).minimize(f_obj, max_iters=20)
print(f"Optimization f={res.y_opt} at {res.x_opt}") # Optimization f=[-15.12510323] at [18.93525454]
```

### Gpx surrogate model

```python
import numpy as np
import egobox as egx

# Training
xtrain = np.array([[0.0, 1.0, 2.0, 3.0, 4.0]]).T
ytrain = np.array([[0.0, 1.0, 1.5, 0.9, 1.0]]).T
gpx = egx.Gpx.builder().fit(xtrain, ytrain)

# Prediction
xtest = np.linspace(0, 4, 20).reshape((-1, 1))
ytest = gpx.predict(xtest)
```

See the [tutorial notebooks](https://github.com/relf/egobox/tree/master/doc/README.md) and [examples folder](https://github.com/relf/egobox/tree/d9db0248199558f23d966796737d7ffa8f5de589/python/egobox/examples) for more information on the usage of the optimizer and mixture of Gaussian processes surrogate model.

## The Rust libraries

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,7 +1,7 @@
[project]
name = "egobox"
classifiers = [
"Development Status :: 4 - Beta",
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Science/Research",
"Intended Audience :: Developers",
"License :: OSI Approved :: Apache Software License",
5 changes: 3 additions & 2 deletions python/egobox/tests/test_egor.py
@@ -5,7 +5,7 @@
import time
import logging

logging.basicConfig(level=logging.DEBUG)
logging.basicConfig(level=logging.INFO)


def xsinx(x: np.ndarray) -> np.ndarray:
@@ -119,6 +119,7 @@ def test_g24(self):
cstr_tol=np.array([1e-3, 1e-3]),
n_cstr=n_cstr,
seed=42,
n_optmod=2,
n_doe=n_doe,
)
start = time.process_time()
@@ -179,4 +180,4 @@ def test_egor_service(self):


if __name__ == "__main__":
unittest.main()
unittest.main(defaultTest=["TestOptimizer.test_g24"], exit=False)
13 changes: 12 additions & 1 deletion src/egor.rs
@@ -101,7 +101,7 @@ pub(crate) fn to_specs(py: Python, xlimits: Vec<Vec<f64>>) -> PyResult<PyObject>
///
/// par_infill_strategy (ParInfillStrategy enum)
/// Parallel infill criteria (aka qEI) to get virtual next promising points in order to allow
/// q parallel evaluations of the function under optimization.
/// q parallel evaluations of the function under optimization (only used when q_points > 1)
/// Can be either ParInfillStrategy.KB (Kriging Believer),
/// ParInfillStrategy.KBLB (KB Lower Bound), ParInfillStrategy.KBUB (KB Upper Bound),
/// ParInfillStrategy.CLMIN (Constant Liar Minimum)
@@ -120,6 +120,12 @@ pub(crate) fn to_specs(py: Python, xlimits: Vec<Vec<f64>>) -> PyResult<PyObject>
/// 10-points addition (should say 'tentative addition' because addition may fail for some points
/// but it is counted anyway).
///
/// n_optmod (int >= 1)
/// Number of iterations between two surrogate model trainings (hyperparameters optimization);
/// otherwise previous hyperparameters are reused. The default value is 1, meaning surrogates are
/// fully trained at each iteration. The value is used as a modulo of the iteration number: for instance,
/// with a value of 3, after the first iteration surrogates are trained at iterations 3, 6, 9, etc.
///
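The modulo rule described in this docstring can be stated in a few lines. An illustrative sketch of the scheduling logic (not the egobox source):

```python
# Illustrative sketch of the n_optmod scheduling rule (not the egobox source).
def should_retrain(iteration: int, n_optmod: int) -> bool:
    # Surrogates are fully retrained when the iteration number is a multiple
    # of n_optmod; in between, previous hyperparameters are reused.
    return iteration % n_optmod == 0

retrained = [i for i in range(1, 10) if should_retrain(i, 3)]
print(retrained)  # [3, 6, 9]
print(all(should_retrain(i, 1) for i in range(1, 10)))  # True: default trains every iteration
```

Raising `n_optmod` trades surrogate accuracy for speed, since hyperparameter optimization is usually the costly step of each iteration.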
/// target (float)
/// Known optimum used as stopping criterion.
///
@@ -148,6 +154,7 @@ pub(crate) struct Egor {
pub infill_optimizer: InfillOptimizer,
pub kpls_dim: Option<usize>,
pub n_clusters: Option<usize>,
pub n_optmod: usize,
pub target: f64,
pub outdir: Option<String>,
pub hot_start: bool,
@@ -184,6 +191,7 @@ impl Egor {
infill_optimizer = InfillOptimizer::Cobyla,
kpls_dim = None,
n_clusters = 1,
n_optmod = 1,
target = f64::NEG_INFINITY,
outdir = None,
hot_start = false,
@@ -206,6 +214,7 @@ impl Egor {
infill_optimizer: InfillOptimizer,
kpls_dim: Option<usize>,
n_clusters: Option<usize>,
n_optmod: usize,
target: f64,
outdir: Option<String>,
hot_start: bool,
@@ -227,6 +236,7 @@ impl Egor {
infill_optimizer,
kpls_dim,
n_clusters,
n_optmod,
target,
outdir,
hot_start,
@@ -442,6 +452,7 @@ impl Egor {
.q_points(self.q_points)
.qei_strategy(qei_strategy)
.infill_optimizer(infill_optimizer)
.n_optmod(self.n_optmod)
.target(self.target)
.hot_start(self.hot_start); // when used as a service no hotstart
if let Some(doe) = doe {