> "You can make pigs fly", [Kolter & Madry, 2018]
`skwdro` is a Python package that offers WDRO versions of a large range of estimators, either by extending `scikit-learn` estimators or by providing a wrapper for `pytorch` modules.
Have a look at the `skwdro` documentation!
First install `hatch` and clone the repository. In the root folder, `make shell` gives you an interactive shell in the correct environment, and `make test` runs the tests (it can be launched from both an interactive shell and a normal shell). `make reset_env` removes the installed environments (useful in case of trouble).
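The steps above can be collected as follows (the clone URL is an assumption; substitute the actual repository location):

```shell
# Install hatch, then clone the repository (URL assumed)
pip install hatch
git clone https://github.com/iutzeler/skwdro.git
cd skwdro

# Interactive shell in the project environment
make shell

# Run the test suite (from inside or outside the interactive shell)
make test
```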
`skwdro` is now available on PyPI. Run the following command to get the latest version of the package:

```shell
pip install -U skwdro
```
It is also available on anaconda.org and can be installed using, for instance:

```shell
conda install -c conda-forge -c flvincen -c pytorch skwdro
```
Robust estimators from `skwdro` can be used as drop-in replacements for `scikit-learn` estimators (they actually inherit from `scikit-learn` estimator and classifier classes). `skwdro` provides robust estimators for standard problems such as linear regression and logistic regression. `LinearRegression` from `skwdro.linear_model` is a robust version of `LinearRegression` from `scikit-learn` and can be used in the same way. The only difference is that an uncertainty radius `rho` is now required.
We assume that we are given `X_train` of shape `(n_train, n_features)` and `y_train` of shape `(n_train,)` as training data, and `X_test` of shape `(n_test, n_features)` as test data.
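If you want to try this on synthetic data first, here is a minimal sketch (assuming `numpy` is available; the ground-truth weights and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_features = 100, 20, 5

# Arbitrary ground-truth weights for a noisy linear model
w_true = rng.normal(size=n_features)

# Training and test data with the shapes described above
X_train = rng.normal(size=(n_train, n_features))
y_train = X_train @ w_true + 0.1 * rng.normal(size=n_train)
X_test = rng.normal(size=(n_test, n_features))
```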
```python
from skwdro.linear_model import LinearRegression

# Uncertainty radius
rho = 0.1

# Fit the robust model
robust_model = LinearRegression(rho=rho)
robust_model.fit(X_train, y_train)

# Predict the target values
y_pred = robust_model.predict(X_test)
```
You can refer to the documentation to explore the full list of `skwdro`'s ready-made estimators.
Didn't find an estimator that suits you? You can compose your own using the `pytorch` interface: it allows more flexibility, custom models, and custom optimizers.
Assume now that the data is given as a dataloader `train_loader`.
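If you do not already have one, a `train_loader` with the expected batch interface can be built from tensors via `torch.utils.data` (the shapes and batch size here are illustrative):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

n_train, n_features = 100, 5

# Illustrative regression data as tensors
X = torch.randn(n_train, n_features)
y = X @ torch.randn(n_features, 1)

# Yields (batch_x, batch_y) pairs, as consumed by the training loop below
train_loader = DataLoader(TensorDataset(X, y), batch_size=16, shuffle=True)
```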
```python
import torch
import torch.nn as nn
import torch.optim as optim

from skwdro.torch import robustify

# Uncertainty radius
rho = 0.1

# Define the model
model = nn.Linear(n_features, 1)

# Define the loss function
loss_fn = nn.MSELoss()

# Take a sample batch for initialization
sample_batch_x, sample_batch_y = next(iter(train_loader))

# Build the robust loss from the standard one
robust_loss = robustify(loss_fn, model, rho, sample_batch_x, sample_batch_y)

# Define the optimizer
optimizer = optim.Adam(model.parameters(), lr=0.01)

# Training loop
for epoch in range(100):
    for batch_x, batch_y in train_loader:
        optimizer.zero_grad()
        loss = robust_loss(batch_x, batch_y)
        loss.backward()
        optimizer.step()
```
You will find a detailed description of how to `robustify` modules in the documentation.