Kaggle Kernels allow users to run a Python Notebook in the cloud against our competitions and datasets without having to download data or set up their environment.
This repository includes our Dockerfiles for building the CPU-only and GPU images that run Python Kernels on Kaggle.
Our Python Docker images are stored on Google Container Registry at:
- CPU-only: gcr.io/kaggle-images/python
- GPU: private for now; we will make it public soon.
Note: The base image for the GPU image is our CPU-only image. The gpu.Dockerfile adds a few extra layers to install GPU-related libraries and packages (CUDA, libcudnn, pycuda, etc.) and reinstalls packages that need specific GPU builds (torch, tensorflow, and a few more).
To get started with this image, read our guide to using it yourself, or browse Kaggle Kernels for ideas.
First, evaluate whether installing the package yourself in your own Kernels suits your needs. See guide.
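Before requesting a change to the image, it can help to check whether the package is already available in the environment. A minimal sketch (plain Python, not Kaggle-specific tooling):

```python
import importlib.util

def is_installed(name: str) -> bool:
    """Return True if a module can be imported in the current environment."""
    return importlib.util.find_spec(name) is not None

print(is_installed("json"))  # stdlib module, always present
```

If the package is missing, installing it at runtime in your own Kernel may be enough for your use case.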
If the first step above doesn't work for your use case, open an issue or a pull request:
- Update the Dockerfile:
  - For changes specific to the GPU image, update the gpu.Dockerfile.
  - Otherwise, update the Dockerfile.
- Follow the instructions below to build a new image.
- Add tests for your new package. See this example.
- Follow the instructions below to test the new image.
- Open a PR on this repo and you are all set!
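A package test can be as small as importing the package and exercising one function. A minimal sketch in unittest style, using the stdlib json module as a stand-in for the package you added (the actual tests live under the /tests folder):

```python
import json
import unittest

class TestNewPackage(unittest.TestCase):
    # Stand-in test: replace json with the package added to the image.
    def test_roundtrip(self):
        self.assertEqual(json.loads(json.dumps({"a": 1})), {"a": 1})

if __name__ == "__main__":
    unittest.main()
```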
To build a new image, run:

```
./build
```

Flags:

- `--gpu` to build an image for GPU.
- `--use-cache` for faster iterative builds.
A suite of tests can be found in the `/tests` folder. You can run them with:

```
./test
```

Flags:

- `--gpu` to test the GPU image.
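When testing the GPU image, it can also help to confirm that a GPU-enabled framework actually sees a device. A hedged sketch assuming torch may be installed (falls back to False when it is not):

```python
def gpu_available() -> bool:
    """Best-effort check for a usable GPU; assumes torch may be installed."""
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False

print(gpu_available())
```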
We build TensorFlow from source mainly for two reasons:

- Better performance: building from source lets us leverage CPU-specific optimizations.
- GPU support: TensorFlow with GPU support must be built from source.
The Dockerfile and the instructions can be found in the tensorflow-whl/ folder.