[tune] Put examples under proper version control (ray-project#9427)
Co-authored-by: krfricke <[email protected]>
richardliaw and krfricke authored Jul 14, 2020
1 parent 7abf7a0 commit a567f79
Showing 32 changed files with 199 additions and 41 deletions.
40 changes: 20 additions & 20 deletions doc/source/tune/_tutorials/overview.rst
@@ -148,55 +148,55 @@ If any example is broken, or if you'd like to add an example to this page, feel
General Examples
~~~~~~~~~~~~~~~~

- `async_hyperband_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/async_hyperband_example.py>`__: Example of using a Trainable class with AsyncHyperBandScheduler.
- `hyperband_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperband_example.py>`__: Example of using a Trainable class with HyperBandScheduler. Also uses the Experiment class API for specifying the experiment configuration, as well as the AsyncHyperBandScheduler.
- `pbt_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_example.py>`__: Example of using a Trainable class with the PopulationBasedTraining scheduler.
- `PBT with Function API <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_function.py>`__: Example of using the function API with a PopulationBasedTraining scheduler.
- `pbt_ppo_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_ppo_example.py>`__: Example of optimizing a distributed RLlib algorithm (PPO) with the PopulationBasedTraining scheduler.
- `logging_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/logging_example.py>`__: Example of custom loggers and custom trial directory naming.
- :doc:`/tune/examples/async_hyperband_example`: Example of using a Trainable class with AsyncHyperBandScheduler.
- :doc:`/tune/examples/hyperband_example`: Example of using a Trainable class with HyperBandScheduler. Also uses the Experiment class API for specifying the experiment configuration, as well as the AsyncHyperBandScheduler.
- :doc:`/tune/examples/pbt_example`: Example of using a Trainable class with the PopulationBasedTraining scheduler.
- :doc:`/tune/examples/pbt_function`: Example of using the function API with a PopulationBasedTraining scheduler.
- :doc:`/tune/examples/pbt_ppo_example`: Example of optimizing a distributed RLlib algorithm (PPO) with the PopulationBasedTraining scheduler.
- :doc:`/tune/examples/logging_example`: Example of custom loggers and custom trial directory naming.

Search Algorithm Examples
~~~~~~~~~~~~~~~~~~~~~~~~~

- `Ax example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/ax_example.py>`__: Optimize a Hartmann function using `Ax <https://ax.dev>`_ with 4 parallel workers.
- `HyperOpt Example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperopt_example.py>`__: Optimizes a basic function using the function-based API and the HyperOptSearch (SearchAlgorithm wrapper for HyperOpt TPE).
- `Nevergrad example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/nevergrad_example.py>`__: Optimize a simple toy function using the gradient-free optimization package `Nevergrad <https://github.com/facebookresearch/nevergrad>`_ with 4 parallel workers.
- `Bayesian Optimization example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bayesopt_example.py>`__: Optimize a simple toy function using `Bayesian Optimization <https://github.com/fmfn/BayesianOptimization>`_ with 4 parallel workers.
- :doc:`/tune/examples/ax_example`: Optimize a Hartmann function using `Ax <https://ax.dev>`_ with 4 parallel workers.
- :doc:`/tune/examples/hyperopt_example`: Optimizes a basic function using the function-based API and the HyperOptSearch (SearchAlgorithm wrapper for HyperOpt TPE).
- :doc:`/tune/examples/nevergrad_example`: Optimize a simple toy function using the gradient-free optimization package `Nevergrad <https://github.com/facebookresearch/nevergrad>`_ with 4 parallel workers.
- :doc:`/tune/examples/bayesopt_example`: Optimize a simple toy function using `Bayesian Optimization <https://github.com/fmfn/BayesianOptimization>`_ with 4 parallel workers.

Tensorflow/Keras Examples
~~~~~~~~~~~~~~~~~~~~~~~~~

- `tune_mnist_keras <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tune_mnist_keras.py>`__: Converts the Keras MNIST example to use Tune with the function-based API and a Keras callback. Also shows how to easily convert something relying on argparse to use Tune.
- `pbt_memnn_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_memnn_example.py>`__: Example of training a Memory NN on bAbI with Keras using PBT.
- `Tensorflow 2 Example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tf_mnist_example.py>`__: Converts the Advanced TF2.0 MNIST example to use Tune with the Trainable. This uses `tf.function`. Original code from TensorFlow: https://www.tensorflow.org/tutorials/quickstart/advanced
- :doc:`/tune/examples/tune_mnist_keras`: Converts the Keras MNIST example to use Tune with the function-based API and a Keras callback. Also shows how to easily convert something relying on argparse to use Tune.
- :doc:`/tune/examples/pbt_memnn_example`: Example of training a Memory NN on bAbI with Keras using PBT.
- :doc:`/tune/examples/tf_mnist_example`: Converts the Advanced TF2.0 MNIST example to use Tune with the Trainable. This uses `tf.function`. Original code from TensorFlow: https://www.tensorflow.org/tutorials/quickstart/advanced


PyTorch Examples
~~~~~~~~~~~~~~~~

- `mnist_pytorch <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/mnist_pytorch.py>`__: Converts the PyTorch MNIST example to use Tune with the function-based API. Also shows how to easily convert something relying on argparse to use Tune.
- `mnist_pytorch_trainable <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/mnist_pytorch_trainable.py>`__: Converts the PyTorch MNIST example to use Tune with the Trainable API. Also uses the HyperBandScheduler and checkpoints the model at the end.
- :doc:`/tune/examples/mnist_pytorch`: Converts the PyTorch MNIST example to use Tune with the function-based API. Also shows how to easily convert something relying on argparse to use Tune.
- :doc:`/tune/examples/mnist_pytorch_trainable`: Converts the PyTorch MNIST example to use Tune with the Trainable API. Also uses the HyperBandScheduler and checkpoints the model at the end.


XGBoost Example
~~~~~~~~~~~~~~~

- :ref:`XGBoost tutorial <tune-xgboost>`: A guide to tuning XGBoost parameters with Tune.
- `xgboost_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/xgboost_example.py>`__: Trains a basic XGBoost model with Tune using the function-based API and an XGBoost callback.
- :doc:`/tune/examples/xgboost_example`: Trains a basic XGBoost model with Tune using the function-based API and an XGBoost callback.


LightGBM Example
~~~~~~~~~~~~~~~~

- `lightgbm_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/lightgbm_example.py>`__: Trains a basic LightGBM model with Tune using the function-based API and a LightGBM callback.
- :doc:`/tune/examples/lightgbm_example`: Trains a basic LightGBM model with Tune using the function-based API and a LightGBM callback.


Contributed Examples
~~~~~~~~~~~~~~~~~~~~

- `pbt_tune_cifar10_with_keras <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_tune_cifar10_with_keras.py>`__: A contributed example of tuning a Keras model on CIFAR10 with the PopulationBasedTraining scheduler.
- `genetic_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/genetic_example.py>`__: Optimizing the Michalewicz function using the contributed GeneticSearch algorithm with AsyncHyperBandScheduler.
- `tune_cifar10_gluon <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tune_cifar10_gluon.py>`__: MXNet Gluon example using Tune with the function-based API on the CIFAR-10 dataset.
- :doc:`/tune/examples/pbt_tune_cifar10_with_keras`: A contributed example of tuning a Keras model on CIFAR10 with the PopulationBasedTraining scheduler.
- :doc:`/tune/examples/genetic_example`: Optimizing the Michalewicz function using the contributed GeneticSearch algorithm with AsyncHyperBandScheduler.
- :doc:`/tune/examples/tune_cifar10_gluon`: MXNet Gluon example using Tune with the function-based API on the CIFAR-10 dataset.

Open Source Projects using Tune
-------------------------------
2 changes: 1 addition & 1 deletion doc/source/tune/_tutorials/tune-usage.rst
@@ -50,7 +50,7 @@ To leverage GPUs, you must set ``gpu`` in ``resources_per_trial``. This will aut
# If you have 4 CPUs on your machine and 1 GPU, this will run 1 trial at a time.
tune.run(trainable, num_samples=10, resources_per_trial={"cpu": 2, "gpu": 1})
You can find an example of this in the `Keras MNIST example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tune_mnist_keras.py>`__.
You can find an example of this in the :doc:`Keras MNIST example </tune/examples/tune_mnist_keras>`.

.. warning:: If ``gpu`` is not set, the ``CUDA_VISIBLE_DEVICES`` environment variable will be set to empty, disallowing GPU access.

4 changes: 2 additions & 2 deletions doc/source/tune/api_docs/logging.rst
@@ -58,7 +58,7 @@ You can then pass in your own logger as follows:
These loggers will be called along with the default Tune loggers. You can also check out `logger.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/logger.py>`__ for implementation details.

An example of creating a custom logger can be found in `logging_example.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/logging_example.py>`__.
An example of creating a custom logger can be found in :doc:`/tune/examples/logging_example`.
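
For context, a minimal custom logger might look like the sketch below. This assumes the ``Logger`` interface from ``logger.py`` (``_init``, ``on_result``, ``close``); ``JSONLinesLogger`` and ``my_trainable`` are hypothetical placeholders, not part of the linked example.

.. code-block:: python

    import json
    import os

    from ray import tune
    from ray.tune.logger import DEFAULT_LOGGERS, Logger


    class JSONLinesLogger(Logger):
        """Hypothetical logger that appends each result as one JSON line."""

        def _init(self):
            # self.logdir is the per-trial directory set up by the base class.
            self._file = open(os.path.join(self.logdir, "results.jsonl"), "a")

        def on_result(self, result):
            # Called once for every result reported by the trial.
            self._file.write(json.dumps(result, default=str) + "\n")
            self._file.flush()

        def close(self):
            self._file.close()


    # my_trainable is a placeholder Trainable or training function.
    tune.run(my_trainable, loggers=DEFAULT_LOGGERS + (JSONLinesLogger,))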

.. _trainable-logging:

@@ -164,7 +164,7 @@ CSVLogger
MLFLowLogger
------------

Tune also provides a default logger for `MLFlow <https://mlflow.org>`_. You can install MLFlow via ``pip install mlflow``. An example can be found `mlflow_example.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/mlflow_example.py>`__. Note that this currently does not include artifact logging support. For this, you can use the native MLFlow APIs inside your Trainable definition.
Tune also provides a default logger for `MLFlow <https://mlflow.org>`_. You can install MLFlow via ``pip install mlflow``. An example can be found in :doc:`/tune/examples/mlflow_example`. Note that this currently does not include artifact logging support. For this, you can use the native MLFlow APIs inside your Trainable definition.
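
A usage sketch, assuming the experiment id is passed through ``config["logger_config"]`` as in the linked example (``my_trainable`` is a placeholder):

.. code-block:: python

    import mlflow

    from ray import tune
    from ray.tune.logger import DEFAULT_LOGGERS, MLFLowLogger

    # Create (or look up) an MLFlow experiment to log into.
    experiment_id = mlflow.create_experiment("tune_experiment")

    tune.run(
        my_trainable,  # placeholder Trainable or training function
        loggers=DEFAULT_LOGGERS + (MLFLowLogger,),
        config={"logger_config": {"mlflow_experiment_id": experiment_id}})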

.. autoclass:: ray.tune.logger.MLFLowLogger

14 changes: 7 additions & 7 deletions doc/source/tune/api_docs/schedulers.rst
@@ -32,23 +32,23 @@ When using schedulers, you may face compatibility issues, as shown in the below
* - :ref:`ASHA <tune-scheduler-hyperband>`
- No
- Yes
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/async_hyperband_example.py>`__
- :doc:`Link </tune/examples/async_hyperband_example>`
* - :ref:`Median Stopping Rule <tune-scheduler-msr>`
- No
- Yes
- :ref:`Link <tune-scheduler-msr>`
* - :ref:`HyperBand <tune-original-hyperband>`
- Yes
- Yes
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperband_example.py>`__
- :doc:`Link </tune/examples/hyperband_example>`
* - :ref:`BOHB <tune-scheduler-bohb>`
- Yes
- Only TuneBOHB
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bohb_example.py>`__
- :doc:`Link </tune/examples/bohb_example>`
* - :ref:`Population Based Training <tune-scheduler-pbt>`
- Yes
- Not Compatible
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_example.py>`__
- :doc:`Link </tune/examples/pbt_example>`

.. _tune-scheduler-hyperband:

@@ -69,7 +69,7 @@ The `ASHA <https://openreview.net/forum?id=S1Y7OOlRZ>`__ scheduler can be used b
brackets=1)
tune.run( ... , scheduler=asha_scheduler)
Compared to the original version of HyperBand, this implementation provides better parallelism and avoids straggler issues during eliminations. **We recommend using this over the standard HyperBand scheduler.** An example of this can be `found here <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/async_hyperband_example.py>`_.
Compared to the original version of HyperBand, this implementation provides better parallelism and avoids straggler issues during eliminations. **We recommend using this over the standard HyperBand scheduler.** An example of this can be found here: :doc:`/tune/examples/async_hyperband_example`.

Even though the original paper mentions a bracket count of 3, discussions with the authors concluded that the value should be left at 1 bracket. This is the default used if no value is provided for the ``brackets`` argument.
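
Since the hunk above shows only the tail of the snippet, here is a complete configuration sketch (``my_trainable`` is a placeholder):

.. code-block:: python

    from ray import tune
    from ray.tune.schedulers import AsyncHyperBandScheduler

    asha_scheduler = AsyncHyperBandScheduler(
        time_attr="training_iteration",  # unit for max_t and grace_period
        metric="mean_accuracy",
        mode="max",
        max_t=100,            # stop trials after at most 100 iterations
        grace_period=10,      # run every trial for at least 10 iterations
        reduction_factor=3,
        brackets=1)           # 1 bracket is the recommended default
    tune.run(my_trainable, num_samples=20, scheduler=asha_scheduler)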

@@ -141,7 +141,7 @@ Tune includes a distributed implementation of `Population Based Training (PBT) <
When the PBT scheduler is enabled, each trial variant is treated as a member of the population. Periodically, top-performing trials are checkpointed (this requires your Trainable to support :ref:`save and restore <tune-checkpoint>`). Low-performing trials clone the checkpoints of top performers and perturb the configurations in the hope of discovering an even better variation.

You can run this `toy PBT example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_example.py>`__ to get an idea of how PBT operates. When training in PBT mode, a single trial may see many different hyperparameters over its lifetime, which are recorded in its ``result.json`` file. The following figure generated by the example shows PBT optimizing an LR schedule over the course of a single experiment:
You can run this :doc:`toy PBT example </tune/examples/pbt_function>` to get an idea of how PBT operates. When training in PBT mode, a single trial may see many different hyperparameters over its lifetime, which are recorded in its ``result.json`` file. The following figure generated by the example shows PBT optimizing an LR schedule over the course of a single experiment:

.. image:: /pbt.png
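
A configuration sketch of the scheduler described above (``my_trainable`` is a placeholder Trainable that supports save and restore):

.. code-block:: python

    import random

    from ray import tune
    from ray.tune.schedulers import PopulationBasedTraining

    pbt_scheduler = PopulationBasedTraining(
        time_attr="training_iteration",
        metric="mean_accuracy",
        mode="max",
        perturbation_interval=5,  # checkpoint/perturb every 5 iterations
        hyperparam_mutations={
            # resample from a callable, or pick from a list, on perturbation
            "lr": lambda: random.uniform(1e-4, 1e-1),
            "momentum": [0.8, 0.9, 0.99],
        })
    tune.run(my_trainable, num_samples=8, scheduler=pbt_scheduler)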

@@ -157,7 +157,7 @@ This class is a variant of HyperBand that enables the `BOHB Algorithm <https://a

This is to be used in conjunction with the Tune BOHB search algorithm. See :ref:`TuneBOHB <suggest-TuneBOHB>` for package requirements, examples, and details.

An example of this in use can be found in `bohb_example.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bohb_example.py>`_.
An example of this in use can be found here: :doc:`/tune/examples/bohb_example`.
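
A sketch of wiring the scheduler and search algorithm together, assuming a ConfigSpace search space as in the linked example (``my_trainable`` is a placeholder):

.. code-block:: python

    import ConfigSpace as CS

    from ray import tune
    from ray.tune.schedulers import HyperBandForBOHB
    from ray.tune.suggest.bohb import TuneBOHB

    # BOHB consumes a ConfigSpace definition rather than a Tune search space.
    config_space = CS.ConfigurationSpace()
    config_space.add_hyperparameter(
        CS.UniformFloatHyperparameter("lr", lower=1e-4, upper=1e-1))

    bohb_search = TuneBOHB(
        config_space, max_concurrent=4, metric="loss", mode="min")
    bohb_scheduler = HyperBandForBOHB(
        time_attr="training_iteration", metric="loss", mode="min", max_t=100)

    tune.run(
        my_trainable,
        search_alg=bohb_search,
        scheduler=bohb_scheduler,
        num_samples=10)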

.. autoclass:: ray.tune.schedulers.HyperBandForBOHB

18 changes: 9 additions & 9 deletions doc/source/tune/api_docs/suggestion.rst
@@ -25,39 +25,39 @@ Summary
* - :ref:`AxSearch <tune-ax>`
- Bayesian/Bandit Optimization
- [`Ax <https://ax.dev/>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/ax_example.py>`__
- :doc:`/tune/examples/ax_example`
* - :ref:`DragonflySearch <Dragonfly>`
- Scalable Bayesian Optimization
- [`Dragonfly <https://dragonfly-opt.readthedocs.io/>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/dragonfly_example.py>`__
- :doc:`/tune/examples/dragonfly_example`
* - :ref:`SkoptSearch <skopt>`
- Bayesian Optimization
- [`Scikit-Optimize <https://scikit-optimize.github.io>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/skopt_example.py>`__
- :doc:`/tune/examples/skopt_example`
* - :ref:`HyperOptSearch <tune-hyperopt>`
- Tree-Parzen Estimators
- [`HyperOpt <https://hyperopt.github.io/hyperopt>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperopt_example.py>`__
- :doc:`/tune/examples/hyperopt_example`
* - :ref:`BayesOptSearch <bayesopt>`
- Bayesian Optimization
- [`BayesianOptimization <https://github.com/fmfn/BayesianOptimization>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bayesopt_example.py>`__
- :doc:`/tune/examples/bayesopt_example`
* - :ref:`TuneBOHB <suggest-TuneBOHB>`
- Bayesian Opt/HyperBand
- [`BOHB <https://github.com/automl/HpBandSter>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bohb_example.py>`__
- :doc:`/tune/examples/bohb_example`
* - :ref:`NevergradSearch <nevergrad>`
- Gradient-free Optimization
- [`Nevergrad <https://github.com/facebookresearch/nevergrad>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/nevergrad_example.py>`__
- :doc:`/tune/examples/nevergrad_example`
* - :ref:`ZOOptSearch <zoopt>`
- Zeroth-order Optimization
- [`ZOOpt <https://github.com/polixir/ZOOpt>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/zoopt_example.py>`__
- :doc:`/tune/examples/zoopt_example`
* - :ref:`SigOptSearch <sigopt>`
- Closed source
- [`SigOpt <https://sigopt.com/>`__]
- `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/sigopt_example.py>`__
- :doc:`/tune/examples/sigopt_example`


.. note:: Search algorithms will require a different search space declaration than the default Tune format, meaning that you will not be able to combine ``tune.grid_search`` with the below integrations.
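
As an illustration of the note above, HyperOpt expects its own space definition (a sketch; ``my_trainable`` is a placeholder):

.. code-block:: python

    from hyperopt import hp

    from ray import tune
    from ray.tune.suggest.hyperopt import HyperOptSearch

    # HyperOpt's native space replaces tune.grid_search declarations here.
    space = {
        "lr": hp.loguniform("lr", -10, -1),
        "momentum": hp.uniform("momentum", 0.1, 0.9),
    }

    hyperopt_search = HyperOptSearch(
        space, max_concurrent=4, metric="mean_loss", mode="min")
    tune.run(my_trainable, search_alg=hyperopt_search, num_samples=10)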
6 changes: 6 additions & 0 deletions doc/source/tune/examples/async_hyperband_example.rst
@@ -0,0 +1,6 @@
:orphan:

async_hyperband_example
~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/async_hyperband_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/ax_example.rst
@@ -0,0 +1,6 @@
:orphan:

ax_example
~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/ax_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/bayesopt_example.rst
@@ -0,0 +1,6 @@
:orphan:

bayesopt_example
~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/bayesopt_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/bohb_example.rst
@@ -0,0 +1,6 @@
:orphan:

bohb_example
~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/bohb_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/dragonfly_example.rst
@@ -0,0 +1,6 @@
:orphan:

dragonfly_example
~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/dragonfly_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/genetic_example.rst
@@ -0,0 +1,6 @@
:orphan:

genetic_example
~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/genetic_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/hyperband_example.rst
@@ -0,0 +1,6 @@
:orphan:

hyperband_example
~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/hyperband_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/hyperopt_example.rst
@@ -0,0 +1,6 @@
:orphan:

hyperopt_example
~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/hyperopt_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/lightgbm_example.rst
@@ -0,0 +1,6 @@
:orphan:

lightgbm_example
~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/lightgbm_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/logging_example.rst
@@ -0,0 +1,6 @@
:orphan:

logging_example
~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/logging_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/mlflow_example.rst
@@ -0,0 +1,6 @@
:orphan:

mlflow_example
~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/mlflow_example.py
7 changes: 7 additions & 0 deletions doc/source/tune/examples/mnist_pytorch.rst
@@ -0,0 +1,7 @@
:orphan:

mnist_pytorch
~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/mnist_pytorch.py

6 changes: 6 additions & 0 deletions doc/source/tune/examples/mnist_pytorch_trainable.rst
@@ -0,0 +1,6 @@
:orphan:

mnist_pytorch_trainable
~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/mnist_pytorch_trainable.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/nevergrad_example.rst
@@ -0,0 +1,6 @@
:orphan:

nevergrad_example
~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/nevergrad_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/pbt_example.rst
@@ -0,0 +1,6 @@
:orphan:

pbt_example
~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/pbt_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/pbt_function.rst
@@ -0,0 +1,6 @@
:orphan:

pbt_function
~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/pbt_function.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/pbt_memnn_example.rst
@@ -0,0 +1,6 @@
:orphan:

pbt_memnn_example
~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/pbt_memnn_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/pbt_ppo_example.rst
@@ -0,0 +1,6 @@
:orphan:

pbt_ppo_example
~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/pbt_ppo_example.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/pbt_tune_cifar10_with_keras.rst
@@ -0,0 +1,6 @@
:orphan:

pbt_tune_cifar10_with_keras
~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/pbt_tune_cifar10_with_keras.py
6 changes: 6 additions & 0 deletions doc/source/tune/examples/sigopt_example.rst
@@ -0,0 +1,6 @@
:orphan:

sigopt_example
~~~~~~~~~~~~~~

.. literalinclude:: /../../python/ray/tune/examples/sigopt_example.py