
📝 Add quick setup, add review comment changes
peterhessey committed Jun 22, 2022
1 parent 3384cd8 commit fc743ab
Showing 3 changed files with 41 additions and 75 deletions.
60 changes: 29 additions & 31 deletions README.md
@@ -44,33 +44,31 @@ architecture.

Once training in AzureML is done, the models can be deployed from within AzureML.

## Getting started
## Quick Setup

### Set up InnerEye
This quick setup assumes you are using a machine running Ubuntu with Git, Git LFS, Conda and Python 3.7+ installed. Please refer to the [setup guide](docs/environment.md) for more detailed instructions on getting InnerEye set up with other operating systems and installing the above prerequisites.
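
If you would like to confirm that these prerequisites are present before continuing, a quick check from your terminal (assuming the tools are on your `PATH`) is:

```shell
# Each command should print a version rather than "command not found".
git --version
git lfs version
conda --version
python --version   # should report Python 3.7 or newer
```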

Please refer to our [setup guide](docs/environment.md) for instructions on getting InnerEye-DeepLearning set up on your device.
### Instructions

### Run HelloWorld Model
1. Clone the InnerEye-DeepLearning repo by running the following command:

Now try to run the `HelloWorld` segmentation model - that's a very simple model that will train for 2 epochs on any
machine, no GPU required. You need to set the `PYTHONPATH` environment variable to point to the repository root first.
Assuming that your current directory is the repository root folder, on Linux `bash` that is:
```shell
git clone --recursive https://github.com/microsoft/InnerEye-DeepLearning && cd InnerEye-DeepLearning
```

```shell
export PYTHONPATH=`pwd`
python InnerEye/ML/runner.py --model=HelloWorld
```
2. Create and activate your conda environment:

(Note the backticks around the `pwd` command; these are not standard single quotes!)
```shell
conda env create --file environment.yml && conda activate InnerEye
```

On Windows:
3. Verify that your installation was successful by running the HelloWorld model (no GPU required):

```shell
set PYTHONPATH=%cd%
python InnerEye/ML/runner.py --model=HelloWorld
```
```shell
python InnerEye/ML/runner.py --model=HelloWorld
```

If that works: Congratulations! You have successfully built your first model using the InnerEye toolbox.
If the above runs with no errors: Congratulations! You have successfully built your first model using the InnerEye toolbox.

If it fails, please check the
[troubleshooting page on the Wiki](https://github.com/microsoft/InnerEye-DeepLearning/wiki/Issues-with-code-setup-and-the-HelloWorld-model).
@@ -80,17 +78,17 @@ If it fails, please check the
Further detailed instructions, including setup in Azure, are here:

1. [Setting up your environment](docs/environment.md)
2. [Setting up Azure Machine Learning](docs/setting_up_aml.md)
3. [Training a simple segmentation model in Azure ML](docs/hello_world_model.md)
4. [Creating a dataset](docs/creating_dataset.md)
5. [Building models in Azure ML](docs/building_models.md)
6. [Sample Segmentation and Classification tasks](docs/sample_tasks.md)
7. [Debugging and monitoring models](docs/debugging_and_monitoring.md)
8. [Model diagnostics](docs/model_diagnostics.md)
9. [Move a model to a different workspace](docs/move_model.md)
10. [Working with FastMRI models](docs/fastmri.md)
11. [Active label cleaning and noise robust learning toolbox](https://github.com/microsoft/InnerEye-DeepLearning/blob/1606729c7a16e1bfeb269694314212b6e2737939/InnerEye-DataQuality/README.md)
12. [Using InnerEye as a git submodule](docs/innereye_as_submodule.md)
1. [Setting up Azure Machine Learning](docs/setting_up_aml.md)
1. [Training a simple segmentation model in Azure ML](docs/hello_world_model.md)
1. [Creating a dataset](docs/creating_dataset.md)
1. [Building models in Azure ML](docs/building_models.md)
1. [Sample Segmentation and Classification tasks](docs/sample_tasks.md)
1. [Debugging and monitoring models](docs/debugging_and_monitoring.md)
1. [Model diagnostics](docs/model_diagnostics.md)
1. [Move a model to a different workspace](docs/move_model.md)
1. [Working with FastMRI models](docs/fastmri.md)
1. [Active label cleaning and noise robust learning toolbox](https://github.com/microsoft/InnerEye-DeepLearning/blob/1606729c7a16e1bfeb269694314212b6e2737939/InnerEye-DataQuality/README.md)
1. [Using InnerEye as a git submodule](docs/innereye_as_submodule.md)

## Deployment

@@ -157,6 +155,6 @@ This project has adopted the [Microsoft Open Source Code of Conduct](https://ope
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

## This toolbox is maintained by the
## Maintenance

[Microsoft Medical Image Analysis team](https://www.microsoft.com/en-us/research/project/medical-image-analysis/).
This toolbox is maintained by the [Microsoft Medical Image Analysis team](https://www.microsoft.com/en-us/research/project/medical-image-analysis/).
34 changes: 0 additions & 34 deletions docs/contributing.md
@@ -49,37 +49,3 @@ machines that have a higher CO2 footprint than your dev box.
- DO provide details for issues you create:
  - Describe the expected and actual behavior.
  - Provide any relevant exception message.

## Using the hi-ml package

To work on the `hi-ml` package at the same time as `InnerEye-DeepLearning`, you can edit `hi-ml` in the git submodule, which is automatically cloned as part of the [setup guide](environment.md).

- In the repository root, run `git submodule add https://github.com/microsoft/hi-ml`
- In PyCharm's project browser, mark the folders `hi-ml/hi-ml/src` and `hi-ml/hi-ml-azure/src` as Sources Root
- Remove the entry for the `hi-ml` and `hi-ml-azure` packages from `environment.yml`
- There is already code in `InnerEye.Common.fixed_paths.add_submodules_to_path` that will pick up the submodules and
add them to `sys.path`.

Once you are done testing your changes:

- Remove the entry for `hi-ml` from `.gitmodules`
- Execute these steps from the repository root:

```shell
git submodule deinit -f hi-ml
rm -rf hi-ml
rm -rf .git/modules/hi-ml
```

Alternatively, you can consume a developer version of `hi-ml` from `test.pypi`:

- Remove the entry for the `hi-ml` package from `environment.yml`
- Add a section like this to `environment.yml` to point pip to `test.pypi` and to a specific version of the package:

```yaml
...
- pip:
- --extra-index-url https://test.pypi.org/simple/
- hi-ml==0.1.0.post236
...
```
22 changes: 12 additions & 10 deletions docs/environment.md
@@ -2,7 +2,7 @@

## Operating System

We recommend using our toolbox with [Ubuntu 20.04 LTS](https://releases.ubuntu.com/20.04/). Most core InnerEye functionality will be stable on other operating systems, but PyTorch's full feature set is only available on Linux. All jobs in AzureML, both training and inference, run from an Ubuntu 20.04 Docker image. This means that using Ubuntu 20.04 locally allows for maximum reproducibility between your local and AzureML environmnets.
We recommend using our toolbox with [Ubuntu 20.04 LTS](https://releases.ubuntu.com/20.04/). Most core InnerEye functionality will be stable on other operating systems, but PyTorch's full feature set is only available on Linux. All jobs in AzureML, both training and inference, run from an Ubuntu 20.04 Docker image. This means that using Ubuntu 20.04 locally allows for maximum reproducibility between your local and AzureML environments.

For Windows users, Ubuntu can be set up with [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/en-us/windows/wsl/install). Please refer to the [InnerEye WSL docs](docs/WSL.md) for more detailed instructions on getting WSL set up.

@@ -30,11 +30,13 @@ To view and edit the InnerEye code, we recommended using the [VSCode](https://co

[Conda](https://docs.conda.io/en/latest/) is an open source package management system. It is used in InnerEye to manage all python packages. Follow the instructions in this section to get it set up on your machine.

### Install build tools
### Prerequisite - Install build tools

In order to create the Conda environment you will need to have build tools installed on your machine. If you are running Windows or MacOS, they will automatically be installed with your Conda distribution.
In order to create the Conda environment you will need to have the appropriate build tools installed on your machine. To do this, run the commands relevant to your operating system from the subsections below.

For Linux distributions, use the commands given below.
#### MacOS / Windows Users

If you are running Windows or MacOS, the required build tools will be installed automatically with your Conda distribution, so you can safely skip this step.

#### Ubuntu / Debian
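
As a rough illustration (an assumption rather than the repository's exact instructions), installing a standard compiler toolchain on Ubuntu or Debian typically looks like:

```shell
# Illustrative sketch: install the common build toolchain used when compiling Python package dependencies.
sudo apt-get update
sudo apt-get install -y build-essential
```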

@@ -57,11 +59,11 @@ Check if you already have Conda installed by running `conda --version` in your s

## Create a Conda Environment

There are three important files in this repo for creating conda environments:
There are three important files in this repo for creating Conda environments:

- **`primary_deps.yml`** - This file contains the list of primary package dependencies, and can be used to create an environment on any OS.
- **`environment.yml`** - **DO NOT EDIT THIS FILE MANUALLY**. This file is a *lockfile* - it contains a locked list of primary and secondary dependencies that is used to create the environments for AzureML jobs and local Ubuntu environments. As such it contains Ubuntu-specific platform dependencies and cannot be used to create environments on other operating systems.
- **`environment_win.yml`** - This is another lockfile, containing a Windows-dependent list of primary and secondary dependencies. This file can be used to create a conda environment on windows machines.
- **`environment_win.yml`** - This is another lockfile, containing a Windows-dependent list of primary and secondary dependencies. This file can be used to create a Conda environment on Windows machines.
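
As a minimal illustration of how these files are used (the platform-specific commands are given in the sections below), creating and activating an environment from the Ubuntu lockfile looks like:

```shell
# Use environment.yml on Ubuntu; environment_win.yml is the equivalent lockfile for Windows.
conda env create --file environment.yml
conda activate InnerEye
```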

### Create environment from lockfile (Ubuntu / Windows)

@@ -84,7 +86,7 @@ To create an environment from one of the lockfiles, run the following command, s

### Create non-locked environment (MacOS / all other operating systems)

For all other operating systems, no locked environment is provided. Instead, a new conda environment can be created from the primary dependencies using the following commands:
For all other operating systems, no locked environment is provided. Instead, a new Conda environment can be created from the primary dependencies using the following commands:

```shell
conda env create --file primary_deps.yml
@@ -96,7 +98,7 @@ For all other operating systems, no locked environment is provided. Instead, a n

### Upgrade / Add Python packages in environment

If you wish to alter the packages in your local conda environment, this can be done by editing the `primary_deps.yml` file with your desired changes and then following the instructions relevant to your OS given in the subsections below.
If you wish to alter the packages in your local Conda environment, this can be done by editing the `primary_deps.yml` file with your desired changes and then following the instructions relevant to your OS given in the subsections below.

If you want to change versions of packages used in the AzureML environment, *this can only be done from an Ubuntu machine*, and is facilitated through the provided script `create_and_lock_environment.sh`, instructions for which are given in the Ubuntu subsection below.

@@ -109,7 +111,7 @@ If you want to change versions of packages used in the AzureML environment, *thi
bash -i create_and_lock_environment.sh
```

This script will create/update your local conda environment with your desired primary package versions, as well as a new `environment.yml` which can be ingested by AzureML to create a copy of your local environment.
This script will create/update your local Conda environment with your desired primary package versions, as well as a new `environment.yml` which can be ingested by AzureML to create a copy of your local environment.

#### All other operating systems

@@ -143,7 +145,7 @@ In order to enable PyTorch to use CUDA, you need to make sure that you have
1. Compatible graphics card with CUDA compute capability of at least 3.0 (at the moment of writing). You can check the compatibility list [on the NVIDIA Developer site](https://developer.nvidia.com/cuda-gpus)
1. Recent NVIDIA drivers installed

A quick way to check if PyTorch can use the underlying GPU for computation is to run the following line from your conda environment with all InnerEye packages installed:
A quick way to check if PyTorch can use the underlying GPU for computation is to run the following line from your Conda environment with all InnerEye packages installed:
`python -c 'import torch; print(torch.cuda.is_available())'`
It will output `True` if CUDA computation is available and `False` if it's not.
