This repository contains the code for the AllenNLP demo.
We're actively refactoring parts of the codebase, so you can expect better documentation to land in the near future.
For more information see the AllenNLP project.
You'll need Docker installed on your local machine.
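If you want to confirm the prerequisites up front, a small check along these lines works. Note that `check_prereqs` is a hypothetical helper sketched here for convenience, not something this repo provides, and it assumes a bash shell:

```shell
# Hypothetical helper: report whether each named tool is on the PATH.
check_prereqs() {
  local status=0
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool: ok"
    else
      echo "$tool: missing" >&2
      status=1
    fi
  done
  return "$status"
}

# e.g. check_prereqs docker docker-compose
```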
At a high level, the AllenNLP demo is composed of two components:

- A JavaScript application for rendering the user interface. The code for this can be found in `ui/`.
- A series of Python applications, each providing a small HTTP API endpoint for doing interesting things with a single model. The code for this can be found in `api/`.
There are four ways to run things locally:
- If you're working on a single model endpoint, consult the README in the `api/` directory for more specific instructions.
- If you're only working on the user interface, you can start things up by running:

  ```
  docker-compose -f docker-compose.ui-only.yaml up --build
  ```

  Once that's complete, you'll be able to access your local version by opening https://localhost:8080 in a browser. Changes to the code should be applied automatically.
- If you're only working on the Permalinks service, you can start things up by running:

  ```
  docker-compose -f docker-compose.permalinks.yaml up --build
  ```

  Once that's complete, follow the instructions in the Permalinks README.
- If you'd like to run an end-to-end environment that includes the user interface and a model endpoint, you can do so by running:

  ```
  MODEL=bidaf_elmo docker-compose up --build
  ```

  The `MODEL` environment variable specifies which model in `api/` to run locally. The name should match the name of the directory in `api/allennlp_demo/`. If the model has a custom `Dockerfile`, set the `MODEL_DOCKERFILE` environment variable to the path of that file:

  ```
  MODEL=masked_lm MODEL_DOCKERFILE=allennlp_demo/masked_lm/Dockerfile docker-compose up --build
  ```
Once everything's started, open https://localhost:8080 in the browser of your choice. Code changes will be applied automatically, while changes to backend or frontend dependencies require rerunning `docker-compose`.
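The end-to-end invocation above can be wrapped in a small helper so the model name and optional Dockerfile are handled in one place. This is a hypothetical sketch, not part of the repo: `run_model` is an invented name, it assumes bash and the `api/allennlp_demo/` layout described above, and setting `DRY_RUN=1` prints the command instead of running it:

```shell
# Hypothetical helper (not part of this repo): wraps the end-to-end
# docker-compose invocation. Usage: run_model MODEL [DOCKERFILE]
run_model() {
  local model="${1:?usage: run_model MODEL [DOCKERFILE]}"
  local dockerfile="${2:-}"

  # Warn (but don't fail) if the endpoint directory isn't where we expect it.
  if [ ! -d "api/allennlp_demo/${model}" ]; then
    echo "warning: api/allennlp_demo/${model} not found" >&2
  fi

  local cmd=(env "MODEL=${model}")
  if [ -n "${dockerfile}" ]; then
    cmd+=("MODEL_DOCKERFILE=${dockerfile}")
  fi
  cmd+=(docker-compose up --build)

  if [ "${DRY_RUN:-0}" = "1" ]; then
    # Print the command instead of running it -- handy as a sanity check.
    echo "${cmd[@]}"
  else
    "${cmd[@]}"
  fi
}
```

For example, `DRY_RUN=1 run_model masked_lm allennlp_demo/masked_lm/Dockerfile` prints the full `docker-compose` command without starting anything.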