This repository contains a FastAPI application for running DiffDock inference tasks. The application is Dockerized for easy deployment.

## Prerequisites

- Docker
- Docker Compose (optional, for multi-service setups)
## Running with Docker

1. Build the Docker image:

   ```bash
   docker build -t diffdock-fastapi -f Dockerfile_copy.diffdock-fastapi .
   ```
2. Run the Docker container:

   ```bash
   docker run -d -p 8000:8000 --gpus all diffdock-fastapi
   ```
## Running with Docker Compose

1. Create a `docker-compose.yml`:

   ```yaml
   version: '3.8'

   services:
     web:
       build: .
       ports:
         - "8000:8000"
       volumes:
         - .:/app
       environment:
         - PORT=8000
       depends_on:
         - model-inference

     model-inference:
       image: your_model_inference_image
       environment:
         - MODEL_PATH=/models/your_model.pt

     # Uncomment if you need a database
     # db:
     #   image: postgres:latest
     #   environment:
     #     POSTGRES_USER: your_user
     #     POSTGRES_PASSWORD: your_password
     #     POSTGRES_DB: your_db
     #   volumes:
     #     - postgres_data:/var/lib/postgresql/data

   # Uncomment if you need a volume for the database
   # volumes:
   #   postgres_data:
   ```
2. Run the containers:

   ```bash
   docker-compose up --build
   ```
## API Endpoints

### Run Inference

- **URL:** `/inference/`
- **Method:** `POST`
- **Content-Type:** `multipart/form-data`
- **Form Data:**
  - `pdb_file`: The PDB file of the protein
  - `sdf_file`: The SDF file of the ligand
  - `inference_steps`: The number of inference steps (default: 20)
  - `samples_per_complex`: The number of samples per complex (default: 10)
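A minimal client sketch for this endpoint is shown below. The form field names and defaults come from the list above; the server URL, file paths, and the use of the third-party `requests` library are assumptions.

```python
# Hypothetical client for POST /inference/ (multipart/form-data).
# Field names mirror the documented form data; BASE_URL assumes the
# default port mapping from `docker run -p 8000:8000`.
BASE_URL = "http://localhost:8000"

def build_inference_form(inference_steps: int = 20,
                         samples_per_complex: int = 10) -> dict:
    """Return the non-file form fields with the documented defaults."""
    return {
        "inference_steps": str(inference_steps),
        "samples_per_complex": str(samples_per_complex),
    }

def submit_inference(pdb_path: str, sdf_path: str, **kwargs) -> dict:
    """POST the protein/ligand pair and return the parsed JSON response."""
    import requests  # third-party: pip install requests
    with open(pdb_path, "rb") as pdb, open(sdf_path, "rb") as sdf:
        files = {"pdb_file": pdb, "sdf_file": sdf}
        resp = requests.post(f"{BASE_URL}/inference/",
                             files=files,
                             data=build_inference_form(**kwargs))
    resp.raise_for_status()
    return resp.json()
```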
### Run Inference from a Zip Archive

- **URL:** `/inference/zip/`
- **Method:** `POST`
- **Content-Type:** `multipart/form-data`
- **Form Data:**
  - `zip_file`: The zip file containing `.pdb` and `.sdf` files
  - `config`: JSON string of the inference configuration
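The helpers below sketch how the zip payload and `config` string could be prepared. The exact config keys are an assumption based on the form fields of the plain `/inference/` endpoint; adjust them to whatever your server actually parses.

```python
# Hypothetical helpers for POST /inference/zip/: pack .pdb/.sdf files into
# an in-memory zip and serialize the inference configuration as JSON.
import io
import json
import zipfile

def pack_complexes(named_files: dict) -> bytes:
    """Zip {filename: bytes} pairs (expected to be .pdb and .sdf entries)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in named_files.items():
            zf.writestr(name, data)
    return buf.getvalue()

def make_config(inference_steps: int = 20, samples_per_complex: int = 10) -> str:
    """JSON string for the `config` form field (keys are assumptions)."""
    return json.dumps({"inference_steps": inference_steps,
                       "samples_per_complex": samples_per_complex})
```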
### Check Task Status

- **URL:** `/inference/status/{task_id}`
- **Method:** `GET`
- **Response:**

  ```json
  {
    "task_id": "your_task_id",
    "status": "Progress status or message"
  }
  ```
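Since the status field carries a progress message, a client typically polls this endpoint until the task leaves an in-progress state. The sketch below takes the HTTP call as an injectable `fetch` function; the in-progress status strings (`"pending"`, `"running"`) are assumptions, so match them to the messages your server returns.

```python
# Hypothetical polling loop for GET /inference/status/{task_id}.
import time
from typing import Callable

def wait_for_task(task_id: str, fetch: Callable[[str], dict],
                  poll_interval: float = 2.0, max_polls: int = 30) -> dict:
    """Call fetch(url) until the status is no longer in-progress.

    `fetch` should GET the URL and return the parsed JSON body,
    e.g. lambda url: requests.get(BASE_URL + url).json().
    """
    url = f"/inference/status/{task_id}"
    for _ in range(max_polls):
        body = fetch(url)
        if body.get("status") not in ("pending", "running"):  # assumed states
            return body
        time.sleep(poll_interval)
    raise TimeoutError(f"task {task_id} still running after {max_polls} polls")
```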
### Download Results

- **URL:** `/inference/download/{task_id}`
- **Method:** `GET`
- **Response:** A zip file containing the inference results.
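A small helper can unpack the returned zip bytes once the download completes. The directory layout inside the results archive is not specified above, so this sketch just extracts whatever the zip contains.

```python
# Hypothetical helper for GET /inference/download/{task_id}: extract the
# downloaded results zip (as raw bytes) into a local directory.
import io
import zipfile
from pathlib import Path

def extract_results(zip_bytes: bytes, out_dir: str) -> list:
    """Unpack the results zip and return the extracted file names."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        zf.extractall(out)
        return zf.namelist()
```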
## Testing

1. Run the normal inference test:

   ```bash
   python test_script.py
   ```

2. Run the zip inference test:

   ```bash
   python test_script_zip.py
   ```
## License

This project is licensed under the MIT License.