[CI] skip vllm_example ray-project#36665
Why are these changes needed?
Building vLLM requires CUDA, which is not available in our CI GPU Docker image.



No CUDA runtime is found, using CUDA_HOME='/usr/local/cuda'
Traceback (most recent call last):
  File "/opt/miniconda/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
    main()
  File "/opt/miniconda/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
    json_out['return_val'] = hook(**hook_input['kwargs'])
  File "/opt/miniconda/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
    return hook(config_settings)
  File "/tmp/pip-build-env-b4hl6g1_/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
    return self._get_build_requires(config_settings, requirements=['wheel'])
  File "/tmp/pip-build-env-b4hl6g1_/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
    self.run_setup()
  File "/tmp/pip-build-env-b4hl6g1_/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 338, in run_setup
    exec(code, locals())
  File "<string>", line 24, in <module>
RuntimeError: Cannot find CUDA at CUDA_HOME: /usr/local/cuda. CUDA must be available in order to build the package.
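For context, the error is raised because vLLM's setup.py probes for a CUDA toolkit at CUDA_HOME before the wheel build can even collect its requirements. A minimal shell sketch of that kind of probe (hypothetical; `check_cuda` is a made-up name, not vLLM's actual code):

```shell
# Hypothetical sketch of a build-time CUDA probe like the one that
# raises the RuntimeError above. Returns non-zero when nvcc is missing.
check_cuda() {
  cuda_home="${1:-${CUDA_HOME:-/usr/local/cuda}}"
  if [ ! -x "${cuda_home}/bin/nvcc" ]; then
    echo "Cannot find CUDA at CUDA_HOME: ${cuda_home}" >&2
    return 1
  fi
  echo "Found CUDA at ${cuda_home}"
}
```

On the CI image, /usr/local/cuda does not exist, so a probe like this fails and the build aborts.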


There is also a bug in the build rule: cc983fc#r119162333
scv119 committed Jun 21, 2023
1 parent 42f128a commit 6b599ba
Showing 3 changed files with 1 addition and 15 deletions.
2 changes: 1 addition & 1 deletion .buildkite/pipeline.gpu_large.yml
@@ -50,7 +50,7 @@
["NO_WHEELS_REQUIRED", "RAY_CI_PYTHON_AFFECTED", "RAY_CI_TUNE_AFFECTED", "RAY_CI_DOC_AFFECTED"]
commands:
- cleanup() { if [ "${BUILDKITE_PULL_REQUEST}" = "false" ]; then ./ci/build/upload_build_info.sh; fi }; trap cleanup EXIT
-  - DOC_TESTING=1 TRAIN_TESTING=1 TUNE_TESTING=1 INSTALL_VLLM=1 ./ci/env/install-dependencies.sh
+  - DOC_TESTING=1 TRAIN_TESTING=1 TUNE_TESTING=1 ./ci/env/install-dependencies.sh
- pip install -Ur ./python/requirements/ml/requirements_ml_docker.txt
- ./ci/env/env_info.sh
# Test examples with newer version of `transformers`
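The `cleanup() { ...; }; trap cleanup EXIT` idiom in the step above guarantees the build-info upload runs no matter how the step exits. A standalone sketch of the same idiom (illustrative commands, not the CI script):

```shell
# The EXIT trap fires on every exit path, even after a failing command,
# so "cleanup ran" is printed although the inner script exits 1.
out=$(sh -c 'cleanup() { echo "cleanup ran"; }; trap cleanup EXIT; echo "work"; exit 1') || true
echo "$out"
```

This is why the upload step still runs when a later test command in the Buildkite job fails.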
5 changes: 0 additions & 5 deletions ci/env/install-dependencies.sh
@@ -394,11 +394,6 @@ install_pip_packages() {
requirements_packages+=("holidays==0.24") # holidays 0.25 causes `import prophet` to fail.
fi

-# Additional dependency for vllm.
-if [ "${INSTALL_VLLM-}" = 1 ]; then
-  requirements_packages+=("vllm")
-fi
-
# Data processing test dependencies.
if [ "${DATA_PROCESSING_TESTING-}" = 1 ] || [ "${DOC_TESTING-}" = 1 ]; then
requirements_files+=("${WORKSPACE_DIR}/python/requirements/data_processing/requirements.txt")
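The removed block followed the script's usual pattern: optional packages are appended to `requirements_packages` only when an opt-in environment variable is set. A self-contained bash sketch of that pattern (`INSTALL_EXAMPLE_PKG` and the package names are illustrative, not from the script):

```shell
#!/usr/bin/env bash
# Illustrative env-var gate, mirroring the pattern in install-dependencies.sh.
requirements_packages=()
requirements_packages+=("numpy")              # always requested
if [ "${INSTALL_EXAMPLE_PKG-}" = 1 ]; then    # opt-in, off by default
  requirements_packages+=("example-pkg")
fi
echo "would install: ${requirements_packages[*]}"
```

With the variable unset the optional package is never requested, which is why dropping `INSTALL_VLLM=1` from the pipeline (together with this block) stops vllm from being installed at all.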
9 changes: 0 additions & 9 deletions doc/BUILD
@@ -280,15 +280,6 @@ py_test_run_all_subdirectory(
tags = ["exclusive", "team:ml", "ray_air", "gpu"],
)

-py_test(
-    name = "vllm_example",
-    size = "large",
-    include = ["source/serve/doc_code/vllm_example.py"],
-    exclude = [],
-    extra_srcs = [],
-    tags = ["exclusive", "team:serve", "gpu"],
-)
-
py_test(
name = "pytorch_resnet_finetune",
size = "large",

