
CPU Launcher fails available check with venv #3049

Open
ElfoLiNk opened this issue Mar 27, 2024 · 3 comments

Comments


ElfoLiNk commented Mar 27, 2024

🐛 Describe the bug

The CPU launcher's availability check fails when TorchServe runs inside a virtual environment, so worker core pinning is silently skipped.

Error logs

2024-03-28T08:00:31,792 [DEBUG] W-9000-embeddings_1.1 org.pytorch.serve.wlm.WorkerLifeCycle - launcherAvailable cmdline: [python, -m, torch.backends.xeon.run_cpu, --use_logical_core, --enable_jemalloc, --no_python, hostname]
2024-03-28T08:00:31,821 [WARN ] W-9000-embeddings_1.1 org.pytorch.serve.wlm.WorkerLifeCycle - torch.backends.xeon.run_cpu is not available. Proceeding without worker core pinning. For better performance, please make sure torch.backends.xeon.run_cpu is available.

The launcherAvailable cmdline is executed outside the virtual environment that TorchServe runs in, equivalent to the command below, which fails:

python -m torch.backends.xeon.run_cpu --use_logical_core --no_python hostname
/usr/bin/python: Error while finding module specification for 'torch.backends.xeon.run_cpu' (ModuleNotFoundError: No module named 'torch')
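The root cause is how the bare name `python` is resolved: a child process spawned with `python` as its argv[0] looks it up on PATH, which only points into the venv if the spawning process inherited an activated-venv environment. A minimal illustrative sketch of the distinction (not TorchServe code):

```python
import shutil
import sys

# sys.executable is the absolute path of the interpreter running
# this process -- inside a venv it points at the venv's interpreter.
print("running interpreter:", sys.executable)

# shutil.which("python") resolves the bare name "python" against
# PATH, mirroring what a subprocess spawned as ["python", ...] will
# actually execute. Outside an activated venv this is typically the
# system interpreter, e.g. /usr/bin/python.
print("PATH lookup:       ", shutil.which("python"))
```

When the two paths differ, the availability check and the worker can see different sets of installed packages, which matches the error above.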

Running the same command from the virtual environment where TorchServe is installed works:

[rocky@hostname ~]$ source /opt/venv-torchserve/bin/activate
(venv-torchserve) [rocky@hostname ~]$ python -m torch.backends.xeon.run_cpu --use_logical_core --no_python hostname
2024-03-28 08:04:22,935 - __main__ - INFO - Use TCMalloc memory allocator
2024-03-28 08:04:22,936 - __main__ - INFO - OMP_NUM_THREADS=4
2024-03-28 08:04:22,936 - __main__ - INFO - Using Intel OpenMP
2024-03-28 08:04:22,936 - __main__ - INFO - KMP_BLOCKTIME=1
2024-03-28 08:04:22,936 - __main__ - INFO - LD_PRELOAD=/opt/venv-torchserve/lib/libiomp5.so:/usr/lib64/libtcmalloc.so
2024-03-28 08:04:22,936 - __main__ - INFO - numactl -C 0-3 -m 0 hostname

Installation instructions

TorchServe 0.10.0 installed with pip in a virtual environment

Model Packaging

not applicable

config.properties

ipex_enable=true
cpu_launcher_enable=true
cpu_launcher_args=--use_logical_core

Versions

TorchServe 0.10.0
Python 3.9

Repro instructions

Start TorchServe from a virtual environment with cpu_launcher_enable=true in config.properties.

Possible Solution

I think https://github.com/pytorch/serve/blob/v0.10.0/frontend/server/src/main/java/org/pytorch/serve/wlm/WorkerLifeCycle.java#L69 should be updated to use the Python environment of the running TorchServe process instead of the bare `python` resolved from PATH.
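In Python terms, the proposed fix amounts to probing for the module with the process's own interpreter (`sys.executable`) rather than whatever `python` resolves to on PATH; the Java side would analogously pass the configured absolute interpreter path to its process builder. A hedged sketch (the helper `module_available` is hypothetical, not TorchServe code):

```python
import subprocess
import sys


def module_available(module: str) -> bool:
    """Return True if `module` is importable by the same interpreter
    that runs this process (sys.executable), instead of relying on a
    bare `python` found on PATH."""
    probe = (
        "import importlib.util, sys\n"
        "try:\n"
        f"    spec = importlib.util.find_spec({module!r})\n"
        "except ModuleNotFoundError:\n"
        "    spec = None\n"
        "sys.exit(0 if spec else 1)"
    )
    result = subprocess.run([sys.executable, "-c", probe])
    return result.returncode == 0
```

In the reported setup, probing "torch.backends.xeon.run_cpu" this way would succeed whenever `sys.executable` points inside /opt/venv-torchserve, regardless of what the system-wide `python` is.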

lxning (Collaborator) commented Mar 27, 2024

ElfoLiNk (Author) commented

Hi @lxning, I already have ipex_enable=true set. I updated the bug report with more info.

agunapal (Collaborator) commented

cc @min-jean-cho
