
failed to load source for dependency libc #12190

Closed
YerongLi opened this issue May 26, 2023 · 2 comments
Labels
C-bug Category: bug S-triage Status: This issue is waiting on initial triage.

Comments


YerongLi commented May 26, 2023

Problem

I am trying to install an old version of transformers with "pip install transformers==3.3.1". The additional tokenizers package needs the Rust compiler, so I followed the steps on this page: https://huggingface.co/docs/tokenizers/installation. Re-running the installation command, I found that Rust could not find "libc":

ransformers==3.3.1) (8.1.3)
Requirement already satisfied: joblib in /scratch/yerong/.conda/envs/reason/lib/python3.11/site-packages (from sacremoses->transformers==3.3.1) (1.2.0)
Building wheels for collected packages: tokenizers
 Building wheel for tokenizers (pyproject.toml) ... error
 error: subprocess-exited-with-error
 × Building wheel for tokenizers (pyproject.toml) did not run successfully.
 │ exit code: 1
 ╰─> [55 lines of output]
     /tmp/pip-build-env-bfzcki16/overlay/lib/python3.11/site-packages/setuptools/dist.py:519: InformationOnly: Normalizing '0.8.1.rc2' to '0.8.1rc2'
       self.metadata.version = self._normalize_version(
     running bdist_wheel
     running build
     running build_py
     creating build
     creating build/lib.linux-x86_64-cpython-311
     creating build/lib.linux-x86_64-cpython-311/tokenizers
     copying tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers
     creating build/lib.linux-x86_64-cpython-311/tokenizers/models
     copying tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/models
     creating build/lib.linux-x86_64-cpython-311/tokenizers/decoders
     copying tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
     creating build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
     copying tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
     creating build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
     copying tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
     creating build/lib.linux-x86_64-cpython-311/tokenizers/processors
     copying tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
     creating build/lib.linux-x86_64-cpython-311/tokenizers/trainers
     copying tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
     creating build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
     copying tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers
     copying tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/models
     copying tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
     copying tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
     copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
     copying tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
     copying tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
     running build_ext
     running build_rust
     cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib
     warning: unused manifest key: target.x86_64-apple-darwin.rustflags
         Updating crates.io index
     warning: spurious network error (2 tries remaining): bad packet length; class=Net (12)
     error: failed to get `libc` as a dependency of package `tokenizers-python v0.8.1-rc2 (/tmp/pip-install-8814ldyi/tokenizers_cbab6a736f314261bfaeb5b456c7afe8)`
     Caused by:
       failed to load source for dependency `libc`
     Caused by:
       Unable to update registry `https://github.com/rust-lang/crates.io-index`
     Caused by:
       failed to fetch `https://github.com/rust-lang/crates.io-index`
     Caused by:
       error reading from the zlib stream; class=Zlib (5)
     error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib` failed with code 101
     [end of output]
 note: This error originates from a subprocess, and is likely not a problem with pip.
 ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

Steps

  1. curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  2. source "$HOME/.cargo/env"
  3. pip install transformers==3.3.1

Possible Solution(s)

No response

Notes

No response

Version

No response

@YerongLi YerongLi added C-bug Category: bug S-triage Status: This issue is waiting on initial triage. labels May 26, 2023
epage (Contributor) commented May 26, 2023

#10303 contains some debugging steps and potential workarounds.

Debugging steps

  • Re-run with the CARGO_HTTP_DEBUG=true and CARGO_LOG=cargo::ops::registry=trace environment variables
  • Does it run with CARGO_NET_GIT_FETCH_WITH_CLI?
  • Does it run with CARGO_HTTP_MULTIPLEXING=false? Some proxies have problems with HTTP/2

If either of the last two works for you, it can also serve as a workaround.
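As a sketch, the checks above could be run like this; the package name and version are taken from the report, and the exact invocations are an assumption (any of the three can be applied independently):

```shell
# Debugging: verbose cargo network logs while pip rebuilds tokenizers
CARGO_HTTP_DEBUG=true CARGO_LOG=cargo::ops::registry=trace \
  pip install transformers==3.3.1

# Workaround 1: have cargo shell out to the system `git` for index fetches
CARGO_NET_GIT_FETCH_WITH_CLI=true pip install transformers==3.3.1

# Workaround 2: disable HTTP/2 multiplexing (some proxies mishandle it)
CARGO_HTTP_MULTIPLEXING=false pip install transformers==3.3.1
```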

Another potential workaround, if you have a new enough cargo, is to enable sparse registry support, which changes how we do network communication and might bypass the root cause of this problem.
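For reference, sparse registry support can be opted into with an environment variable (a sketch, assuming cargo 1.68 or newer, where the sparse protocol is stable):

```shell
# Fetch crate metadata over the sparse HTTP index instead of the git index
export CARGO_REGISTRIES_CRATES_IO_PROTOCOL=sparse
pip install transformers==3.3.1
```

The same setting can also be made persistent via `registries.crates-io.protocol = "sparse"` in `.cargo/config.toml`.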

YerongLi (Author) commented
Thanks.
CARGO_HTTP_DEBUG=true; CARGO_LOG=cargo::ops::registry=trace; pip install transformers==3.3.1
works for Python 3.7 (conda) but not Python 3.11 (same conda).
Not sure what the difference between these two is, though.
