"AttributeError: module 'huggingface_hub.constants' has no attribute 'HF_HUB_CACHE'" #151

Open
ApollonNishikawa opened this issue Mar 3, 2024 · 5 comments

Comments

ApollonNishikawa commented Mar 3, 2024

Describe the bug
I encountered an AttributeError when trying to import se_extractor from the openvoice package in Google Colab. The error message indicates that the huggingface_hub.constants module does not have an attribute named HF_HUB_CACHE.

To Reproduce
Steps to reproduce the behavior:

  1. Open a new notebook in Google Colab.
  2. Run the following code:
import os
import torch
from openvoice import se_extractor
from openvoice.api import BaseSpeakerTTS, ToneColorConverter

Expected behavior
I expected the modules to be imported without any issues.

Error message

AttributeError                            Traceback (most recent call last)
<ipython-input-4-7528a4fc437c> in <cell line: 3>()
      1 import os
      2 import torch
----> 3 from openvoice import se_extractor
      4 from openvoice.api import BaseSpeakerTTS, ToneColorConverter
      5 

9 frames
/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in <module>
     94 #
     95 # TODO: clean this for v5?
---> 96 PYTORCH_PRETRAINED_BERT_CACHE = os.getenv("PYTORCH_PRETRAINED_BERT_CACHE", constants.HF_HUB_CACHE)
     97 PYTORCH_TRANSFORMERS_CACHE = os.getenv("PYTORCH_TRANSFORMERS_CACHE", PYTORCH_PRETRAINED_BERT_CACHE)
     98 TRANSFORMERS_CACHE = os.getenv("TRANSFORMERS_CACHE", PYTORCH_TRANSFORMERS_CACHE)

AttributeError: module 'huggingface_hub.constants' has no attribute 'HF_HUB_CACHE'
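
For reference, a quick diagnostic sketch (not part of the original report) to confirm which huggingface_hub the runtime actually imports and whether it exposes the constant that transformers/utils/hub.py reads above:

import huggingface_hub
from huggingface_hub import constants

# Version actually loaded by this runtime; a stale import can hide an upgrade
# until the runtime is restarted.
print(huggingface_hub.__version__)

# transformers 4.38.x reads constants.HF_HUB_CACHE; older huggingface_hub
# releases such as 0.17.3 do not define it, which triggers this AttributeError.
print(hasattr(constants, "HF_HUB_CACHE"))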

Environment (please complete the following information):

  • Google Colab T4 GPU runtime
  • Python version: 3.10.12
  • transformers version: 4.38.1
  • huggingface_hub version: 0.17.3

Additional context
I followed the official documentation for setting up the project in Google Colab and have not made any modifications to the environment that should affect this import.
Additionally, when I executed !pip install -e ., I encountered dependency errors that may be related to the issue. The error message was as follows:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
chex 0.1.85 requires numpy>=1.24.1, but you have numpy 1.22.0 which is incompatible.
plotnine 0.12.4 requires numpy>=1.23.0, but you have numpy 1.22.0 which is incompatible.
pywavelets 1.5.0 requires numpy<2.0,>=1.22.4, but you have numpy 1.22.0 which is incompatible.
tensorflow 2.15.0 requires numpy<2.0.0,>=1.23.5, but you have numpy 1.22.0 which is incompatible.
torchdata 0.7.0 requires torch==2.1.0, but you have torch 1.13.1 which is incompatible.
torchtext 0.16.0 requires torch==2.1.0, but you have torch 1.13.1 which is incompatible.
torchvision 0.16.0+cu121 requires torch==2.1.0, but you have torch 1.13.1 which is incompatible.
transformers 4.38.1 requires huggingface-hub<1.0,>=0.19.3, but you have huggingface-hub 0.17.3 which is incompatible.
Successfully installed MyShell-OpenVoice-0.0.0 aiofiles-23.2.1 av-10.0.0 cn2an-0.5.22 coloredlogs-15.0.1 ctranslate2-3.24.0 dtw-python-1.3.1 eng_to_ipa-0.0.2 fastapi-0.110.0 faster-whisper-0.9.0 ffmpy-0.3.2 gradio-3.48.0 gradio-client-0.6.1 h11-0.14.0 httpcore-1.0.4 httpx-0.27.0 huggingface-hub-0.17.3 humanfriendly-10.0 langid-1.1.6 librosa-0.9.1 numpy-1.22.0 nvidia-cublas-cu11-11.10.3.66 nvidia-cuda-nvrtc-cu11-11.7.99 nvidia-cuda-runtime-cu11-11.7.99 nvidia-cudnn-cu11-8.5.0.96 onnxruntime-1.17.1 openai-whisper-20231117 orjson-3.9.15 proces-0.1.7 pydub-0.25.1 pypinyin-0.50.0 python-multipart-0.0.9 resampy-0.4.2 semantic-version-2.10.0 starlette-0.36.3 tiktoken-0.6.0 tokenizers-0.14.1 torch-1.13.1 torchaudio-0.13.1 unidecode-1.3.7 uvicorn-0.27.1 wavmark-0.0.2 websockets-11.0.3 whisper-timestamped-1.14.2
WARNING: The following packages were previously imported in this runtime:
  [numpy]
You must restart the runtime in order to use newly installed versions.

This might indicate that there are conflicting dependencies in my environment that could be contributing to the problem.
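
As an aside (a sketch, not from the original report), pip can re-surface these conflicts at any point after installation:

# Lists installed packages whose declared requirements are unsatisfied, e.g.
# "transformers 4.38.1 has requirement huggingface-hub<1.0,>=0.19.3, but you
# have huggingface-hub 0.17.3."
!pip check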

ApollonNishikawa changed the title from "AttributeError: module 'huggingface_hub.constants' has no attribute 'HF_HUB_CACHE'" when importing openvoice in Google Colab to "AttributeError: module 'huggingface_hub.constants' has no attribute 'HF_HUB_CACHE'" on Mar 6, 2024.
@yanchujian

Upgrading the huggingface_hub version can solve the problem.
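
For example, a minimal sketch of that upgrade in Colab, assuming the huggingface-hub<1.0,>=0.19.3 range that transformers 4.38.1 declares in the log above:

# Bring huggingface_hub into the range transformers 4.38.1 expects.
!pip install --upgrade "huggingface_hub>=0.19.3,<1.0"
# Then restart the Colab runtime so the new version is actually imported.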

ApollonNishikawa commented Mar 16, 2024

This error seems to occur when the Python version of Google Colab is 3.10.12.

I fixed this issue and submitted a pull request.
It works with Google Colab's current Python version, 3.10.12.
#156

@MouadBaghdadi

I am facing the same error; any solution to suggest?
I upgraded huggingface_hub and transformers, but the problem is still there.

@MavisHoot

I am facing the same error; any solution to suggest?
I upgraded huggingface_hub and transformers, but the problem is still there.

@amitmathapati

Still getting this error. I tried pip install --upgrade huggingface_hub and pip install --upgrade huggingface_hub==0.14.1, but still no luck.

Scoured the internet, but it's still not working.
