Hugging Face Integration Failure #7492
Comments
Hey, @farzbood! Thanks for reporting the issue. We keep track of this in #7417 and #7418 and fixed the problems in #7425. In the meantime, you can try using the previous version. LMK if this helps...
Thanks for the quick and kind response.
Very much looking forward to haystack 2.0.1, thank you for your great work :-)
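Until a release containing the fix is out, one possible workaround (an assumption on my part: the missing private module `huggingface_hub.inference._text_generation` appears to have been removed in a newer `huggingface_hub` release, so the version bound below is a guess, not confirmed by the maintainers) is to pin an older `huggingface_hub` in the same environment:

```shell
# Pin huggingface_hub to a release that still ships the private
# inference._text_generation module (the <0.22 bound is an assumption).
pip install "huggingface_hub<0.22"
```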
Describe the bug
Hi,
I'm new to LLM applications and trying to get familiar with the tools in this domain; I'm attempting to use Haystack's Hugging Face integration.
Running the example code published at https://docs.haystack.deepset.ai/docs/huggingfacetgigenerator#in-a-pipeline in a Poetry (.venv) environment with haystack-ai 2.0.0, transformers 4.39.3, and torch 2.2.2 installed fails with an import error.
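Since the error message below suggests `pip install transformers` but the missing module actually lives in `huggingface_hub`, it can help to print the versions installed in the environment before digging further. A small diagnostic sketch (the helper name is mine, not from Haystack):

```python
import importlib.metadata


def installed_version(distribution: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return importlib.metadata.version(distribution)
    except importlib.metadata.PackageNotFoundError:
        return "not installed"


# The packages involved in the failing import chain.
for name in ("haystack-ai", "transformers", "torch", "huggingface_hub"):
    print(f"{name}: {installed_version(name)}")
```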
Error message
Traceback (most recent call last):
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines.venv\lib\site-packages\haystack\components\generators\hugging_face_tgi.py", line 13, in <module>
from huggingface_hub.inference._text_generation import TextGenerationResponse, TextGenerationStreamResponse, Token
ModuleNotFoundError: No module named 'huggingface_hub.inference._text_generation'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "c:/pyprojects/AIprjs/LLM/Haystack/pipelines/rag.py", line 31, in <module>
pipe.add_component("llm", HuggingFaceTGIGenerator(model="mistralai/Mistral-7B-v0.1", token=Secret.from_token("hf_***REDACTED***")))
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines.venv\lib\site-packages\haystack\core\component\component.py", line 132, in __call__
instance = super().__call__(*args, **kwargs)
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines.venv\lib\site-packages\haystack\components\generators\hugging_face_tgi.py", line 99, in __init__
transformers_import.check()
File "C:\pyprojects\AIprjs\LLM\Haystack\pipelines.venv\lib\site-packages\lazy_imports\try_import.py", line 107, in check
raise ImportError(message) from exc_value
ImportError: Failed to import 'huggingface_hub.inference._text_generation'. Run 'pip install transformers'. Original error: No module named 'huggingface_hub.inference._text_generation'
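For context, the deferred failure in the traceback comes from a lazy-import guard: the original ImportError is caught when the module is first loaded and only re-raised when the component is constructed via check(). A minimal sketch of that behavior (illustrative only, not lazy_imports' actual implementation):

```python
import importlib


class TryImport:
    """Record an import failure now; re-raise it only when check() is called."""

    def __init__(self, module_name: str, hint: str) -> None:
        self.module_name = module_name
        self.hint = hint
        self.error = None
        try:
            importlib.import_module(module_name)
        except ImportError as exc:  # ModuleNotFoundError is a subclass
            self.error = exc

    def check(self) -> None:
        if self.error is not None:
            raise ImportError(
                f"Failed to import '{self.module_name}'. {self.hint} "
                f"Original error: {self.error}"
            ) from self.error


# Importing a module that does not exist records the error silently...
guard = TryImport("no_such_package.inference._text_generation",
                  "Run 'pip install transformers'.")

# ...and the ImportError only surfaces when the guard is checked,
# which is why the error appears at component construction time.
try:
    guard.check()
except ImportError as exc:
    print(exc)
```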
Expected behavior
To seamlessly run the pipeline and produce the result!
To Reproduce
Just run the script as is!