
[Question] ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/root/LLaVA/llava/model/__init__.py) #1208

Open
20191864218 opened this issue Mar 1, 2024 · 8 comments

Comments

@20191864218

Question

If I introduce a new package in clip_encoder.py, I get this error. What should I do? Thanks!

@zzxslp

zzxslp commented Mar 7, 2024

See this thread: #1101
Basically, re-installing flash-attn can solve this error.

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir
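
A quick way to check whether this particular ImportError is really caused by a broken flash-attn build, rather than by llava itself, is to import each link of the chain on its own and see which one fails first. This is just a small diagnostic sketch (run from the LLaVA repo root; the llava_llama module path is the one that defines LlavaLlamaForCausalLM):

import importlib

for name in ("flash_attn",
             "llava.model.language_model.llava_llama",  # module that defines LlavaLlamaForCausalLM
             "llava.model"):
    try:
        importlib.import_module(name)
        print("OK  ", name)
    except Exception as e:  # a broken flash-attn build typically fails here, e.g. with an undefined-symbol error
        print("FAIL", name, f"{type(e).__name__}: {e}")

If flash_attn itself fails to import, re-installing it as above is the right fix.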

@20191864218
Author

See this thread: #1101. Basically, re-installing flash-attn can solve this error.

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir

Thanks!

@hantao-zhou

See this thread: #1101. Basically, re-installing flash-attn can solve this error.

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir

The problem still persists after running these commands.

@SuperStacie

Hi hi~ I met the same issue when adding new modules. Have you successfully solved this problem?

@20191864218
Author

Hi hi~ I met the same issue when adding new modules. Have you successfully solved this problem?

You don't need to touch the __init__.py file; wherever you would import something from __init__.py, just import the corresponding module file directly instead.
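
For example, something like this (using the module path that also appears later in this thread):

# instead of importing through the package __init__.py:
# from llava.model import LlavaLlamaForCausalLM
# import the defining module directly:
from llava.model.language_model.llava_llama import LlavaLlamaForCausalLM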

@hantao-zhou

I've been tied up with some other tedious work recently and only just saw the updates.
My issue was due to a path pointing to a system-managed environment, which caused conflicts; after a few rounds of printenv, I fixed it by correcting the environment references.
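
For anyone debugging the same kind of environment conflict, a small check like this (illustrative only, not specific to any particular setup) shows which interpreter and which llava checkout are actually being picked up:

import os, sys

print("interpreter:", sys.executable)                    # should live inside your conda env, not the system Python
print("PYTHONPATH :", os.environ.get("PYTHONPATH", ""))  # system-wide entries here can shadow the env

import llava
print("llava from :", llava.__file__)                     # should point at your local LLaVA clone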

@foreverhell

In llava/__init__.py, I modified the line
from .model import LlavaLlamaForCausalLM
to
from .model.language_model.llava_llama import LlavaLlamaForCausalLM
and that fixed it.
I'm not sure whether this helps anyone else, but I thought I should share it.
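
A quick sanity check after editing llava/__init__.py (assuming the change above, run from the repo root):

# should now succeed and report the module that actually defines the class
from llava import LlavaLlamaForCausalLM
print(LlavaLlamaForCausalLM.__module__)   # expected: llava.model.language_model.llava_llama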

@boyugou

boyugou commented Jun 21, 2024

See this thread: #1101. Basically, re-installing flash-attn can solve this error.

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir

This one addressed my issue (which might also have been caused by not creating a separate conda env).
