
Error when using llama:70b (currently worked around by modifying the vanna/ollama/ollama.py file; we hope this is fixed in a subsequent version) #404

Closed
Mioooooo opened this issue May 2, 2024 · 5 comments
Labels: bug (Something isn't working)

Comments

Mioooooo commented May 2, 2024

(Screenshot: the model-name handling code in vanna/ollama/ollama.py, around line 27.)

On line 27, if my model is llama:70b, it gets modified to llama:70b:latest.
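
Since the screenshot isn't preserved, here is a minimal sketch reconstructed from the description above; `normalize_model_name` is a hypothetical helper (the real check is inline in the Ollama class), but the condition matches the reported behavior:

```python
def normalize_model_name(model: str) -> str:
    """Append Ollama's default ':latest' tag to a model name.

    Hypothetical helper sketching the reported bug: the condition
    fires when a tag is already present instead of when it is missing.
    """
    if ":" in model:        # bug: true for already-tagged names like "llama:70b"
        model += ":latest"
    return model

print(normalize_model_name("llama:70b"))  # -> "llama:70b:latest" (wrong)
```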

Mioooooo added the bug label on May 2, 2024

Mioooooo commented May 2, 2024

For now I changed the check from `in` to `not in`, and it runs successfully.
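
Using the same hypothetical helper as above, a sketch of that one-word fix (the real change is made inline in vanna/ollama/ollama.py):

```python
def normalize_model_name(model: str) -> str:
    """Append ':latest' only when no tag is given (the in -> not in fix)."""
    if ":" not in model:    # only untagged names get the default tag
        model += ":latest"
    return model

print(normalize_model_name("llama:70b"))  # -> "llama:70b" (unchanged, correct)
print(normalize_model_name("llama"))      # -> "llama:latest"
```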


zainhoda commented May 2, 2024

Thanks @Mioooooo — would you like to submit a PR?


zainhoda commented May 9, 2024

@Mioooooo did v0.5.3 address this issue for you?


Mioooooo commented May 9, 2024

> @Mioooooo did v0.5.3 address this issue for you?

Yes, I updated to the latest version and the problem is solved. Thank you.


zainhoda commented May 9, 2024

Thanks for confirming!

zainhoda closed this as completed on May 9, 2024.