Google PaLM API models #239
I am adding the following notebook, which I believe the code example above is derived from. Would be great to have a notebook describing how to do this or similar :]
Perhaps you want to take a look at the recently added vanna/src/vanna/ollama/__init__.py (line 6 in fb384d4).
I added the code below, which should give you some ideas on what is required to add support for any other LLM model. Perhaps that is exactly what you were looking for, @hugoferrero? :]
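The Ollama integration mentioned above suggests a pattern for hooking a new LLM into Vanna. Below is a minimal sketch of that shape, assuming the method names (`system_message`, `user_message`, `assistant_message`, `submit_prompt`) follow the role/content message format used elsewhere in this thread; the class name and the PaLM call itself are placeholders, not the actual PR code.

```python
class GooglePalmChat:
    """Hedged sketch of a custom LLM connector for Vanna, modeled on the
    Ollama example in vanna/src/vanna/ollama/__init__.py. Names here are
    assumptions, not the real implementation from PR #264."""

    def __init__(self, config=None):
        self.config = config or {}

    # Vanna assembles prompts as a list of role/content dicts:
    def system_message(self, message: str) -> dict:
        return {"role": "system", "content": message}

    def user_message(self, message: str) -> dict:
        return {"role": "user", "content": message}

    def assistant_message(self, message: str) -> dict:
        return {"role": "assistant", "content": message}

    def submit_prompt(self, prompt, **kwargs) -> str:
        # Placeholder: here you would call the PaLM chat endpoint
        # (e.g. via the google.generativeai SDK) and return its text.
        raise NotImplementedError("wire up the PaLM API call here")
```

The three message helpers just tag a string with a role; all the provider-specific work lives in `submit_prompt`.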
Thanks for the response @andreped. I will try it and send you feedback.
@hugoferrero if you happen to make progress on this, could you pass along your code and we can potentially integrate this into the main Vanna repo?
@zainhoda I am open to drafting a PR for this :] I can tag you in, @hugoferrero, if you wish to test it before merging.
I made a PR #264. It is a rather simple implementation, but sadly I do not have access to Google Cloud, so I am dependent on some of you to test it. To install:
Then you should be able to initialize it with a vector DB like Chroma like so:
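Vanna documents a multiple-inheritance pattern for pairing a vector store with an LLM class. The sketch below shows that shape; the real imports would be something like `from vanna.chromadb import ChromaDB_VectorStore` plus the PaLM class from PR #264, but both names are assumptions here, so stand-in stubs are used to keep the example self-contained.

```python
class ChromaDB_VectorStore:  # stand-in stub for Vanna's Chroma vector store
    def __init__(self, config=None):
        self.vector_config = config or {}

class GooglePalmChat:  # stand-in stub for the PaLM class from PR #264
    def __init__(self, config=None):
        self.llm_config = config or {}

class MyVanna(ChromaDB_VectorStore, GooglePalmChat):
    def __init__(self, config=None):
        # Initialize both bases explicitly so each receives the config:
        ChromaDB_VectorStore.__init__(self, config=config)
        GooglePalmChat.__init__(self, config=config)

# "chat-bison" and the config keys are illustrative assumptions:
vn = MyVanna(config={"api_key": "YOUR_PALM_API_KEY", "model": "chat-bison"})
```

The combined class gets retrieval from the vector store and text generation from the LLM class; calling both `__init__` methods explicitly avoids relying on cooperative `super()` calls in the bases.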
@andreped I've tried the code and it's not working; it gives errors like:
I then tried explicitly specifying the parameters as below:
This gave me the error:
I also attempted to specify the model more explicitly:
This gave me this error:
Please help 🙏🏼
Hello, @yedhukr! :] Great that you were able to test the implementation! I don't have access to Google Cloud, so I have no way of testing it. Perhaps someone could reach out and lend me an API key so that I could debug this properly? Just for this PR; the key could be rotated afterwards. @zainhoda?
@andreped Let me know if there's anything else I can do to help! Modifying the function in this manner gets it to run, but I get a response like:
Which user prompt did you use? It also sounds like you are missing the system message that Vanna uses. As a test, could you try feeding in the system message that the Vanna base class uses here? Basically, change:
If that works, I think I know how to fix the issue. EDIT: It could also be that you just wrote
@andreped [{'role': 'system', 'content': 'The user provides a question and you provide SQL...;} ... {'role': 'user', 'content': 'What are the top 5 properties by sales in 2023?'}]
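The message list above uses a `system` role, and the discussion suggests the PaLM endpoint may not accept one directly. A plausible workaround (an assumption on my part, not the fix from the PR) is to flatten the role/content list into a single prompt string, with the system message first:

```python
def to_palm_prompt(messages):
    """Flatten Vanna-style role/content messages into one prompt string.
    Assumes (hypothetically) that the PaLM chat endpoint has no separate
    'system' role, so the system message is hoisted to the front."""
    parts = []
    for m in messages:
        if m["role"] == "system":
            parts.insert(0, m["content"])
        else:
            parts.append(f"{m['role']}: {m['content']}")
    return "\n\n".join(parts)

prompt = to_palm_prompt([
    {"role": "system", "content": "The user provides a question and you provide SQL."},
    {"role": "user", "content": "What are the top 5 properties by sales in 2023?"},
])
```

If the model is returning generic text instead of SQL, a missing or dropped system message is a likely cause, which is what this kind of flattening would address.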
Hi guys. Sorry, I can't lend the API key; I have a corporate account.
No problem. I will check around to see if I can get a new trial. Maybe I just need to set up a new account :P
Hi. I want to try Vanna AI with the PaLM API models (bison). Do you have any tutorial or documentation on how to set up those models with Vanna? It is not clear to me how to implement any other model if you choose "Other LLM" in the configuration options. Here is the code I can't figure out how to adapt to the PaLM API models: