sqlcoder LLM support? #303
Comments
I've used the 7B version of sqlcoder via Ollama and found it to be extremely slow for some reason compared to models like Mistral. I think if we use sqlcoder 70B it pretty much has to be via some API. Is there an API you were thinking of using?
Here's a benchmark that I ran. For the ones in purple, they were set up like this:

```python
class Vanna_Ollama(ChromaDB_VectorStore, Ollama):
    def __init__(self, config=None):
        ChromaDB_VectorStore.__init__(self, config=config)
        Ollama.__init__(self, config=config)

vn = Vanna_Ollama(config={'model': 'sqlcoder', 'path': path})
```

I'm not sure we need to do anything additional for running locally.
There is documentation here on which API to use. I can do some simple performance benchmarks, if you'd like. If possible, I can do this benchmark in Colab. I have managed to run 7B models in Colab before, but it could be that this model goes beyond the limit (RAM or VRAM). EDIT: Could you share the exact code you used to reproduce the benchmark for sqlcoder, as well as which dataset you used? Perhaps it was public?
I assume your benchmark runs the Vanna functions as-is, without catering to the prompt format of the SQL-expert open LLMs; hence the poor performance.
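For illustration, here is a hedged sketch of what such a prompt template could look like. sqlcoder is reportedly fine-tuned on a section-based prompt format; the exact headings and the helper name below are assumptions, so verify them against the defog-ai/sqlcoder repository before using this in a benchmark:

```python
def build_sqlcoder_prompt(question: str, ddl: str) -> str:
    # Hypothetical helper: formats a request in a section-based style
    # similar to what sqlcoder was fine-tuned on. The exact section
    # headings are an assumption -- check the sqlcoder README.
    return (
        "### Task\n"
        f"Generate a SQL query to answer the following question: {question}\n\n"
        "### Database Schema\n"
        f"{ddl}\n\n"
        "### SQL\n"
    )

prompt = build_sqlcoder_prompt(
    "How many users signed up last month?",
    "CREATE TABLE users (id INT, signup_date DATE);",
)
```

If the benchmark sends generic chat-style prompts instead of a template like this, a fine-tuned model can score far below its potential, which would explain the gap.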
Is your feature request related to a problem? Please describe.
Support for several proprietary and open-source LLMs has been added to Vanna.
However, one open-source LLM variant appears to outperform LLMs like GPT-4 and Claude 2.0 on SQL completion tasks.
I think it would be highly relevant to the community to add official support for it in the framework.
Even the 7B-parameter model outperforms GPT-4, so for SQL completion tasks this model seems like a no-brainer to use:
https://github.com/defog-ai/sqlcoder
Describe the solution you'd like
The different sqlcoder LLMs could be used through a common API, similar to the existing Ollama integration:
https://github.com/vanna-ai/vanna/blob/main/src/vanna/ollama/ollama.py
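As a minimal sketch of the local path (assuming Ollama's standard `/api/generate` HTTP endpoint; the helper name here is made up for illustration, not part of Vanna or Ollama), the request body for a locally pulled sqlcoder model is plain JSON:

```python
import json

def ollama_generate_body(prompt: str, model: str = "sqlcoder") -> str:
    # Build the JSON body for a POST to http://localhost:11434/api/generate.
    # "stream": False asks Ollama to return the whole completion in one
    # response instead of a stream of chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

body = ollama_generate_body("-- SQL for: how many users signed up last month?")
```

A hosted 70B variant would presumably swap this local endpoint for whatever API the provider exposes, which is the open question above.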
@zainhoda I can make a PR to add support for this LLM.