
Support for local LLMs #8

Open

spicoflorin opened this issue Jun 6, 2024 · 1 comment

Comments

@spicoflorin

Hello!

In my opinion, this tool could be very helpful for data engineers performing time-consuming tasks such as data cleaning and data preparation.
I have observed that the currently supported LLMs are the ones from OpenAI, which involves API costs from a business perspective.
Therefore I have the following question:

Is there any plan to support open-source LLMs such as Llama?

Thanks, Florin

@zachary62
Contributor

Hi Florin,

Yes! The extension would be easy. This is the function for different LLM APIs:

```python
def call_llm_chat(messages, temperature=0.1, top_p=0.1, use_cache=True):
```

Do you have any open-source LLMs in mind?

From my experiments, LLMs comparable to GPT-4 are preferable (e.g., Claude 3 and Gemini Ultra would also be good). Additionally, the cost is relatively low even with GPT-4. For example, using Cocoon to clean and profile a table costs ~20 cents.
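For anyone who wants to experiment before official support lands, here is a minimal sketch of pointing a function with the `call_llm_chat` signature at a local, OpenAI-compatible server such as Ollama or llama.cpp's server. The `base_url`, model name, and the omission of caching are assumptions for illustration, not Cocoon's actual implementation:

```python
# Sketch only: route chat calls to a local OpenAI-compatible endpoint.
# Assumes Ollama is running locally (`ollama serve`) with a model pulled,
# e.g. `ollama pull llama3`. Not Cocoon's actual implementation.
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API at this address by default;
# local servers ignore the api_key, but the client requires a value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

def call_llm_chat(messages, temperature=0.1, top_p=0.1, use_cache=True):
    # use_cache is kept for signature compatibility; caching is omitted here.
    response = client.chat.completions.create(
        model="llama3",  # any locally available model name
        messages=messages,
        temperature=temperature,
        top_p=top_p,
    )
    return response.choices[0].message.content
```

The same sketch works for any server that speaks the OpenAI chat-completions protocol; only `base_url` and the model name would change.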
