How to work with ChatOpenAI #1133
How do you set the API key?
No, I'm deploying a local LLM with FastChat and using LangChain's ChatOpenAI for the conversation (ChatOpenAI works well there), but in DSPy it doesn't work this way. For now I use HFClientVLLM to interact with the LLM. If LangChain's ChatOpenAI worked (not just OpenAI), DSPy would be more user-friendly.
Hi @ZCDu, how do you set the LangChain ChatOpenAI endpoint? If this is through a supported LM provider, it will not work in DSPy, which is why you are getting this error. If you are using HFClientVLLM now, please refer to this documentation to configure it.
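For reference, a minimal sketch of configuring `HFClientVLLM` as the thread suggests. The model name, port, and URL below are assumptions; match them to however you launched the vLLM OpenAI-compatible server, and check your DSPy version's docs for the exact client name and arguments.

```python
import dspy

# Assumption: a vLLM server is already running locally, e.g. started with
#   python -m vllm.entrypoints.openai.api_server --model <model> --port 8000
# The model id below is a placeholder, not the thread author's actual model.
lm = dspy.HFClientVLLM(
    model="meta-llama/Llama-2-7b-chat-hf",  # hypothetical model id
    port=8000,
    url="http://localhost",
)

# Make this LM the default for all DSPy modules in this process.
dspy.settings.configure(lm=lm)
```

This routes DSPy's calls through the local vLLM endpoint instead of a hosted provider, which is why it avoids the error the thread describes.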
I set up the local LLM with code, then used LangChainPredict(prompt, llm) and LangChainModule to run it, but it failed.
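The pattern being attempted looks roughly like the sketch below: point LangChain's ChatOpenAI at the FastChat server's OpenAI-compatible endpoint, then wrap it with DSPy's LangChain bridge. The endpoint URL, API key, and model name are placeholders, and whether ChatOpenAI is accepted here depends on the DSPy version, which is exactly the failure this thread reports.

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import PromptTemplate
from dspy.predict.langchain import LangChainPredict, LangChainModule

# Assumption: FastChat is serving an OpenAI-compatible API at localhost:8000.
# "EMPTY" is the conventional dummy key for local OpenAI-compatible servers.
llm = ChatOpenAI(
    openai_api_base="http://localhost:8000/v1",
    openai_api_key="EMPTY",
    model="vicuna-7b-v1.5",  # hypothetical local model name
)

prompt = PromptTemplate.from_template("Answer the question: {question}")

# Wrap the LangChain runnable so DSPy can optimize/run it.
chain = LangChainPredict(prompt, llm)
module = LangChainModule(chain)
```

If `LangChainPredict` rejects ChatOpenAI in your DSPy version, the workaround from earlier in the thread is to drop the LangChain layer and use `HFClientVLLM` (or DSPy's OpenAI client with a custom `api_base`) directly.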
I want to use ChatOpenAI from langchain-core to interact with DSPy, but something went wrong. When it reaches predict.langchain, the error is as follows.
What should I do to fix it?