I can't use two models with dspy.OpenAI
#915
Hi @Su3h7aM , #744 is related but not mergeable yet.
This is a bit unclear. Are the generations only from phi3, or does the …
Sorry, I'll try to be more specific.
Here, for example, all requests go to …
Here they all go to …
Are there any updates on this? Would love to be able to use both GPT-3.5 and GPT-4 within a single DSPy program.
Thanks for opening this! We released DSPy 2.5 yesterday, and I think the new release should resolve this. Here's the (very short) migration guide; it should typically take 2-3 minutes to change the LM definition and you should be good to go: https://github.com/stanfordnlp/dspy/blob/main/examples/migration.ipynb Please let us know if this resolves your issue. I will close for now, but please feel free to re-open if the problem persists.
When I try to use two different models using `dspy.OpenAI`, it ends up using only the last declaration. As in the example below, even using `llama3` it ends up using `phi3`.

In this case I am using `dspy.OpenAI`, but behind the scenes there are two instances of llama.cpp. (It does not appear to be a problem with llama.cpp, as the problem does not recur when using the API directly.)

When I try to use `dspy.OpenAI` for Llama-3 and `dspy.OllamaLocal` for Phi-3, everything works correctly.

In short: even when using two models, the responses are only generated by the last declared model.
Env:
DSP_CACHEBOOL=false
DSPy: v2.4.5
OpenAI: v1.23.4
Python: v3.11.9
OS: Fedora Linux 40 (KDE Plasma), Linux 6.8.7-300.fc40.x86_64
CPU: AMD Ryzen 7 5700X 8-Core Processor
RAM: 32010 Mi