AttributeError: 'OpenAI' object has no attribute 'kwargs' #1141

Closed
rohanprasadsap opened this issue Jun 12, 2024 · 3 comments
Comments

@rohanprasadsap

I am trying to use LangChain's implementation of OpenAI instead of DSPy's:

llm = OpenAI(model_name=model_name)
# llm = dspy.OpenAI(model=model_name, max_tokens=250)

dspy.settings.configure(lm=llm)

I was working through the minimal working example. I have set up the API keys and other requirements and followed all the steps described there. My openai version is 1.30.4.

When I compile and evaluate the model, I get this error:

from dspy.teleprompt import BootstrapFewShot

# Set up the optimizer: we want to "bootstrap" (i.e., self-generate) 4-shot examples of our CoT program.
config = dict(max_bootstrapped_demos=4, max_labeled_demos=4)

# Optimize! Use the `gsm8k_metric` here. In general, the metric is going to tell the optimizer how well it's doing.
teleprompter = BootstrapFewShot(metric=gsm8k_metric, **config)
optimized_cot = teleprompter.compile(CoT(), trainset=gsm8k_trainset)

Error:
AttributeError: 'OpenAI' object has no attribute 'kwargs'

What am I missing while defining the llm? How can I resolve this?

@tom-doerr
Contributor

What's the drawback of using DSPy's OpenAI client?

@arnavsinghvi11
Collaborator

Hi @rohanprasadsap ,

LangChain's OpenAI LM is currently not supported as a DSPy LM, so it does not inherit the properties needed to run DSPy pipelines. Feel free to share any particular use case for preferring LangChain's OpenAI LM over the current dspy.OpenAI LM and we can explore whether any changes are needed!
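
For reference, a minimal sketch of the suggested setup with dspy.OpenAI (the model name below is a placeholder; substitute your own):

import dspy

# DSPy's own OpenAI client carries the internal state (e.g. the `kwargs` attribute)
# that BootstrapFewShot and other DSPy components expect, unlike LangChain's OpenAI wrapper.
turbo = dspy.OpenAI(model="gpt-3.5-turbo", max_tokens=250)
dspy.settings.configure(lm=turbo)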

@okhat okhat closed this as completed Jun 27, 2024
@venkatganesh96

venkatganesh96 commented Jul 9, 2024

For me it worked; you have to use DSPy to initialize the LLM, like below:

turbo = dspy.AzureOpenAI(
    deployment_id=model_name,
    api_key=openai_api_key,
    api_base=azure_endpoint,
    model_type="chat",
    api_version=openai_api_version,
    max_tokens=4000,
)
dspy.settings.configure(lm=turbo)
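
A quick sanity check (a hypothetical minimal example, not from the original comment) to confirm the configured LM responds before running the optimizer:

# Ask a trivial question through a simple Predict module to verify the LM is wired up.
qa = dspy.Predict("question -> answer")
print(qa(question="What is 2 + 2?").answer)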
