How to work with ChatOpenAI #1133

Open
ZCDu opened this issue Jun 12, 2024 · 4 comments

ZCDu commented Jun 12, 2024

I want to use LangChain's ChatOpenAI to interact with DSPy, but something went wrong. When execution reaches predict.langchain, the error is as follows:

  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/primitives/predict.py", line 77, in do_generate
    completions: list[dict[str, Any]] = generator(prompt, **kwargs)
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/gpt3.py", line 178, in __call__
    response = self.request(prompt, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/gpt3.py", line 144, in request
    return self.basic_request(prompt, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/gpt3.py", line 117, in basic_request
    response = chat_request(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/gpt3.py", line 263, in chat_request
    return v1_cached_gpt3_turbo_request_v2_wrapped(**kwargs).model_dump()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/cache_utils.py", line 16, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/gpt3.py", line 256, in v1_cached_gpt3_turbo_request_v2_wrapped
    return v1_cached_gpt3_turbo_request_v2(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/joblib/memory.py", line 655, in __call__
    return self._cached_call(args, kwargs)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/joblib/memory.py", line 598, in _cached_call
    out, metadata = self.call(*args, **kwargs)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/joblib/memory.py", line 856, in call
    output = self.func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/dsp/modules/gpt3.py", line 250, in v1_cached_gpt3_turbo_request_v2
    return openai.chat.completions.create(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/openai/_utils/_proxy.py", line 20, in __getattr__
    proxied = self.__get_proxied__()
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/openai/_utils/_proxy.py", line 55, in __get_proxied__
    return self.__load__()
           ^^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/openai/_module_client.py", line 12, in __load__
    return _load_client().chat
           ^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/openai/__init__.py", line 323, in _load_client
    _client = _ModuleClient(
              ^^^^^^^^^^^^^^
  File "/Users/zhaoguoqing/.version-fox/cache/python/v-3.12.0/python-3.12.0/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

What should I do to fix it?
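
(The error message itself names the two standard fixes: export the key as an environment variable before any OpenAI client is created, or pass the key to the client explicitly. A minimal sketch using DSPy's built-in OpenAI wrapper; the key value and model name are placeholders:)

import os
import dspy

# Fix 1: set the environment variable before the client is constructed.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key

# Fix 2: hand the key straight to DSPy's OpenAI wrapper and make it the default LM.
lm = dspy.OpenAI(model="gpt-3.5-turbo", api_key="sk-...")  # placeholder
dspy.settings.configure(lm=lm)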

tom-doerr (Contributor) commented

How do you set the API key?

ZCDu (Author) commented Jun 13, 2024

No. I'm deploying a local LLM with FastChat and using LangChain's ChatOpenAI for the conversation (ChatOpenAI works well), but in DSPy it doesn't work this way. For now I use HFClientVLLM to interact with the LLM. If LangChain's ChatOpenAI worked (not just OpenAI), DSPy would be more user-friendly.
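
(For a FastChat-style server that speaks the OpenAI wire protocol, one workaround that may help, depending on the DSPy version, is to point DSPy's own OpenAI wrapper at the local endpoint instead of going through LangChain. A sketch; the endpoint and model name are placeholders mirroring the ChatOpenAI settings later in this thread:)

import dspy

# Sketch: aim dspy.OpenAI at an OpenAI-compatible local server (e.g. FastChat).
# api_base and api_key mirror the base_url/api_key that ChatOpenAI was given.
lm = dspy.OpenAI(
    model="qwen-72b",             # placeholder: whatever the server serves
    api_base="http://ip:port/v1/",  # placeholder endpoint
    api_key="EMPTY",              # local servers often accept any token
)
dspy.settings.configure(lm=lm)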

arnavsinghvi11 (Collaborator) commented

Hi @ZCDu, how do you set the LangChain ChatOpenAI endpoint? If this is through a supported LM provider, it will not work in DSPy, which is why you are getting this error.

If you are using HFClientVLLM now, please refer to this documentation to configure it.
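
(For reference, the documented HFClientVLLM setup looks roughly like this; the host, port, and model below are placeholders for whatever the vLLM server was launched with:)

import dspy

# Sketch: connect DSPy to a running vLLM server and make it the default LM.
lm = dspy.HFClientVLLM(model="Qwen/Qwen-72B", port=8000, url="http://localhost")
dspy.settings.configure(lm=lm)

# Quick smoke test with a simple signature.
predict = dspy.Predict("question -> answer")
print(predict(question="What is DSPy?").answer)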

ZCDu (Author) commented Jun 18, 2024

> Hi @ZCDu, how do you set the LangChain ChatOpenAI endpoint? If this is through a supported LM provider, it will not work in DSPy, which is why you are getting this error.
>
> If you are using HFClientVLLM now, please refer to this documentation to configure it.

I set up the local LLM with this code:

from langchain_core.runnables import ConfigurableFieldSingleOption
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    temperature=0,
    base_url="ip:port/v1",
    api_key="EMPTY",
    max_tokens=2048,
    timeout=3,
).configurable_fields(
    model_name=ConfigurableFieldSingleOption(
        id="modelName",
        options={model: model for model in ["qwen-72b"]},
        default="qwen-72b",
    )
)

Then I use LangChainPredict(prompt, llm) and LangChainModule to run it, but it failed.
Thanks, I will try.
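
(For context, the DSPy/LangChain integration wires these pieces roughly as follows in the docs; the prompt and question are illustrative, llm is the ChatOpenAI instance defined above, and whether a configurable ChatOpenAI works here is exactly what this issue is about:)

from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from dspy.predict.langchain import LangChainPredict, LangChainModule

# Sketch of the documented wiring: LangChainPredict wraps the prompt + LLM,
# and LangChainModule wraps the whole chain so DSPy can optimize it.
prompt = PromptTemplate.from_template(
    "Answer the question briefly.\n\nQuestion: {question}\nAnswer:"
)
chain = LangChainPredict(prompt, llm) | StrOutputParser()
module = LangChainModule(chain)

print(module.invoke({"question": "What is the capital of France?"}))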
