
Error when using Azure OpenAI model v1.0.0 #2333

Closed
JustTryAI opened this issue Jun 25, 2024 · 7 comments · Fixed by #2426
Labels
bug Something isn't working

Comments

@JustTryAI

Describe the bug
Can't use the Basic Prompting (Hello, World) starter project with the Azure OpenAI model; I'm using the same settings that have worked with previous pre-release builds.
The error message I'm getting is:

Error Building Component
Error building Component Azure OpenAI: build_model not found in AzureChatOpenAIComponent

Browser and Version
Edge Version 126.0.2592.68 (Official build) (64-bit)

To Reproduce
Steps to reproduce the behavior:

  1. Install v1.0.0
  2. Delete OpenAI model
  3. Drag and Drop Azure OpenAI model
  4. Select global variables
  5. Press the play button on Azure OpenAI model

Additional context
Everything worked fine in 1.0.0a60; I reinstalled a60 to check that the variables I'm using are correct, and everything works fine there.
Unable to test in v1.0.3, as nothing seems to be working with that build for me even after a clean install.

@dosubot dosubot bot added the bug Something isn't working label Jun 25, 2024

dosubot bot commented Jun 25, 2024

Hello @JustTryAI! I'm here to help you with any bugs, questions, or contributions. Let's work together to solve the issue you're facing.

The error "build_model not found in AzureChatOpenAIComponent" occurs because the AzureChatOpenAIComponent class does not have a method named build_model. Instead, it has a method named build. To resolve this error, you should call the build method with the appropriate parameters.

Here is the relevant part of the AzureChatOpenAIComponent class:

def build(
    self,
    model: str,
    azure_endpoint: str,
    azure_deployment: str,
    api_key: str,
    api_version: str,
    temperature: float = 0.7,
    max_tokens: Optional[int] = 1000,
) -> BaseLanguageModel:
    try:
        llm = AzureChatOpenAI(
            model=model,
            azure_endpoint=azure_endpoint,
            azure_deployment=azure_deployment,
            api_version=api_version,
            api_key=api_key,
            temperature=temperature,
            max_tokens=max_tokens,
        )
    except Exception as e:
        raise ValueError("Could not connect to AzureOpenAI API.") from e
    return llm

Ensure that you are calling the build method instead of build_model [1][2][3].
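For illustration, here is a minimal, self-contained sketch (hypothetical helper and simplified class, not Langflow's actual code) of how a framework that resolves a component method by name produces exactly this error when the declared name doesn't match an existing method:

```python
# Simplified stand-in for the real component: it defines build, not build_model.
class AzureChatOpenAIComponent:
    def build(self):
        return "llm"


def run_output_method(component, method_name):
    """Look up a method by name on the component, as a component
    framework might, and raise if the name doesn't exist."""
    method = getattr(component, method_name, None)
    if method is None:
        raise AttributeError(
            f"{method_name} not found in {type(component).__name__}"
        )
    return method()


component = AzureChatOpenAIComponent()
print(run_output_method(component, "build"))  # prints: llm
try:
    run_output_method(component, "build_model")  # reproduces the reported error
except AttributeError as e:
    print(e)  # prints: build_model not found in AzureChatOpenAIComponent
```

The dynamic `getattr` lookup is why the mismatch only surfaces at build time rather than at import time.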

To continue talking to Dosu, mention @dosu.

@JustTryAI
Author


@dosu. I don't believe this response is correct as the Azure OpenAI model uses the "model_response" function instead of the build_model function used by other models.


dosubot bot commented Jun 25, 2024

You are correct. The Azure OpenAI model component uses the model_response function instead of build_model. The error you're encountering suggests that the component is not correctly configured to use the model_response function.

To resolve this issue, ensure that your configuration and setup are correct. Here are the steps to verify and correct your setup:

  1. Ensure Correct Configuration: Make sure all required fields are correctly configured and provided. Specifically, check the following:

    • Model Name: Ensure you are using one of the supported model names.
    • Azure Endpoint: Provide the correct Azure endpoint URL.
    • Azure Deployment: Specify the correct deployment name.
    • API Version: Use one of the supported API versions.
    • API Key: Ensure the API key is correctly set.
    • Temperature: Set a valid temperature value.
    • Max Tokens: Set a valid maximum number of tokens.
    • Input Value: Provide the input text.
    • System Message: Optionally, provide a system message.
    • Stream: Set the stream option if needed.
  2. Example Configuration:

    config = {
        "model": "gpt-4",
        "azure_endpoint": "https://example-resource.azure.openai.com/",
        "azure_deployment": "your-deployment-name",
        "api_version": "2023-09-01-preview",
        "api_key": "your-api-key",
        "temperature": 0.7,
        "max_tokens": 1000,
        "input_value": "Hello, World!",
        "system_message": "Your system message here",
        "stream": False,
    }
  3. Check for Updates: Ensure you are using the latest version of the langflow library, as there may have been updates or bug fixes that address this issue.

  4. Refer to the AzureChatOpenAISpecsComponent Class: This class provides details on the supported models and API versions. Make sure you are using one of the supported models and API versions listed in the class.

If the issue persists, it may be helpful to review the implementation of the model_response function to ensure it is correctly handling the response generation for the Azure OpenAI model component.

For more details, you can refer to the AzureChatOpenAISpecsComponent class in the langflow library [1][2][3].
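The checklist above can be condensed into a small pre-flight check. This is a hypothetical helper (`validate_config` and `REQUIRED_KEYS` are not part of Langflow), using the required fields from the list above:

```python
# Required (non-optional) fields from the configuration checklist above.
REQUIRED_KEYS = {
    "model", "azure_endpoint", "azure_deployment", "api_version", "api_key",
}


def validate_config(config: dict) -> list:
    """Return the sorted names of required fields that are missing or empty."""
    return sorted(k for k in REQUIRED_KEYS if not config.get(k))


config = {
    "model": "gpt-4",
    "azure_endpoint": "https://example-resource.azure.openai.com/",
    "azure_deployment": "your-deployment-name",
    "api_version": "2023-09-01-preview",
    "api_key": "",  # left empty on purpose to show detection
    "temperature": 0.7,
    "max_tokens": 1000,
}
print(validate_config(config))  # prints: ['api_key']
```

Running a check like this before building the component makes it easier to rule out configuration problems as the cause of the error.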

To continue talking to Dosu, mention @dosu.

@blucas-accela

Same issue. Brand new install and same steps fail.

2024-06-26 09:56:30 - ERROR - base - build_model not found in AzureChatOpenAIComponent base.py:627
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code

Langflow v1.0.5
Python version = 3.12.3
OS = Mac Sonoma 14.4.1

@nikunjasahoo

Tried Langflow for the first time for experimentation purposes and received the following error on building the AzureOpenAI component:

AttributeError: build_model not found in AzureChatOpenAIComponent
ERROR 2024-06-27 09:03:02 - ERROR - service.py:76 - service - Error stopping tracing service: 'NoneType' object has no attribute 'cancel'

Verified all parameters are correctly configured.

Langflow v1.0.5
Python v3.12.1
MacOS Sonoma 14.5

@anantyash9

I ran into the same issue.
Changing the language model function in langflow/src/backend/base/langflow/components/models/AzureOpenAIModel.py fixed it. Rename the run_model function to build_model to match all the other language models:

```python
outputs = [
    Output(display_name="Text", name="text_output", method="text_response"),
    Output(display_name="Language Model", name="model_output", method="build_model"),
]

def build_model(self) -> LanguageModel:
    model = self.model
    azure_endpoint = self.azure_endpoint
    azure_deployment = self.azure_deployment
    api_version = self.api_version
    api_key = self.api_key
    temperature = self.temperature
    max_tokens = self.max_tokens
    stream = self.stream
```

(the rest of the function body is unchanged from run_model)

Then reinstall and copy the frontend static files to the correct location for your install directory.
This seems to fix the issue.
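The core of that fix can be sketched in isolation. Below is a simplified, self-contained model (the `Output` dataclass and `missing_methods` helper are illustrative stand-ins, not Langflow's real classes) showing that the string in each output's `method` field must name a method that actually exists on the component:

```python
from dataclasses import dataclass


@dataclass
class Output:
    display_name: str
    name: str
    method: str  # name of the component method that produces this output


class AzureOpenAIModelComponent:
    outputs = [
        Output(display_name="Text", name="text_output", method="text_response"),
        Output(display_name="Language Model", name="model_output", method="build_model"),
    ]

    def text_response(self):
        return "text"

    def build_model(self):  # renamed from run_model to match the Output entry
        return "language model"


def missing_methods(component):
    """Return the method names declared in outputs that the component lacks."""
    return [o.method for o in component.outputs if not hasattr(component, o.method)]


print(missing_methods(AzureOpenAIModelComponent()))  # prints: []
```

With the rename applied, no declared output method is missing; before the rename, `build_model` would appear in the missing list and trigger the reported error.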

@blucas-accela

#2339 has the fix in it
