
Parea wrapper not re-raising root exception #166

Closed · seanr-cardless opened this issue Oct 20, 2023 · 2 comments · Fixed by #168
Labels: bug Something isn't working

seanr-cardless commented Oct 20, 2023

🐛 Bug Report

The Parea wrapper code returns from the finally block, which swallows any exception raised when actually calling the OpenAI models. This makes failures very difficult to log, monitor, and debug: the root exception is swallowed by Parea and the error only surfaces downstream.

Link to offending code.

return self._cleanup_trace(trace_id, start_time, error, cache_hit, args, kwargs, response)

🔬 How To Reproduce

Steps to reproduce the behavior:

  1. I'm not sure how to reproduce an OpenAI failure directly, but it can be monkey-patched if needed. Looking at the offending code probably provides all the context necessary.

Code sample

Try running this function: 1 is returned and the exception isn't raised.

def run():
    try:
        raise Exception("bad")
    except Exception as e:
        print(e)
        raise e      # re-raised here...
    finally:
        return 1     # ...but the return in finally discards the in-flight exception
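For contrast, a minimal sketch of the corrected pattern (hypothetical illustration, not Parea's actual fix): keep the cleanup work inside finally, but move the return outside the try/except/finally so a re-raised exception propagates to the caller.

```python
def run_fixed():
    result = None
    try:
        raise Exception("bad")
    except Exception as e:
        print(e)
        raise  # propagates, because finally no longer returns
    finally:
        result = 1  # cleanup still runs on every path
    return result   # only reached on the non-exception path
```

Calling run_fixed() now prints "bad" and raises the exception to the caller, instead of silently returning 1.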

Environment

  • OS: macOS (Apple silicon)
  • Python version: 3.9

📈 Expected behavior

The error isn't swallowed by Parea and is instead surfaced to the consumer of the OpenAI API call.

📎 Additional context

I ran into this using LangChain with the following (abbreviated) code:

    llm = ChatOpenAI(
        openai_api_key=openai_api_key,
        temperature=0,
        model_name=model,
        model_kwargs=llm_kwargs,
        max_retries=3
    )
    response = llm([HumanMessage(content="model query here...any will work")])

This became relevant during the OpenAI outage on 2023-10-19.

@seanr-cardless seanr-cardless added the bug Something isn't working label Oct 20, 2023
@joschkabraun joschkabraun self-assigned this Oct 20, 2023
joschkabraun (Contributor) commented:
Hey, thanks for raising! I will update you once it's resolved.

joschkabraun (Contributor) commented Oct 20, 2023:

@seanr-cardless it's fixed now with #168!
You can simply run pip install -U parea-ai
