
Explain rails not working for chat models #411

Closed
trebedea opened this issue Mar 20, 2024 · 0 comments · Fixed by #412
Comments

@trebedea (Collaborator)

Calling explain after generating a response with a LangChain chat model does not work, e.g.

info = rails.explain()
info.print_llm_calls_summary()

outputs No LLM calls were made.

The same flow works correctly for completion (non-chat) models.
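
For reference, a minimal reproduction along these lines exercises the failing path; the config directory and the chat model selected in it are assumptions, not part of the original report.

```python
# Minimal reproduction sketch (assumed setup, not from the original report):
# ./config is a guardrails configuration whose config.yml selects a
# LangChain chat model (e.g. an OpenAI chat engine).
from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./config")
rails = LLMRails(config)

# Generate a response through the guardrails runtime.
response = rails.generate(
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response)

# Inspect the LLM calls made during the last generation.
info = rails.explain()
info.print_llm_calls_summary()
# Expected: a summary of the LLM calls that were made; with chat models
# this instead prints "No LLM calls were made."
```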

@trebedea added the bug label on Mar 20, 2024
@trebedea self-assigned this on Mar 20, 2024
drazvan added a commit that referenced this issue on Mar 20, 2024:
Fix #411 - explain rails not working for chat models