
[Feature Request]: Add structured_output to Gemini #13840

Open
kim-borgen opened this issue May 30, 2024 · 0 comments
Labels
enhancement New feature or request triage Issue needs to be triaged/prioritized

Feature Description

Hi,

Would it be possible to add a structured_output function to the Gemini class?
https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-gemini/llama_index/llms/gemini/base.py

This could perhaps be done by passing a generation_config to the send_message call, something like:

response = chat.send_message(
    next_msg,
    generation_config=GenerationConfig(response_mime_type="application/json"),
)

Then some logic would be needed to embed the pydantic schema in the message, and to parse the JSON response back into the pydantic object afterwards, but I don't know the best approach.
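To illustrate the idea, here is a minimal, self-contained sketch of that flow. The names `structured_predict`, `schema_hint`, `fake_gemini_call`, and the `Song` class are all hypothetical, and the actual Gemini call is stubbed out; in a real implementation the stub would be replaced by `chat.send_message(..., generation_config=GenerationConfig(response_mime_type="application/json"))`, and a pydantic model (with `model_json_schema()` / `model_validate_json()`) would likely replace the plain dataclass used here to keep the example dependency-free.

```python
import json
from dataclasses import dataclass, fields


@dataclass
class Song:
    title: str
    artist: str


def schema_hint(cls) -> str:
    # Describe the expected JSON keys/types so the model knows the shape.
    # With pydantic this would be cls.model_json_schema() instead.
    return json.dumps({f.name: f.type.__name__ for f in fields(cls)})


def fake_gemini_call(prompt: str) -> str:
    # Stand-in for the real API call. With
    # response_mime_type="application/json" set in the generation
    # config, Gemini returns a bare JSON string like this.
    return '{"title": "Imagine", "artist": "John Lennon"}'


def structured_predict(cls, question: str):
    # 1) Embed the schema in the message, 2) call the model,
    # 3) parse the JSON back into the typed object.
    prompt = (
        f"{question}\n"
        f"Answer only with JSON matching this schema: {schema_hint(cls)}"
    )
    raw = fake_gemini_call(prompt)
    return cls(**json.loads(raw))


song = structured_predict(Song, "Name a famous song.")
print(song)  # Song(title='Imagine', artist='John Lennon')
```

The same three steps (schema into prompt, JSON mime type on the response, validated parse on the way out) are roughly what other LLM integrations in llama_index do for structured output, so the Gemini class could plausibly follow the same pattern.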

Reason

No response

Value of Feature

Simplifying structured interaction with Gemini models.

@kim-borgen kim-borgen added enhancement New feature or request triage Issue needs to be triaged/prioritized labels May 30, 2024