OpenAI support for response_format: json_object #373
The OpenAI API actually has its own validation that checks that the word "json" was used in the system or user prompt, so I'll let that raise an error rather than adding my own validation that might not be necessary in the future.
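For reference, that server-side requirement can be approximated locally. This is a sketch based on an assumption about the rule; OpenAI only documents that the word "JSON" must appear somewhere in the messages when `json_object` mode is requested:

```python
def prompt_mentions_json(messages):
    """Rough local mirror of OpenAI's server-side check: when
    response_format is {"type": "json_object"}, at least one message
    must contain the word "json" (case-insensitive)."""
    return any("json" in (m.get("content") or "").lower() for m in messages)

# A prompt without the word fails the check; adding "in JSON" passes it.
print(prompt_mentions_json([{"role": "user", "content": "3 names and short bios for pet pelicans"}]))
print(prompt_mentions_json([{"role": "user", "content": "3 names and short bios for pet pelicans in JSON"}]))
```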
I'm going to add a
I'm going to go with
llm -m gpt-4-turbo '3 names and short bios for pet pelicans' -o json_object 1

llm -m gpt-4-turbo '3 names and short bios for pet pelicans in JSON' -o json_object 1

{
"pelicans": [
{
"name": "Gus",
"bio": "Gus is a curious young pelican with an insatiable appetite for adventure. He's known amongst the dockworkers for playfully snatching sunglasses. Gus spends his days exploring the marina and is particularly fond of performing aerial tricks for treats."
},
{
"name": "Sophie",
"bio": "Sophie is a graceful pelican with a gentle demeanor. She's become somewhat of a local celebrity at the beach, often seen meticulously preening her feathers or posing patiently for tourists' photos. Sophie has a special spot where she likes to watch the sunset each evening."
},
{
"name": "Captain Beaky",
"bio": "Captain Beaky is the unofficial overseer of the bay, with a stern yet endearing presence. As a seasoned veteran of the coastal skies, he enjoys leading his flock on fishing expeditions and is always the first to spot the fishing boats returning to the harbor. He's respected by both his pelican peers and the fishermen alike."
}
]
}
The GPT-4 options that appear when you run
Results in the following error:
Is forcing the json_object parameter messing up the OpenAI request?
That's because
Try this:

model = llm.get_model("gpt-4-turbo-preview")
response = model.prompt(
    "Five surprising names for a pet pelican as JSON",
    system="Answer like GlaDOS",
    seed=0,
    json_object=True
)
print(response)

Note that you have to include the word "JSON" in your prompt or you'll get a different error back from OpenAI.

I got this:

{
"surprising_pet_pelican_names": [
{
"name": "Mr. Pockets",
"reason": "Because who would expect a pelican to carry around more than fish in their beak pouch?"
},
{
"name": "Sir Nibsalot",
"reason": "It sounds more like a name suited for a tiny, nippy pet rather than a grand, majestic pelican."
},
{
"name": "Duchess Beaky",
"reason": "A title of nobility for a bird? How preposterously delightful!"
},
{
"name": "Professor Waddles",
"reason": "One would envision a penguin with this name, not a sleek pelican."
},
{
"name": "Dr. Fishenstein",
"reason": "Attributing a doctorate in fish science to a pelican is both absurd and genius."
}
]
}
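Since json_object mode guarantees the returned content is valid JSON, the response text can be fed straight to a JSON parser. A minimal sketch, using an abbreviated stand-in string for the response shown above:

```python
import json

# Abbreviated stand-in for the text of the response above
raw = '{"surprising_pet_pelican_names": [{"name": "Mr. Pockets"}]}'

data = json.loads(raw)  # guaranteed to parse in json_object mode
names = [item["name"] for item in data["surprising_pet_pelican_names"]]
print(names)  # ['Mr. Pockets']
```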
@simonw running
I'm on 0.13.1, if that makes any difference.
New feature released at DevDay: you can now pass
"response_format": {"type": "json_object"}
to most of the OpenAI models (not GPT-4 Vision yet) to force the result to be returned as valid JSON: https://platform.openai.com/docs/api-reference/chat/create#chat-create-response_format
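For anyone calling the API directly rather than through a wrapper, the parameter sits at the top level of the chat completion request body. A minimal sketch of such a payload (model name and prompt are illustrative):

```python
import json

payload = {
    "model": "gpt-4-turbo-preview",
    "response_format": {"type": "json_object"},
    # The word "JSON" must appear in the messages or the API rejects the request
    "messages": [
        {"role": "user", "content": "Three pet pelican names as JSON"}
    ],
}
print(json.dumps(payload, indent=2))
```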