Insights: jxnl/instructor
September 23, 2024 – September 30, 2024
1 Release published by 1 person

- 1.5.0, published Sep 30, 2024
8 Pull requests merged by 3 people

- Bump version to 1.5 (#1028, merged Sep 30, 2024)
- Fixed up poetry dependencies and google gemini bug with jinja templating (#1023, merged Sep 26, 2024)
- Added new Response Body article (#1024, merged Sep 26, 2024)
- Fixed new templating feature throwing an error for gemini (#1021, merged Sep 26, 2024)
- Regenerated a poetry.lock (#1022, merged Sep 26, 2024)
- feat: implement jinja templating and rename kwarg to `context` (#1011, merged Sep 25, 2024)
- Expand litellm anthropic compatibility (#958, merged Sep 23, 2024)
- Updated Caching concepts to update prompt (#998, merged Sep 23, 2024)
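The templating work merged in #1011 lets prompt messages carry placeholders that are filled in from a `context` dict at call time. As a minimal dependency-free sketch of that idea — instructor itself renders Jinja templates (`{{ name }}`), while this uses the stdlib `string.Template`, and the `render_messages` helper is hypothetical, not the library's API:

```python
from string import Template

def render_messages(messages, context):
    # Substitute $placeholders in each message's content from the context
    # dict. (Hypothetical helper: instructor's real implementation renders
    # Jinja templates like {{ name }}; the shape of the idea is the same.)
    return [
        {**m, "content": Template(m["content"]).substitute(context)}
        for m in messages
    ]

messages = [
    {"role": "user", "content": "Extract the user: $name is $age years old."}
]
rendered = render_messages(messages, {"name": "Ada", "age": 36})
print(rendered[0]["content"])  # Extract the user: Ada is 36 years old.
```

Keeping the template and its variables separate is what makes features like this composable with caching and logging: the static prompt text and the per-call `context` can be handled independently.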
7 Pull requests opened by 5 people

- Docs : Add missing import in documentation example (#1016, opened Sep 24, 2024)
- docs: move mention of `max_retries` to the correct section (#1017, opened Sep 24, 2024)
- Added temperature parameter to RequestBody (#1019, opened Sep 25, 2024)
- fix: clean up cohere templating (#1030, opened Sep 30, 2024)
- fix: refactor handle_response_model (#1032, opened Sep 30, 2024)
- Add parse_from_string method to BatchJob (#1033, opened Sep 30, 2024)
- Update fake-data.md (#1034, opened Sep 30, 2024)
5 Issues closed by 2 people

- async for Groq + Claude (#1014, closed Sep 25, 2024)
- Instructor 1.4.0 doesn't support the groq client due to the `.refusal` error (#953, closed Sep 23, 2024)
- can not support sglang server (#954, closed Sep 23, 2024)
- Sudden error: openai.BadRequestError: Error code: 400 - Invalid schema for function (#961, closed Sep 23, 2024)
- LiteLLM incompatibility (#951, closed Sep 23, 2024)
4 Issues opened by 3 people

- Add new parsing method to BatchJob (#1031, opened Sep 30, 2024)
- Gemini handling of messages with "role", "parts" seems broken (#1025, opened Sep 26, 2024)
- Add temperature as a field for RequestBody in batch.py (#1018, opened Sep 25, 2024)
- Potential Issue with from_streaming_response_async Not Yielding Results (#1015, opened Sep 24, 2024)
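Issue #1018 (and the matching PR #1019) ask for an optional `temperature` field on the batch request model. A rough sketch of what such a change could look like — instructor's actual batch models are Pydantic-based, so this stdlib dataclass `RequestBody` is a simplified stand-in, and its other field names are illustrative:

```python
from dataclasses import dataclass, field, asdict
from typing import Optional

@dataclass
class RequestBody:
    # Simplified stand-in for a batch request model; only the addition of
    # `temperature` reflects the issue, the rest is illustrative.
    model: str
    messages: list = field(default_factory=list)
    max_tokens: int = 1000
    temperature: Optional[float] = None  # new optional sampling parameter

    def to_dict(self) -> dict:
        # Drop unset optional fields so the serialized request stays minimal
        # and providers fall back to their own defaults.
        return {k: v for k, v in asdict(self).items() if v is not None}

body = RequestBody(model="gpt-4o-mini", temperature=0.2)
print(body.to_dict())
```

Making the field optional and omitting it when unset keeps existing batch files byte-identical, so the change is backward compatible.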
4 Unresolved conversations

Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.

- Tenacity version out of date (#1000, commented on Sep 23, 2024 • 0 new comments)
- Streaming for Parallel tool execution (#922, commented on Sep 23, 2024 • 0 new comments)
- Lower cost of tokens via type definitions (#768, commented on Sep 28, 2024 • 0 new comments)
- [PROPOSAL] Allow the client to pass a callback that receives arguments Instructor sends to LLM providers (#911, commented on Sep 23, 2024 • 0 new comments)