Merge pull request BerriAI#2965 from BerriAI/litellm_fix_key_update
fix - delete key from inMemory Cache after /key/update
ishaan-jaff committed Apr 12, 2024
2 parents cd834e9 + beabec6 commit 8ba140b
Showing 1 changed file with 7 additions and 0 deletions.
litellm/proxy/proxy_server.py (7 additions, 0 deletions)

@@ -4449,6 +4449,13 @@ async def update_key_fn(request: Request, data: UpdateKeyRequest):
             response = await prisma_client.update_data(
                 token=key, data={**non_default_values, "token": key}
             )
+
+            # Delete - key from cache, since it's been updated!
+            # key updated - a new model could have been added to this key. it should not block requests after this is done
+            user_api_key_cache.delete_cache(key)
+            hashed_token = hash_token(key)
+            user_api_key_cache.delete_cache(hashed_token)
+
             return {"key": key, **response["data"]}
             # update based on remaining passed in values
         except Exception as e:
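The pattern in this patch is write-through invalidation: after persisting the updated key in the database, evict both the raw key and its hashed form from the in-memory cache so the next request re-reads fresh data instead of serving a stale entry. A minimal self-contained sketch of that idea, using hypothetical stand-ins for the proxy's `prisma_client`, `hash_token`, and cache class (the real implementations differ):

```python
import hashlib


class InMemoryCache:
    """Toy stand-in for the proxy's user_api_key_cache."""

    def __init__(self):
        self._store = {}

    def set_cache(self, key, value):
        self._store[key] = value

    def get_cache(self, key):
        return self._store.get(key)

    def delete_cache(self, key):
        # No-op if the key is absent, so eviction is always safe to call.
        self._store.pop(key, None)


def hash_token(token: str) -> str:
    # Hypothetical stand-in for the proxy's hash_token helper.
    return hashlib.sha256(token.encode()).hexdigest()


user_api_key_cache = InMemoryCache()


def update_key(key: str, new_data: dict, db: dict) -> dict:
    # Stand-in for prisma_client.update_data: persist the change first.
    db[key] = new_data

    # Evict stale entries. Lookups may have cached the key under either
    # its raw form or its hash, so both variants must be removed.
    user_api_key_cache.delete_cache(key)
    user_api_key_cache.delete_cache(hash_token(key))

    return {"key": key, **new_data}
```

Deleting rather than overwriting the cache entry keeps the update path simple: the next authenticated request misses the cache, reads the fresh row from the database, and repopulates the entry itself.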
