
Azure OpenAI Proxy

In complex, distributed scenarios, we may need to multiplex between multiple OpenAI clients and multiple Azure OpenAI deployments. It is also useful to track and attribute the cost of requests per user and/or endpoint.

This project does exactly that: a simple HTTP proxy server that sits between the client and the Azure OpenAI endpoints, dispatches each request to one of several deployments, and tracks usage and cost after every request.
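The actual implementation lives in the notebook referenced below; the sketch here only illustrates the idea, assuming a FastAPI + httpx stack, a round-robin dispatch strategy, a hypothetical `x-user-id` header for attribution, and illustrative per-token prices. Endpoint URLs, keys, and prices are placeholders, not values from this repository.

```python
import itertools
from collections import defaultdict

import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

# Hypothetical deployment pool; replace with your own endpoints and keys.
DEPLOYMENTS = [
    {"endpoint": "https://example-eastus.openai.azure.com", "api_key": "<key-1>"},
    {"endpoint": "https://example-westus.openai.azure.com", "api_key": "<key-2>"},
]

# Illustrative prices in USD per 1K tokens; substitute your model's real pricing.
PROMPT_PRICE_PER_1K = 0.003
COMPLETION_PRICE_PER_1K = 0.004

app = FastAPI()
_round_robin = itertools.cycle(DEPLOYMENTS)
usage_by_user = defaultdict(
    lambda: {"prompt_tokens": 0, "completion_tokens": 0, "cost": 0.0}
)


@app.post("/openai/deployments/{deployment}/chat/completions")
async def proxy_chat_completions(deployment: str, request: Request):
    """Forward a chat-completions call to the next deployment and record its cost."""
    target = next(_round_robin)
    user = request.headers.get("x-user-id", "unknown")
    body = await request.json()
    params = dict(request.query_params)  # e.g. api-version

    async with httpx.AsyncClient(timeout=60.0) as client:
        upstream = await client.post(
            f"{target['endpoint']}/openai/deployments/{deployment}/chat/completions",
            params=params,
            json=body,
            headers={"api-key": target["api_key"]},
        )

    payload = upstream.json()

    # Azure OpenAI returns token counts in the "usage" field of the response;
    # accumulate them per user and convert to an approximate cost.
    usage = payload.get("usage", {})
    record = usage_by_user[user]
    record["prompt_tokens"] += usage.get("prompt_tokens", 0)
    record["completion_tokens"] += usage.get("completion_tokens", 0)
    record["cost"] += (
        usage.get("prompt_tokens", 0) / 1000 * PROMPT_PRICE_PER_1K
        + usage.get("completion_tokens", 0) / 1000 * COMPLETION_PRICE_PER_1K
    )

    return JSONResponse(content=payload, status_code=upstream.status_code)


@app.get("/usage")
async def report_usage():
    """Return accumulated per-user token counts and cost estimates."""
    return usage_by_user
```

With a sketch like this, clients keep using the standard Azure OpenAI request shape and simply point their base URL at the proxy; the round-robin cycle can be swapped for weighted or least-loaded selection without changing the client side.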

See the notebook for the full implementation.
