Async Rust library for OpenAI
_Logo created by this repo itself_
`async-openai` is an unofficial Rust library for the OpenAI REST API.
- It's based on the OpenAI OpenAPI spec.
- Current features:
  - Audio
  - Chat (including SSE streaming)
  - Completions (including SSE streaming)
  - Edits
  - Embeddings
  - Files
  - Fine-Tuning (including SSE streaming)
  - Images
  - Microsoft Azure Endpoints
  - Models
  - Moderations
- All requests, including form submissions (except SSE streaming), are retried with exponential backoff when rate limited by the API server (a configuration sketch follows this list).
- Ergonomic builder pattern for all request objects.
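The retry behavior can also be tuned when constructing the client. Below is a minimal sketch, assuming a version of the crate that exposes a `with_backoff` builder method and uses the `backoff` crate for its retry policy; the duration and method availability should be verified against the release you depend on.

```rust
use async_openai::Client;
use std::time::Duration;

#[tokio::main]
async fn main() {
    // Illustrative policy: stop retrying after 60 seconds of total elapsed time.
    // Requires the `backoff` crate as a dependency in Cargo.toml.
    let backoff = backoff::ExponentialBackoffBuilder::new()
        .with_max_elapsed_time(Some(Duration::from_secs(60)))
        .build();

    // Assumed API: `with_backoff` swaps in a custom policy for rate-limited requests.
    let _client = Client::new().with_backoff(backoff);
}
```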
Note on Azure OpenAI Service: `async-openai` primarily implements the OpenAI APIs and exposes the same interface for the Azure OpenAI Service. In practice, the Azure OpenAI Service provides only a subset of the OpenAI APIs.
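For Azure endpoints, recent versions of the crate let you build the client from an `AzureConfig`. A minimal sketch follows; the endpoint, deployment id, API version, and key are placeholders, and the config builder may be named differently in older releases.

```rust
use async_openai::{config::AzureConfig, Client};

#[tokio::main]
async fn main() {
    // All values below are placeholders; substitute your own Azure resource details.
    let config = AzureConfig::new()
        .with_api_base("https://my-resource.openai.azure.com")
        .with_api_version("2023-05-15")
        .with_deployment_id("my-deployment")
        .with_api_key("...");

    // The same client type is used for both OpenAI and Azure endpoints.
    let _client = Client::with_config(config);
}
```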
The library reads the API key from the `OPENAI_API_KEY` environment variable.
```sh
# On macOS/Linux
export OPENAI_API_KEY='sk-...'

# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
```
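If you prefer not to rely on the environment variable, the key can also be supplied programmatically. A minimal sketch, assuming the `OpenAIConfig` builder available in recent versions of the crate:

```rust
use async_openai::{config::OpenAIConfig, Client};

#[tokio::main]
async fn main() {
    // Prefer loading the key from a secrets manager or config file;
    // the literal below is only a placeholder.
    let config = OpenAIConfig::new().with_api_key("sk-...");
    let _client = Client::with_config(config);
}
```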
- Visit the examples directory to see how to use `async-openai`.
- Visit docs.rs/async-openai for docs.
```rust
use async_openai::{
    types::{CreateImageRequestArgs, ImageSize, ResponseFormat},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // create client, reads OPENAI_API_KEY environment variable for API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to ./data directory.
    // Each url is downloaded and saved in dedicated Tokio task.
    // Directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```
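The feature list above also calls out SSE streaming for chat. Here is a hedged sketch of consuming a streamed chat completion; the model name is only an example, and the message-builder types shown (`ChatCompletionRequestMessageArgs`, `Role`) follow an earlier 0.x API, so they may be named differently in the version you use.

```rust
use async_openai::{
    types::{ChatCompletionRequestMessageArgs, CreateChatCompletionRequestArgs, Role},
    Client,
};
use futures::StreamExt;
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    let client = Client::new();

    // Build a single-message chat request; the model name is illustrative.
    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-3.5-turbo")
        .max_tokens(64u16)
        .messages([ChatCompletionRequestMessageArgs::default()
            .role(Role::User)
            .content("Write a haiku about Rust.")
            .build()?])
        .build()?;

    // create_stream returns a stream of SSE chunks instead of a single response.
    let mut stream = client.chat().create_stream(request).await?;

    while let Some(result) = stream.next().await {
        let response = result?;
        // Each chunk carries an incremental delta; print content as it arrives.
        for choice in response.choices {
            if let Some(content) = choice.delta.content {
                print!("{content}");
            }
        }
    }
    println!();

    Ok(())
}
```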
Thank you for taking the time to contribute and improve the project; I'd be happy to have you!
A good starting point would be the existing open issues.
- openai-func-enums provides procedural macros that make it easier to use this library with the OpenAI API's function-calling feature. It also provides derive macros you can add to existing clap application subcommands, enabling natural-language use of command-line tools.
This project is licensed under the MIT license.