Gemini-OpenAI-Proxy is a proxy that converts OpenAI API protocol calls into Google Gemini protocol calls, so that software built against the OpenAI API can use Gemini models transparently, without any changes.

If you want to use Google Gemini but don't want to modify your software, Gemini-OpenAI-Proxy is a great option. It lets you integrate Gemini's capabilities without doing any complex development work.
Get an API key from https://makersuite.google.com/app/apikey
✅ Gemini Pro
```shell
curl -s http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello, who are you?"}],
    "temperature": 0.7
  }'
```
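The same request can be made from TypeScript with plain `fetch`. This is a minimal sketch, assuming the proxy is running locally on port 8000 and the key is in a `GEMINI_API_KEY` environment variable; `buildChatRequest` and `chat` are illustrative helpers, not part of the proxy itself.

```typescript
// Illustrative helper (not part of the proxy): builds an OpenAI-style
// chat completion request body.
function buildChatRequest(model: string, prompt: string) {
  return {
    model,
    messages: [{ role: "user" as const, content: prompt }],
    temperature: 0.7,
  };
}

// Sketch: POST the request to a locally running proxy (assumed address).
async function chat(prompt: string): Promise<string | undefined> {
  const res = await fetch("http://localhost:8000/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.GEMINI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildChatRequest("gpt-3.5-turbo", prompt)),
  });
  const data = await res.json();
  return data.choices?.[0]?.message?.content;
}

// Only hit the network when a key is actually configured.
if (process.env.GEMINI_API_KEY) {
  chat("Hello, who are you?").then(console.log);
}
```

Because the proxy speaks the OpenAI protocol, any OpenAI-compatible client library pointed at the proxy's base URL should work the same way.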
✅ Gemini Pro Vision
```shell
curl -s http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4-vision-preview",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What do you see in this picture?"
          },
          {
            "type": "image_url",
            "image_url": {
              "url": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAADAAAAAnAgMAAAA0vyM3AAAACVBMVEX/4WwCAgF3aTMcpbzGAAAAa0lEQVR4nGOgAWB1QOYEIHFEcXKmhCBxQqYgcSLEEGymAFEEhzFAFYmTwNoA53A6IDmB1YETidPAiLBVFGgEgrNqJYIzNTQU4Z5QZA6QNQ3hGpAZcNegceBOADFQOQlQDhfQyUwLkPxKVwAABbkRCcDA66QAAAAASUVORK5CYII="
            }
          }
        ]
      }
    ],
    "stream": false
  }'
```
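To build the same multimodal message programmatically, you can assemble the OpenAI-style content array yourself. A sketch in TypeScript: `visionMessage` is an illustrative helper (not part of the proxy), and the data-URL prefix assumes a base64-encoded PNG.

```typescript
// Illustrative helper: wraps a prompt and a base64-encoded PNG into the
// OpenAI-style multimodal message shape the proxy accepts.
function visionMessage(text: string, base64Png: string) {
  return {
    role: "user" as const,
    content: [
      { type: "text", text },
      {
        type: "image_url",
        image_url: { url: `data:image/png;base64,${base64Png}` },
      },
    ],
  };
}

// Example payload for /v1/chat/completions; "QUJD..." is a placeholder,
// not a real image.
const payload = {
  model: "gpt-4-vision-preview",
  messages: [visionMessage("What do you see in this picture?", "QUJD")],
  stream: false,
};
console.log(payload.model);
```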
- `/v1/chat/completions`
  - stream
  - complete
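When `"stream": true` is set, the proxy answers with OpenAI-style server-sent events (`data: {...}` lines, terminated by `data: [DONE]`). A hedged sketch of consuming that stream from TypeScript, assuming the proxy runs on localhost:8000; `extractDelta` is an illustrative helper.

```typescript
// Illustrative helper: pull the text delta out of one OpenAI-style SSE line.
// Returns null for non-data lines and for the final "data: [DONE]" sentinel.
function extractDelta(line: string): string | null {
  if (!line.startsWith("data:")) return null;
  const payload = line.slice(5).trim();
  if (payload === "" || payload === "[DONE]") return null;
  return JSON.parse(payload).choices?.[0]?.delta?.content ?? null;
}

// Sketch: read the streamed response chunk by chunk (assumed local proxy).
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("http://localhost:8000/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.GEMINI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      const delta = extractDelta(line);
      if (delta !== null) process.stdout.write(delta);
    }
  }
}

// Only hit the network when a key is actually configured.
if (process.env.GEMINI_API_KEY) {
  streamChat("Hello!");
}
```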
| OpenAI Model         | Gemini Model                  |
| -------------------- | ----------------------------- |
| gpt-3.5-turbo        | gemini-1.0-pro-latest         |
| gpt-4                | gemini-1.5-pro-latest         |
| gpt-4-vision-preview | gemini-1.0-pro-vision-latest  |
| gpt-4-turbo          | gemini-1.5-pro-latest         |
| gpt-4o               | gemini-1.5-flash-latest       |
| gpt-4-turbo-preview  | gemini-1.5-pro-latest         |
| ...others            | gemini-1.0-pro-latest         |
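The table above amounts to a lookup with a default. A sketch of how such a mapping could be expressed in TypeScript — illustrative only, not the proxy's actual implementation:

```typescript
// Illustrative sketch of the OpenAI → Gemini model mapping from the table
// above (not the proxy's actual code).
const MODEL_MAP: Record<string, string> = {
  "gpt-3.5-turbo": "gemini-1.0-pro-latest",
  "gpt-4": "gemini-1.5-pro-latest",
  "gpt-4-vision-preview": "gemini-1.0-pro-vision-latest",
  "gpt-4-turbo": "gemini-1.5-pro-latest",
  "gpt-4o": "gemini-1.5-flash-latest",
  "gpt-4-turbo-preview": "gemini-1.5-pro-latest",
};

// Any model name not in the table falls back to gemini-1.0-pro-latest.
function toGeminiModel(openaiModel: string): string {
  return MODEL_MAP[openaiModel] ?? "gemini-1.0-pro-latest";
}

console.log(toGeminiModel("gpt-4o")); // gemini-1.5-flash-latest
```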
- Build command: `npm run build:cf_worker`
- Copy `main_cloudflare-workers.mjs` to Cloudflare Workers
- Build command: `npm run build:deno`
- Copy `main_deno.mjs` to Deno Deploy
- Build command: `npm run build:cf_worker`
- Alternatively, deploy with the CLI: `vercel deploy`
- Serve locally: `vercel dev`
- Note the Vercel Functions limitations (with the Edge runtime)
- Deno: `deno task start:deno`
- Node: `npm install && npm run start:node`
- Bun: `bun run start:bun`
```shell
docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:deno
## or
docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:bun
## or
docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:node
```