# PROXY and SOCKS5 set the proxy address
# TIMEOUT is the request timeout, in seconds
# web_search toggles web search
# provider sets the API provider
3. Click the deployed service, go to Settings -> Networking -> Generate Domain (generate a random domain) or Custom Domain (bind your own domain). You can then access the service through that domain!
Tips: if you are not a new user and have already used up the five dollars of credit, don't panic. Click your avatar in the top-right corner, go to Settings -> General, scroll to the bottom and choose Delete Account; yes, that deletes and unbinds the account. Then log in with GitHub again to re-bind it, and you will find the five dollars of credit is back.
```sh
docker pull mouxan/g4f
docker run -d --restart always --name gpt4free \
  -p 8080:80 \
  -e PROXY=https://127.0.0.1:3333 \
  -e TIMEOUT=60 \
  -e web_search=true \
  -e provider='Bing' \
  mouxan/g4f
```
```yaml
version: '3'
services:
  gpt4free:
    container_name: gpt4free
    image: mouxan/g4f:latest
    restart: always
    environment:
      PROXY: https://127.0.0.1:3333
      TIMEOUT: 60
      web_search: true
      provider: 'Bing'
    ports:
      - 8080:80
```
```sh
git clone https://github.com/mouxangithub/gpt4free.git
cd gpt4free
pip install -r requirements.txt
python -m cli all
```
https://127.0.0.1/chat/
- completions endpoint
```sh
curl --location 'https://127.0.0.1/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "web_search": true,
    "provider": "Bing",
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "hi"}]
}'
# web_search toggles web search; currently the gpt-4 models handle it better, and some gpt-3.5 providers seem unable to search the web
# provider is the API provider name; query /v1/providers below to see the available provider names and pick one whose working flag is true, or set it via the environment variable. If omitted, providers are tried automatically: when provider A fails, the request falls through to provider B and so on. If the parameter is set, there is no fallback.
```
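For reference, the same request can be issued from Python. Below is a minimal sketch using the `requests` package; it mirrors the curl call above and assumes the response follows the OpenAI chat-completion schema (`choices[0].message.content`).

```python
import requests

# Mirror of the curl example above; point this at your own deployment.
payload = {
    "web_search": True,   # works best with the gpt-4 models
    "provider": "Bing",   # optional: pin one provider (disables fallback)
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "hi"}],
}

resp = requests.post(
    "https://127.0.0.1/v1/chat/completions",
    json=payload,
    verify=False,  # self-hosted instances often use a self-signed certificate
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```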
- providers endpoint: view the provider list
```sh
curl --location 'https://127.0.0.1/v1/providers/<provider_name>'
```
- models endpoint: view the models the API supports and the providers that serve each model
```sh
curl --location 'https://127.0.0.1/v1/models/<model_name>'
```
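Both lookup endpoints can be queried the same way. A small sketch with `requests`, reusing the base URL from the examples above; the exact JSON layout of the responses is not shown here, so the sketch simply prints them:

```python
import requests

base = "https://127.0.0.1"

# List the available providers (pick one whose "working" flag is true).
providers = requests.get(f"{base}/v1/providers", verify=False, timeout=30).json()
print(providers)

# Show which providers serve a given model, e.g. gpt-4.
model_info = requests.get(f"{base}/v1/models/gpt-4", verify=False, timeout=30).json()
print(model_info)
```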
Website | Provider | GPT-3.5 | GPT-4 | Stream | Auth |
---|---|---|---|---|---|
bing.com | g4f.Provider.Bing | ❌ | ✔️ | ✔️ | ❌ |
chat.geekgpt.org | g4f.Provider.GeekGpt | ✔️ | ✔️ | ✔️ | ❌ |
gptchatly.com | g4f.Provider.GptChatly | ✔️ | ✔️ | ❌ | ❌ |
liaobots.site | g4f.Provider.Liaobots | ✔️ | ✔️ | ✔️ | ❌ |
www.phind.com | g4f.Provider.Phind | ❌ | ✔️ | ✔️ | ❌ |
raycast.com | g4f.Provider.Raycast | ✔️ | ✔️ | ✔️ | ✔️ |
Website | Provider | GPT-3.5 | GPT-4 | Stream | Auth |
---|---|---|---|---|---|
www.aitianhu.com | g4f.Provider.AItianhu | ✔️ | ❌ | ✔️ | ❌ |
chat3.aiyunos.top | g4f.Provider.AItianhuSpace | ✔️ | ❌ | ✔️ | ❌ |
e.aiask.me | g4f.Provider.AiAsk | ✔️ | ❌ | ✔️ | ❌ |
chat-gpt.org | g4f.Provider.Aichat | ✔️ | ❌ | ❌ | ❌ |
www.chatbase.co | g4f.Provider.ChatBase | ✔️ | ❌ | ✔️ | ❌ |
chatforai.store | g4f.Provider.ChatForAi | ✔️ | ❌ | ✔️ | ❌ |
chatgpt.ai | g4f.Provider.ChatgptAi | ✔️ | ❌ | ✔️ | ❌ |
chatgptx.de | g4f.Provider.ChatgptX | ✔️ | ❌ | ✔️ | ❌ |
chat-shared2.zhile.io | g4f.Provider.FakeGpt | ✔️ | ❌ | ✔️ | ❌ |
freegpts1.aifree.site | g4f.Provider.FreeGpt | ✔️ | ❌ | ✔️ | ❌ |
gptalk.net | g4f.Provider.GPTalk | ✔️ | ❌ | ✔️ | ❌ |
ai18.gptforlove.com | g4f.Provider.GptForLove | ✔️ | ❌ | ✔️ | ❌ |
gptgo.ai | g4f.Provider.GptGo | ✔️ | ❌ | ✔️ | ❌ |
hashnode.com | g4f.Provider.Hashnode | ✔️ | ❌ | ✔️ | ❌ |
app.myshell.ai | g4f.Provider.MyShell | ✔️ | ❌ | ✔️ | ❌ |
noowai.com | g4f.Provider.NoowAi | ✔️ | ❌ | ✔️ | ❌ |
chat.openai.com | g4f.Provider.OpenaiChat | ✔️ | ❌ | ✔️ | ✔️ |
theb.ai | g4f.Provider.Theb | ✔️ | ❌ | ✔️ | ✔️ |
sdk.vercel.ai | g4f.Provider.Vercel | ✔️ | ❌ | ✔️ | ❌ |
you.com | g4f.Provider.You | ✔️ | ❌ | ✔️ | ❌ |
chat9.yqcloud.top | g4f.Provider.Yqcloud | ✔️ | ❌ | ✔️ | ❌ |
chat.acytoo.com | g4f.Provider.Acytoo | ✔️ | ❌ | ✔️ | ❌ |
aibn.cc | g4f.Provider.Aibn | ✔️ | ❌ | ✔️ | ❌ |
ai.ls | g4f.Provider.Ails | ✔️ | ❌ | ✔️ | ❌ |
chatgpt4online.org | g4f.Provider.Chatgpt4Online | ✔️ | ❌ | ✔️ | ❌ |
chat.chatgptdemo.net | g4f.Provider.ChatgptDemo | ✔️ | ❌ | ✔️ | ❌ |
chatgptduo.com | g4f.Provider.ChatgptDuo | ✔️ | ❌ | ❌ | ❌ |
chatgptfree.ai | g4f.Provider.ChatgptFree | ✔️ | ❌ | ❌ | ❌ |
chatgptlogin.ai | g4f.Provider.ChatgptLogin | ✔️ | ❌ | ✔️ | ❌ |
cromicle.top | g4f.Provider.Cromicle | ✔️ | ❌ | ✔️ | ❌ |
gptgod.site | g4f.Provider.GptGod | ✔️ | ❌ | ✔️ | ❌ |
opchatgpts.net | g4f.Provider.Opchatgpts | ✔️ | ❌ | ✔️ | ❌ |
chat.ylokh.xyz | g4f.Provider.Ylokh | ✔️ | ❌ | ✔️ | ❌ |
Website | Provider | GPT-3.5 | GPT-4 | Stream | Auth |
---|---|---|---|---|---|
bard.google.com | g4f.Provider.Bard | ❌ | ❌ | ❌ | ✔️ |
deepinfra.com | g4f.Provider.DeepInfra | ❌ | ❌ | ✔️ | ❌ |
huggingface.co | g4f.Provider.HuggingChat | ❌ | ❌ | ✔️ | ✔️ |
www.llama2.ai | g4f.Provider.Llama2 | ❌ | ❌ | ✔️ | ❌ |
open-assistant.io | g4f.Provider.OpenAssistant | ❌ | ❌ | ✔️ | ✔️ |
Model | Base Provider | Provider | Website |
---|---|---|---|
palm | Google | g4f.Provider.Bard | bard.google.com |
h2ogpt-gm-oasst1-en-2048-falcon-7b-v3 | Hugging Face | g4f.Provider.H2o | www.h2o.ai |
h2ogpt-gm-oasst1-en-2048-falcon-40b-v1 | Hugging Face | g4f.Provider.H2o | www.h2o.ai |
h2ogpt-gm-oasst1-en-2048-open-llama-13b | Hugging Face | g4f.Provider.H2o | www.h2o.ai |
claude-instant-v1 | Anthropic | g4f.Provider.Vercel | sdk.vercel.ai |
claude-v1 | Anthropic | g4f.Provider.Vercel | sdk.vercel.ai |
claude-v2 | Anthropic | g4f.Provider.Vercel | sdk.vercel.ai |
command-light-nightly | Cohere | g4f.Provider.Vercel | sdk.vercel.ai |
command-nightly | Cohere | g4f.Provider.Vercel | sdk.vercel.ai |
gpt-neox-20b | Hugging Face | g4f.Provider.Vercel | sdk.vercel.ai |
oasst-sft-1-pythia-12b | Hugging Face | g4f.Provider.Vercel | sdk.vercel.ai |
oasst-sft-4-pythia-12b-epoch-3.5 | Hugging Face | g4f.Provider.Vercel | sdk.vercel.ai |
santacoder | Hugging Face | g4f.Provider.Vercel | sdk.vercel.ai |
bloom | Hugging Face | g4f.Provider.Vercel | sdk.vercel.ai |
flan-t5-xxl | Hugging Face | g4f.Provider.Vercel | sdk.vercel.ai |
code-davinci-002 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
gpt-3.5-turbo-16k | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
gpt-3.5-turbo-16k-0613 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
gpt-4-0613 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
text-ada-001 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
text-babbage-001 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
text-curie-001 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
text-davinci-002 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
text-davinci-003 | OpenAI | g4f.Provider.Vercel | sdk.vercel.ai |
llama13b-v2-chat | Replicate | g4f.Provider.Vercel | sdk.vercel.ai |
llama7b-v2-chat | Replicate | g4f.Provider.Vercel | sdk.vercel.ai |
The official documentation follows. Full respect to the gpt4free open-source authors; if anything here infringes, please contact me and I will take it down. I have merged the official documentation below; unfold it if you need it, or browse the official GitHub repository: [gpt4free](https://github.com/xtekky/gpt4free).
Written by @xtekky & maintained by @hlohaus
By using this repository or any code related to it, you agree to the legal notice. The author is not responsible for the usage of this repository nor endorses it, nor is the author responsible for any copies, forks, re-uploads made by other users, or anything else related to GPT4Free. This is the author's only account and repository. To prevent impersonation or irresponsible actions, please comply with the GNU GPL license this Repository uses.
> [!WARNING]
> "gpt4free" serves as a PoC (proof of concept), demonstrating the development of an API package with multi-provider requests, with features like timeouts, load balancing and flow control.
```sh
pip install -U g4f
```

```sh
docker pull hlohaus789/g4f
```
- Installation Guide for Windows (.exe): 💻 #installation-guide-for-windows
- Join our Telegram Channel: 📨 telegram.me/g4f_channel
- Join our Discord Group: 💬 discord.gg/XfybzPXPH5
`g4f` now supports 100% local inference: 🧠 local-docs
Is your site on this repository and you want to take it down? Send an email to [email protected] with proof it is yours and it will be removed as fast as possible. To prevent reproduction please secure your API. 😉
You can always leave some feedback here: https://forms.gle/FeWV9RLEedfdkmFN6
As per the survey, here is a list of improvements to come
- Update the repository to include the new openai library syntax (e.g. the `OpenAI()` class) | completed, use `g4f.client.Client`
- Golang implementation
- 🚧 Improve Documentation (in /docs & Guides, Howtos, & Do video tutorials)
- Improve the provider status list & updates
- Tutorials on how to reverse sites to write your own wrapper (PoC only ofc)
- Improve the Bing wrapper. (Wait and Retry or reuse conversation)
- 🚧 Write a standard provider performance test to improve the stability
- Potential support and development of local models
- 🚧 Improve compatibility and error handling
- 🆕 What's New
- 📚 Table of Contents
- 🛠️ Getting Started
- 💡 Usage
- 🚀 Providers and Models
- 🔗 Powered by gpt4free
- 🤝 Contribute
- 🙌 Contributors
- ©️ Copyright
- ⭐ Star History
- 📄 License
- Install Docker: Begin by downloading and installing Docker.
- Set Up the Container: Use the following commands to pull the latest image and start the container:

  ```sh
  docker pull hlohaus789/g4f
  docker run -p 8080:8080 -p 1337:1337 -p 7900:7900 --shm-size="2g" -v ${PWD}/hardir:/app/hardir hlohaus789/g4f:latest
  ```
- Access the Client:
  - To use the included client, navigate to: http://localhost:8080/chat/
  - Or set the API base for your client to: http://localhost:1337/v1
- (Optional) Provider Login: If required, you can access the container's desktop here: http://localhost:7900/?autoconnect=1&resize=scale&password=secret for provider login purposes.
To ensure the seamless operation of our application, please follow the instructions below. These steps are designed to guide you through the installation process on Windows operating systems.
- WebView2 Runtime: Our application requires the WebView2 Runtime to be installed on your system. If you do not have it installed, please download and install it from the Microsoft Developer Website. If you already have WebView2 Runtime installed but are encountering issues, navigate to Installed Windows Apps, select WebView2, and opt for the repair option.
- Download the Application: Visit our latest releases page and download the most recent version of the application, named `g4f.webview.*.exe`.
- File Placement: Once downloaded, transfer the `.exe` file from your downloads folder to a directory of your choice on your system, and then execute it to run the app.
- Firewall Configuration (Hotfix): Upon installation, it may be necessary to adjust your Windows Firewall settings to allow the application to operate correctly. To do this, access your Windows Firewall settings and allow the application.
By following these steps, you should be able to successfully install and run the application on your Windows system. If you encounter any issues during the installation process, please refer to our Issue Tracker or reach out on Discord for assistance.
Run the Webview UI on other Platforms:
Run the Web UI on Your Smartphone:
- Download and install Python (Version 3.10+ is recommended).
- Install Google Chrome for providers with webdriver
pip install -U g4f[all]
How do I install only parts or disable parts? Use partial requirements: /docs/requirements
How do I load the project using git and install the project requirements? Read this tutorial and follow it step by step: /docs/git
How do I build and run the compose image from source? Use docker-compose: /docs/docker
```python
from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    ...
)
print(response.choices[0].message.content)
```

```
Hello! How can I assist you today?
```
```python
from g4f.client import Client

client = Client()
response = client.images.generate(
    model="gemini",
    prompt="a white siamese cat",
    ...
)
image_url = response.data[0].url
```
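The generation call returns a URL rather than the image bytes, so a follow-up download is needed. A minimal sketch (the output file name is arbitrary):

```python
import requests

# 'image_url' comes from the generation example above.
img = requests.get(image_url, timeout=60)
img.raise_for_status()
with open("siamese_cat.jpg", "wb") as f:
    f.write(img.content)
```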
Full Documentation for Python API
- New AsyncClient API from G4F: /docs/async_client
- Client API like the OpenAI Python library: /docs/client
- Legacy API with python modules: /docs/legacy
To start the web interface, type the following code in python:

```python
from g4f.gui import run_gui

run_gui()
```

or execute the following command:

```sh
python -m g4f.cli gui -port 8080 -debug
```
You can use the Interference API to serve other OpenAI integrations with G4F.
See docs: /docs/interference
Access with: http://localhost:1337/v1
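Because the Interference API mirrors the OpenAI REST schema, the official `openai` Python package can be pointed at it by overriding the base URL. A minimal sketch; the placeholder API key is an assumption, since the local endpoint does not validate it:

```python
from openai import OpenAI

# Use the local Interference API instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:1337/v1",
    api_key="not-needed",  # placeholder; the local endpoint ignores it
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```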
You need cookies for BingCreateImages and the Gemini provider.
From Bing you need the "_U" cookie, and from Gemini you need the "__Secure-1PSID" cookie.
Sometimes you don't need the "__Secure-1PSID" cookie, but some other auth cookies instead.
You can pass the cookies in the create function, or use the `set_cookies` setter before you run G4F:
```python
from g4f.cookies import set_cookies

set_cookies(".bing.com", {
    "_U": "cookie value"
})
set_cookies(".google.com", {
    "__Secure-1PSID": "cookie value"
})
...
```
To utilize the OpenaiChat provider, a .har file is required from https://chat.openai.com/. Follow the steps below to create a valid .har file:
- Navigate to https://chat.openai.com/ using your preferred web browser and log in with your credentials.
- Access the Developer Tools in your browser. This can typically be done by right-clicking the page and selecting "Inspect," or by pressing F12 or Ctrl+Shift+I (Cmd+Option+I on a Mac).
- With the Developer Tools open, switch to the "Network" tab.
- Reload the website to capture the loading process within the Network tab.
- Initiate an action in the chat which can be captured in the .har file.
- Right-click any of the network activities listed and select "Save all as HAR with content" to export the .har file.
- Place the exported .har file in the `./hardir` directory if you are using Docker. Alternatively, you can store it in any preferred location within your current working directory.
Note: Ensure that your .har file is stored securely, as it may contain sensitive information.
If you want to hide or change your IP address for the providers, you can set a proxy globally via an environment variable:
- On macOS and Linux:
export G4F_PROXY="https://host:port"
- On Windows:
set G4F_PROXY=https://host:port
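The variable can also be set from inside Python before G4F issues any requests. A small sketch, assuming G4F reads `G4F_PROXY` from the environment at request time:

```python
import os

# Set the global proxy before any G4F request is made.
# Replace host and port with your proxy's address.
os.environ["G4F_PROXY"] = "http://host:port"

from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```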
Website | Provider | GPT-3.5 | GPT-4 | Stream | Auth |
---|---|---|---|---|---|
bing.com | g4f.Provider.Bing | ❌ | ✔️ | ✔️ | ❌ |
chatgpt.ai | g4f.Provider.ChatgptAi | ❌ | ✔️ | ✔️ | ❌ |
liaobots.site | g4f.Provider.Liaobots | ✔️ | ✔️ | ✔️ | ❌ |
chat.openai.com | g4f.Provider.OpenaiChat | ✔️ | ❌ | ✔️ | ✔️ |
raycast.com | g4f.Provider.Raycast | ✔️ | ✔️ | ✔️ | ✔️ |
beta.theb.ai | g4f.Provider.Theb | ✔️ | ✔️ | ✔️ | ❌ |
you.com | g4f.Provider.You | ✔️ | ✔️ | ✔️ | ❌ |
While we wait for gpt-5, here is a list of new models that are at least better than gpt-3.5-turbo. Some are better than gpt-4. Expect this list to grow.
Model | Provider | Parameters | Better than |
---|---|---|---|
mixtral-8x22b | g4f.Provider.DeepInfra | 176B / 44b active | gpt-3.5-turbo |
dbrx-instruct | g4f.Provider.DeepInfra | 132B / 36B active | gpt-3.5-turbo |
command-r+ | g4f.Provider.HuggingChat | 104B | gpt-4-0613 |
Website | Provider | GPT-3.5 | GPT-4 | Stream | Auth |
---|---|---|---|---|---|
chat3.aiyunos.top | g4f.Provider.AItianhuSpace | ✔️ | ❌ | ✔️ | ❌ |
chatforai.store | g4f.Provider.ChatForAi | ✔️ | ❌ | ✔️ | ❌ |
chatgpt4online.org | g4f.Provider.Chatgpt4Online | ✔️ | ❌ | ✔️ | ❌ |
chatgpt-free.cc | g4f.Provider.ChatgptNext | ✔️ | ❌ | ✔️ | ❌ |
chatgptx.de | g4f.Provider.ChatgptX | ✔️ | ❌ | ✔️ | ❌ |
flowgpt.com | g4f.Provider.FlowGpt | ✔️ | ❌ | ✔️ | ❌ |
freegptsnav.aifree.site | g4f.Provider.FreeGpt | ✔️ | ❌ | ✔️ | ❌ |
gpttalk.ru | g4f.Provider.GptTalkRu | ✔️ | ❌ | ✔️ | ❌ |
koala.sh | g4f.Provider.Koala | ✔️ | ❌ | ✔️ | ❌ |
app.myshell.ai | g4f.Provider.MyShell | ✔️ | ❌ | ✔️ | ❌ |
perplexity.ai | g4f.Provider.PerplexityAi | ✔️ | ❌ | ✔️ | ❌ |
poe.com | g4f.Provider.Poe | ✔️ | ❌ | ✔️ | ✔️ |
talkai.info | g4f.Provider.TalkAi | ✔️ | ❌ | ✔️ | ❌ |
chat.vercel.ai | g4f.Provider.Vercel | ✔️ | ❌ | ✔️ | ❌ |
aitianhu.com | g4f.Provider.AItianhu | ✔️ | ❌ | ✔️ | ❌ |
chatgpt.bestim.org | g4f.Provider.Bestim | ✔️ | ❌ | ✔️ | ❌ |
chatbase.co | g4f.Provider.ChatBase | ✔️ | ❌ | ✔️ | ❌ |
chatgptdemo.info | g4f.Provider.ChatgptDemo | ✔️ | ❌ | ✔️ | ❌ |
chat.chatgptdemo.ai | g4f.Provider.ChatgptDemoAi | ✔️ | ❌ | ✔️ | ❌ |
chatgptfree.ai | g4f.Provider.ChatgptFree | ✔️ | ❌ | ❌ | ❌ |
chatgptlogin.ai | g4f.Provider.ChatgptLogin | ✔️ | ❌ | ✔️ | ❌ |
chat.3211000.xyz | g4f.Provider.Chatxyz | ✔️ | ❌ | ✔️ | ❌ |
gpt6.ai | g4f.Provider.Gpt6 | ✔️ | ❌ | ✔️ | ❌ |
gptchatly.com | g4f.Provider.GptChatly | ✔️ | ❌ | ❌ | ❌ |
ai18.gptforlove.com | g4f.Provider.GptForLove | ✔️ | ❌ | ✔️ | ❌ |
gptgo.ai | g4f.Provider.GptGo | ✔️ | ❌ | ✔️ | ❌ |
gptgod.site | g4f.Provider.GptGod | ✔️ | ❌ | ✔️ | ❌ |
onlinegpt.org | g4f.Provider.OnlineGpt | ✔️ | ❌ | ✔️ | ❌ |
Website | Provider | GPT-3.5 | GPT-4 | Stream | Auth |
---|---|---|---|---|---|
openchat.team | g4f.Provider.Aura | ❌ | ❌ | ✔️ | ❌ |
bard.google.com | g4f.Provider.Bard | ❌ | ❌ | ❌ | ✔️ |
deepinfra.com | g4f.Provider.DeepInfra | ❌ | ❌ | ✔️ | ❌ |
free.chatgpt.org.uk | g4f.Provider.FreeChatgpt | ❌ | ❌ | ✔️ | ❌ |
gemini.google.com | g4f.Provider.Gemini | ❌ | ❌ | ✔️ | ✔️ |
ai.google.dev | g4f.Provider.GeminiPro | ❌ | ❌ | ✔️ | ✔️ |
gemini-chatbot-sigma.vercel.app | g4f.Provider.GeminiProChat | ❌ | ❌ | ✔️ | ❌ |
huggingface.co | g4f.Provider.HuggingChat | ❌ | ❌ | ✔️ | ❌ |
huggingface.co | g4f.Provider.HuggingFace | ❌ | ❌ | ✔️ | ❌ |
llama2.ai | g4f.Provider.Llama2 | ❌ | ❌ | ✔️ | ❌ |
labs.perplexity.ai | g4f.Provider.PerplexityLabs | ❌ | ❌ | ✔️ | ❌ |
pi.ai | g4f.Provider.Pi | ❌ | ❌ | ✔️ | ❌ |
theb.ai | g4f.Provider.ThebApi | ❌ | ❌ | ❌ | ✔️ |
open-assistant.io | g4f.Provider.OpenAssistant | ❌ | ❌ | ✔️ | ✔️ |
Model | Base Provider | Provider | Website |
---|---|---|---|
gpt-3.5-turbo | OpenAI | 5+ Providers | openai.com |
gpt-4 | OpenAI | 2+ Providers | openai.com |
gpt-4-turbo | OpenAI | g4f.Provider.Bing | openai.com |
Llama-2-7b-chat-hf | Meta | 2+ Providers | llama.meta.com |
Llama-2-13b-chat-hf | Meta | 2+ Providers | llama.meta.com |
Llama-2-70b-chat-hf | Meta | 3+ Providers | llama.meta.com |
CodeLlama-34b-Instruct-hf | Meta | 2+ Providers | llama.meta.com |
CodeLlama-70b-Instruct-hf | Meta | 2+ Providers | llama.meta.com |
Mixtral-8x7B-Instruct-v0.1 | Huggingface | 4+ Providers | huggingface.co |
Mistral-7B-Instruct-v0.1 | Huggingface | 4+ Providers | huggingface.co |
dolphin-2.6-mixtral-8x7b | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
lzlv_70b_fp16_hf | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
airoboros-70b | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
airoboros-l2-70b-gpt4-1.4.1 | Huggingface | g4f.Provider.DeepInfra | huggingface.co |
openchat_3.5 | Huggingface | 2+ Providers | huggingface.co |
gemini | Google | g4f.Provider.Gemini | gemini.google.com |
gemini-pro | Google | 2+ Providers | gemini.google.com |
claude-v2 | Anthropic | 1+ Providers | anthropic.com |
claude-3-opus | Anthropic | g4f.Provider.You | anthropic.com |
claude-3-sonnet | Anthropic | g4f.Provider.You | anthropic.com |
pi | Inflection | g4f.Provider.Pi | inflection.ai |
🎁 Projects powered by gpt4free:
- gpt4free
- gpt4free-ts
- Free AI API's & Potential Providers List
- ChatGPT-Clone
- Ai agent
- ChatGpt Discord Bot
- chatGPT-discord-bot
- Nyx-Bot (Discord)
- LangChain gpt4free
- ChatGpt Telegram Bot
- ChatGpt Line Bot
- Action Translate Readme
- Langchain Document GPT
- python-tgpt
We welcome contributions from the community. Whether you're adding new providers or features, or simply fixing typos and making small improvements, your input is valued. Creating a pull request is all it takes – our co-pilot will handle the code review process. Once all changes have been addressed, we'll merge the pull request into the main branch and release the updates at a later time.
- Read: /docs/guides/help_me
A list of all contributors is available here
- The `Vercel.py` file contains code from vercel-llm-api by @ading2210
- The `har_file.py` has input from xqdoo00o/ChatGPT-to-API
- The `PerplexityLabs.py` has input from nathanrchn/perplexityai
- The `Gemini.py` has input from dsdanielpark/Gemini-API
Having input implies that the AI's code generation utilized it as one of many sources.
This program is licensed under the GNU GPL v3
xtekky/gpt4free: Copyright (C) 2023 xtekky
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
This project is licensed under GNU GPL v3.0.