💡 Get help - ❓ FAQ 💭 Discussions 💬 Discord 📖 Documentation website
LocalAI is a drop-in replacement REST API compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families compatible with the ggml format. It does not require a GPU.
Follow LocalAI
Connect with the Creator
Share LocalAI Repository
In a nutshell:
- Local, OpenAI drop-in alternative REST API. You own your data.
- NO GPU required. NO Internet access required either
- Optional GPU acceleration is available for llama.cpp-compatible LLMs. See also the build section.
- Supports multiple models
- 🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
- ⚡ Doesn't shell out, but uses C++ bindings for faster inference and better performance.
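Because the API is a drop-in replacement for OpenAI's, existing OpenAI-style request shapes work unchanged against a local instance. A minimal sketch using only the standard library, assuming LocalAI is listening on `https://localhost:8080` and a model named `ggml-gpt4all-j` has been loaded (both are assumptions; adjust to your setup):

```python
import json
import urllib.request

# Assumptions: LocalAI runs on localhost:8080 and a model named
# "ggml-gpt4all-j" is available -- substitute your own host and model.
BASE_URL = "https://localhost:8080"

payload = {
    "model": "ggml-gpt4all-j",  # hypothetical model name
    "messages": [{"role": "user", "content": "How are you?"}],
    "temperature": 0.7,
}

def chat(base_url: str = BASE_URL) -> bytes:
    """POST an OpenAI-style chat completion request to a LocalAI instance."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

The same payload can be sent with any OpenAI client library by pointing its base URL at the local instance instead of api.openai.com.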
LocalAI was created by Ettore Di Giacinto and is a community-driven project focused on making AI accessible to anyone. Contributions, feedback, and PRs are welcome!
Note that this started as a fun weekend project to create the necessary pieces for a full AI assistant like ChatGPT: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
🔥🔥 Hot topics / Roadmap
🚀 Features
- 📖 Text generation with GPTs (llama.cpp, gpt4all.cpp, ... and more)
- 🗣 Text to Audio
- 🔈 Audio to Text (audio transcription with whisper.cpp)
- 🎨 Image generation with stable diffusion
- 🔥 OpenAI functions 🆕
- 🧠 Embeddings generation for vector databases
- ✍️ Constrained grammars
- 🖼️ Download Models directly from Huggingface
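For example, the embeddings feature listed above follows the same OpenAI-compatible request shape as the rest of the API. A hedged sketch of the request body (the endpoint path follows the OpenAI spec; the model name is a placeholder for whichever embedding-capable model you have configured):

```python
import json

# Hypothetical request body for LocalAI's OpenAI-compatible /v1/embeddings
# endpoint. "my-embedding-model" is a placeholder, not a bundled model.
embedding_request = {
    "model": "my-embedding-model",
    "input": "A sentence to embed and store in a vector database.",
}

body = json.dumps(embedding_request)
```

The resulting vector in the response can then be stored in a vector database such as Chroma, as in the Question Answering tutorial linked below.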
📖 🎥 Media, Blogs, Social
- Create a slackbot for teams and OSS projects that answers questions from documentation
- LocalAI meets k8sgpt
- Question Answering on Documents locally with LangChain, LocalAI, Chroma, and GPT4All
- Tutorial to use k8sgpt with LocalAI
Check out the Getting started section in our documentation.
Do you find LocalAI useful?
Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.
A huge thank you to our generous sponsors who support this project:
| Sponsor | Note |
|---|---|
| Spectro Cloud | Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on Lambda Labs! |
LocalAI is a community-driven project created by Ettore Di Giacinto.
MIT - Author Ettore Di Giacinto
LocalAI couldn't have been built without the help of great software already available from the community. Thank you!
- llama.cpp
- https://github.com/tatsu-lab/stanford_alpaca
- https://github.com/cornelk/llama-go for the initial ideas
- https://github.com/antimatter15/alpaca.cpp
- https://github.com/EdVince/Stable-Diffusion-NCNN
- https://github.com/ggerganov/whisper.cpp
- https://github.com/saharNooby/rwkv.cpp
- https://github.com/rhasspy/piper
- https://github.com/cmp-nct/ggllm.cpp
This is a community project, a special thanks to our contributors! π€