Open-source AI-powered search engine. (Perplexity Clone)
Run your local LLM (llama3, gemma, mistral, phi3) or use cloud models (Groq/Llama3, OpenAI/gpt-4o)
Demo answering questions with phi3 on my M1 MacBook Pro:
local-demo.mp4
Please feel free to contact me on Twitter or create an issue if you have any questions.
farfalle.dev (Cloud models only)
- Tech Stack
- Getting Started
- Deploy
- Add support for local LLMs through Ollama
- Docker deployment setup
- Add support for SearXNG, eliminating the need for external search dependencies
- Create a pre-built Docker Image
- Chat History
- Chat with local files
- Frontend: Next.js
- Backend: FastAPI
- Search API: SearXNG, Tavily, Serper, Bing
- Logging: Logfire
- Rate Limiting: Redis
- Components: shadcn/ui
- Search with multiple search providers (Tavily, SearXNG, Serper, Bing)
- Answer questions with cloud models (OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, Groq/Llama3)
- Answer questions with local models (llama3, mistral, gemma, phi3)
- Docker
- Ollama (If running local models)
- Download any of the supported models: llama3, mistral, gemma, phi3
- Start ollama server
ollama serve
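As a concrete example, a supported model can be fetched with the Ollama CLI before the server is used (llama3 here is just one of the models listed above):

```shell
# One-time download of a supported model
ollama pull llama3

# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```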
docker run \
-p 8000:8000 -p 3000:3000 -p 8080:8080 \
--add-host=host.docker.internal:host-gateway \
ghcr.io/rashadphz/farfalle:main
- OPENAI_API_KEY: Your OpenAI API key. Not required if you are using Ollama.
- SEARCH_PROVIDER: The search provider to use. Can be tavily, serper, bing, or searxng.
- TAVILY_API_KEY: Your Tavily API key.
- SERPER_API_KEY: Your Serper API key.
- BING_API_KEY: Your Bing API key.
- GROQ_API_KEY: Your Groq API key.
- SEARXNG_BASE_URL: The base URL for the SearXNG instance.
Add any env variable to the docker run command like so:
docker run \
-e ENV_VAR_NAME1='YOUR_ENV_VAR_VALUE1' \
-e ENV_VAR_NAME2='YOUR_ENV_VAR_VALUE2' \
-p 8000:8000 -p 3000:3000 -p 8080:8080 \
--add-host=host.docker.internal:host-gateway \
ghcr.io/rashadphz/farfalle:main
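For instance, a fully local setup that uses SearXNG for search (so no API keys are required) might look like this sketch, using only the SEARCH_PROVIDER variable documented above:

```shell
docker run \
  -e SEARCH_PROVIDER='searxng' \
  -p 8000:8000 -p 3000:3000 -p 8080:8080 \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/rashadphz/farfalle:main
```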
Wait for the app to start, then visit http://localhost:3000.
Or, follow the instructions below to clone the repo and run the app locally:
git clone [email protected]:rashadphz/farfalle.git
cd farfalle
touch .env
Add the following variables to the .env file:
You can use Tavily, SearXNG, Serper, or Bing as the search provider.
SearXNG (No API Key Required)
SEARCH_PROVIDER=searxng
Tavily (Requires API Key)
TAVILY_API_KEY=...
SEARCH_PROVIDER=tavily
Serper (Requires API Key)
SERPER_API_KEY=...
SEARCH_PROVIDER=serper
Bing (Requires API Key)
BING_API_KEY=...
SEARCH_PROVIDER=bing
# Cloud Models
OPENAI_API_KEY=...
GROQ_API_KEY=...
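Putting the pieces together, a minimal .env that uses Tavily for search and cloud models for answers (key values are placeholders you must fill in) could look like:

```
SEARCH_PROVIDER=tavily
TAVILY_API_KEY=...
OPENAI_API_KEY=...
GROQ_API_KEY=...
```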
This requires Docker Compose version 2.22.0 or later.
docker-compose -f docker-compose.dev.yaml up -d
Visit http://localhost:3000 to view the app.
For custom setup instructions, see custom-setup-instructions.md
After the backend is deployed, copy the web service URL to your clipboard. It should look something like: https://some-service-name.onrender.com.
Use the copied backend URL in the NEXT_PUBLIC_API_URL
environment variable when deploying with Vercel.
And you're done!
To use Farfalle as your default search engine, follow these steps:
- Visit the settings of your browser
- Go to 'Search Engines'
- Create a new search engine entry using this URL: http://localhost:3000/?q=%s.
- Add the search engine.
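When the entry is used, the browser substitutes your URL-encoded query for %s. For example, searching for "local llm" from the address bar opens:

```
http://localhost:3000/?q=local%20llm
```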