
CRAG Ollama Chat

Logo created by ideogram.ai

Preview

Demo video: ollama-crag.mp4

Run the demo:

  1. Create a config.yaml file following the format of config.example.yaml and fill in the required settings (see the sketch after these steps for how they are used):
# APIs: needed if you aren't running everything locally with Ollama
openai_api_key: "sk-"
openai_api_base: "https://api.openai.com/v1/chat/completions" # Or your own proxy
google_api_key: "your_google_api_key" # Optional
tavily_api_key: "tvly-" # Required for the web search tools; create one at https://app.tavily.com/

# Ollama Config
run_local: "Yes" # Yes or No; if Yes, you must have Ollama running on your machine
local_llm: "openhermes" # mistral, llama2 ...

# Model Config
models: "openai" # Use "openai" for the best results

# Document Config
# Multiple websites can be read
doc_url:  # My blog posts at the moment
  - "https://nagi.fun/llm-5-transformer"
  - "https://nagi.fun/llm-4-wavenet"
  2. Install dependencies with Poetry or pip install -r requirements.txt

  3. Run the command below:

streamlit run app.py
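
For orientation, the sketch below shows one way the settings in config.yaml could be loaded and used to choose between a local Ollama model and a hosted OpenAI model. It is only an illustration under the assumption that the keys from config.example.yaml are read as shown; the function names and the remote model name are not taken from the repository's code.

import yaml

def load_config(path="config.yaml"):
    # Parse the settings file described in step 1 into a plain dict.
    with open(path, "r", encoding="utf-8") as f:
        return yaml.safe_load(f)

def choose_backend(cfg):
    # run_local: "Yes" -> chat against a locally running Ollama server,
    #                     using the model named by local_llm.
    # anything else    -> use the hosted provider named by `models`.
    if str(cfg.get("run_local", "No")).lower() == "yes":
        return "ollama", cfg.get("local_llm", "openhermes")
    return cfg.get("models", "openai"), "gpt-3.5-turbo"  # remote model name is illustrative

if __name__ == "__main__":
    backend, model = choose_backend(load_config())
    print(f"Backend: {backend}, model: {model}")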

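Since this is a Corrective RAG demo, the core loop is: retrieve from the configured doc_url pages, let the LLM grade whether the retrieved chunks are actually relevant, and fall back to Tavily web search when they are not. The sketch below is a simplified, hypothetical version of that corrective step, not the repository's implementation; the retriever and llm objects are assumed to be a LangChain-style retriever and chat model, and the prompts are placeholders.

from tavily import TavilyClient

def corrective_answer(question, retriever, llm, tavily_api_key):
    # 1. Retrieve candidate chunks from the indexed doc_url pages.
    docs = [d.page_content for d in retriever.get_relevant_documents(question)]

    # 2. Grade relevance: ask the LLM whether the chunks answer the question.
    verdict = llm.invoke(
        f"Answer only 'yes' or 'no': do these passages help answer '{question}'?\n\n"
        + "\n\n".join(docs)
    )

    # 3. Corrective step: if the local documents look irrelevant, search the web via Tavily.
    if "yes" not in verdict.content.lower():
        results = TavilyClient(api_key=tavily_api_key).search(question)
        docs = [r["content"] for r in results["results"]]

    # 4. Generate the final answer from whichever context survived.
    return llm.invoke(f"Using this context, answer '{question}':\n\n" + "\n\n".join(docs)).content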

About

Corrective RAG demo powered by Ollama
