This project is a document chat application that leverages Elasticsearch for search capabilities and Ollama for natural language processing. The application is containerized using Docker and can be deployed using Docker Compose or Kubernetes. Currently, the application only supports PDF documents. The user interface is developed with Streamlit.
Ensure you have the following installed on your system:
- Docker
- Docker Compose
- Kubernetes (kubectl and a cluster)
- Python 3.11
- An OpenAI API key
- Clone the repository:
  git clone https://github.com/Jbdu4493/chat_rag
  cd chat_rag
- Build and run the Docker containers:
  docker-compose up --build
- Clone the repository:
  git clone https://github.com/Jbdu4493/chat_rag
  cd chat_rag/k8s
- Apply the Kubernetes configuration:
  kubectl apply -f configurationk8s.yaml
This installation method provides better performance for Ollama by running it directly on your host system.
- Install Ollama from the official download page.
- Start the Ollama server:
  ollama serve
- Pull the Gemma model:
  ollama pull gemma
- Start Elasticsearch using Docker:
  docker run -d --name elasticsearch \
    -p 9200:9200 \
    -e discovery.type="single-node" \
    -e xpack.security.enabled="false" \
    -e xpack.security.http.ssl.enabled="false" \
    -v esdata:/usr/share/elasticsearch/data \
    docker.elastic.co/elasticsearch/elasticsearch:8.13.4
- Clone the repository:
  git clone https://github.com/Jbdu4493/chat_rag
  cd chat_rag
- Install Python dependencies:
  pip install -r requirement.txt
- Run the Streamlit application:
  streamlit run front.py
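Before the app can answer questions, the text extracted from an uploaded PDF has to be split into chunks small enough to index and retrieve individually. The sketch below illustrates the kind of overlapping-chunk splitting a RAG app like this typically performs; the function and index names are illustrative, not the repository's actual code.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Split extracted PDF text into overlapping chunks for indexing."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

def to_index_actions(chunks: list[str], source: str, index: str = "documents") -> list[dict]:
    """Turn chunks into bulk-index actions; one Elasticsearch document per chunk."""
    return [
        {"_index": index, "_source": {"text": c, "source": source, "chunk": i}}
        for i, c in enumerate(chunks)
    ]
```

The overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.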
- Once the containers are up and running, access the application at http://localhost:8501.
- The application provides a chat interface where you can interact with the document processing and search capabilities powered by Elasticsearch and Ollama.
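A chat turn in this kind of app follows a retrieve-then-generate pattern: the question is matched against the indexed chunks in Elasticsearch, and the best hits are packed into the prompt sent to the model served by Ollama. The sketch below illustrates that flow with plain request/prompt builders; the function names and query shape are assumptions for illustration, not the app's actual API.

```python
def build_search_query(question: str, size: int = 3) -> dict:
    """Body of a simple full-text match query, as one might send to Elasticsearch."""
    return {"size": size, "query": {"match": {"text": question}}}

def build_prompt(question: str, passages: list[str]) -> str:
    """Pack retrieved passages into a grounded prompt for the language model."""
    context = "\n---\n".join(passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```

The query body would go to Elasticsearch's search endpoint, and the assembled prompt to Ollama's generate endpoint, with the model's reply shown in the chat.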
The application can be configured using the following environment variables:
- OLLAMA_URL: URL of the Ollama service (default: http://localhost:11434)
- ELASTICSEARCH_URI: URL of the Elasticsearch service (default: http://localhost:9200)
- OPENAI_API_KEY: your OpenAI API key
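Given the documented defaults, the app presumably reads its configuration with environment lookups along these lines (a sketch of the pattern, not the actual front.py code):

```python
import os

def load_config() -> dict:
    """Read service endpoints from the environment, falling back to the documented defaults."""
    return {
        "ollama_url": os.environ.get("OLLAMA_URL", "http://localhost:11434"),
        "elasticsearch_uri": os.environ.get("ELASTICSEARCH_URI", "http://localhost:9200"),
        # No sensible default for a secret: left as None when unset.
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),
    }
```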
The Docker Compose configuration is defined in docker-compose.yaml.
The Kubernetes configuration is defined in k8s/configurationk8s.yaml.
Contributions are welcome! Please submit a pull request or open an issue to discuss any changes.
This project is licensed under the MIT License. See the LICENSE file for details.