Document Chat Application

This project is a document chat application that leverages Elasticsearch for search capabilities and Ollama for natural language processing. The application is containerized using Docker and can be deployed using Docker Compose or Kubernetes. Currently, the application only supports PDF documents. The user interface is developed with Streamlit.

Table of Contents

  • Prerequisites
  • Installation
  • Usage
  • Configuration
  • Contributing
  • License

Prerequisites

Ensure you have the following installed on your system:

  • Docker
  • Docker Compose
  • Kubernetes (kubectl and a cluster)
  • Python 3.11
  • OpenAI API Key (create one from your OpenAI account dashboard)
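
You can quickly confirm the tooling is in place from a terminal (kubectl is only needed for the Kubernetes path):

    docker --version
    docker-compose --version
    kubectl version --client
    python3 --version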

Installation

Using Docker Compose

  1. Clone the repository:

    git clone https://github.com/Jbdu4493/chat_rag
    cd chat_rag
  2. Build and run the Docker containers:

    docker-compose up --build
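
Once the build finishes, you can check that the containers came up before opening the UI (the service names depend on what docker-compose.yaml defines):

    # List running services and their published ports
    docker-compose ps

    # Follow the logs if something fails to start
    docker-compose logs -f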

Using Kubernetes

  1. Clone the repository:

    git clone https://github.com/Jbdu4493/chat_rag
    cd chat_rag/k8s
  2. Apply the Kubernetes configurations:

    kubectl apply -f configurationk8s.yaml
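
After applying the manifest, you can watch the resources come up (the pod and service names depend on what configurationk8s.yaml defines):

    # Wait until every pod reports Running
    kubectl get pods -w

    # Check which ports the services expose
    kubectl get svc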

Direct Installation for Better Performance

This installation method provides better performance for Ollama by running it directly on your host system; a quick smoke test of the resulting setup follows the steps below.

  1. Install Ollama from the official website (https://ollama.com).

  2. Start Ollama:

    ollama serve
  3. Pull the Gemma model:

    ollama pull gemma
  4. Start Elasticsearch using Docker:

    docker run -d --name elasticsearch \
               -p 9200:9200 \
               -e discovery.type="single-node" \
               -e xpack.security.enabled="false" \
               -e xpack.security.http.ssl.enabled="false" \
               -v esdata:/usr/share/elasticsearch/data \
               docker.elastic.co/elasticsearch/elasticsearch:8.13.4
  5. Clone the repository:

    git clone https://github.com/Jbdu4493/chat_rag
    cd chat_rag
  6. Install Python dependencies:

    pip install -r requirement.txt
  7. Run the Streamlit application:

    streamlit run front.py
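
If the app cannot reach its backends, a quick smoke test of each service helps narrow things down (the URLs assume the default ports used above):

    # Elasticsearch should answer with cluster info as JSON
    curl http://localhost:9200

    # Ollama should list the pulled models, including gemma
    curl http://localhost:11434/api/tags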

Usage

Accessing the Application

  • Once the containers are up and running, access the application at http://localhost:8501.
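
If the page does not load, a quick reachability check can confirm whether the Streamlit server is up:

    # Prints the HTTP status code; 200 means the UI is being served
    curl -sS -o /dev/null -w "%{http_code}\n" http://localhost:8501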

Interacting with the Application

  • The application provides a chat interface where you can interact with the document processing and search capabilities powered by Elasticsearch and Ollama. Note that only PDF documents are supported.

Configuration

Environment Variables

The application can be configured using the following environment variables:

  • OLLAMA_URL: URL of the Ollama service (default: http://localhost:11434)
  • ELASTICSEARCH_URI: URL of the Elasticsearch service (default: http://localhost:9200)
  • OPENAI_API_KEY: Your OpenAI API key
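
For the direct installation, these can be exported in the shell before starting the app (the values shown are just the documented defaults, plus a placeholder key):

    # Point the app at the local Ollama and Elasticsearch instances
    export OLLAMA_URL=http://localhost:11434
    export ELASTICSEARCH_URI=http://localhost:9200

    # Required: substitute your own OpenAI key
    export OPENAI_API_KEY=sk-...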

Docker Compose Configuration

The Docker Compose configuration is defined in docker-compose.yaml.
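
You can validate the file and print the fully resolved configuration without starting any containers:

    # Parses docker-compose.yaml and prints the effective configuration
    docker-compose config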

Kubernetes Configuration

The Kubernetes configuration is defined in k8s/configurationk8s.yaml.
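
The manifest can be validated locally before it is applied to a cluster:

    # Client-side dry run: parses the manifest without touching the cluster
    kubectl apply --dry-run=client -f k8s/configurationk8s.yaml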

Contributing

Contributions are welcome! Please submit a pull request or open an issue to discuss any changes.

License

This project is licensed under the MIT License. See the LICENSE file for details.
