
[Question]: How to build multiple as_chat_engine from multiple documents? #13772

Open
1 task done
Ninlawat-Puhu opened this issue May 28, 2024 · 1 comment
Labels
question Further information is requested

Comments

@Ninlawat-Puhu

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

I have 500 documents and want to use as_chat_engine while keeping a separate chat history per session for different users. The model should choose the correct indexes based on the user's question and return a valid answer. Could you recommend or suggest how to write this program?

@Ninlawat-Puhu Ninlawat-Puhu added the question Further information is requested label May 28, 2024

dosubot bot commented May 28, 2024

To build multiple as_chat_engine instances from multiple documents and maintain chat history for each session from different users, you can follow the example below. This program will create a chat engine for each user session, maintain the chat history, and allow the model to provide valid answers based on the user's questions.

Here's a sample implementation:

from typing import List

from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.llms import MockLLM
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.retrievers import BaseRetriever
from llama_index.core.schema import NodeWithScore, QueryBundle

# Minimal retriever for demonstration purposes; in practice, use a real
# retriever such as VectorStoreIndex.from_documents(docs).as_retriever()
class EmptyRetriever(BaseRetriever):
    def _retrieve(self, query_bundle: QueryBundle) -> List[NodeWithScore]:
        return []

# Create a chat engine (with its own memory buffer) for a user session
def create_chat_engine(user_id):
    return CondensePlusContextChatEngine.from_defaults(
        retriever=EmptyRetriever(),
        llm=MockLLM(),  # built-in mock LLM; swap in a real LLM in production
        memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
    )

# Dictionary to store chat engines for each user session
user_sessions = {}

# Handle a user message: reuse the session's engine, or create one first
def handle_user_message(user_id, message):
    if user_id not in user_sessions:
        user_sessions[user_id] = create_chat_engine(user_id)

    chat_engine = user_sessions[user_id]
    response = chat_engine.chat(message)
    return response.response

# Example usage
user_id = "user123"
message = "Hello, how are you?"
print(handle_user_message(user_id, message))

In this example:

  1. A mock LLM stands in for a real language model.
  2. A minimal retriever that returns no nodes stands in for a real index retriever.
  3. The create_chat_engine function initializes a CondensePlusContextChatEngine for a user session.
  4. The user_sessions dictionary stores the chat engines for each user session.
  5. The handle_user_message function processes user messages, creates a chat engine if it doesn't exist for the user, and returns the response from the chat engine.

This setup ensures that each user session maintains its own chat history and the model can provide valid answers based on the user's questions [1][2][3].
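The example above covers per-session memory but not the other half of the question: choosing the correct index out of many documents. The routing-plus-session pattern can be sketched library-free. Everything below (`DOCUMENT_INDEXES`, `route_question`, `Session`) is a hypothetical illustration, not llama_index API; in a real program the keyword-overlap scoring would be replaced by per-document retrievers or a router over real indexes, and the placeholder answer by an actual chat-engine call.

```python
import re
from collections import Counter

# Hypothetical stand-ins: one entry per document collection. A real
# deployment would hold a retriever/index object instead of a description.
DOCUMENT_INDEXES = {
    "billing": "Invoices, refunds, and payment schedules.",
    "shipping": "Delivery times, tracking numbers, and carriers.",
    "returns": "Return windows, labels, and exchange policy.",
}

def _tokens(text: str) -> Counter:
    return Counter(re.findall(r"[a-z]+", text.lower()))

def route_question(question: str) -> str:
    """Pick the index whose description shares the most words with the question."""
    q = _tokens(question)
    scores = {
        name: sum((q & _tokens(desc)).values())
        for name, desc in DOCUMENT_INDEXES.items()
    }
    return max(scores, key=scores.get)

class Session:
    """Per-user chat history plus the routing step."""
    def __init__(self) -> None:
        self.history: list[tuple[str, str]] = []

    def ask(self, question: str) -> str:
        index_name = route_question(question)
        # A real engine call against the chosen index goes here.
        answer = f"[answered from '{index_name}' index]"
        self.history.append((question, answer))
        return answer

sessions: dict[str, Session] = {}

def handle(user_id: str, question: str) -> str:
    # setdefault gives each user exactly one Session, created lazily
    session = sessions.setdefault(user_id, Session())
    return session.ask(question)
```

The `sessions.setdefault(...)` line plays the same role as the `user_sessions` dictionary above: one engine (here, one `Session`) per user, so histories never mix.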


About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.
