Allow Chat Generators to connect to Answer Builder #7839

Open
lbux opened this issue Jun 10, 2024 · 3 comments

lbux commented Jun 10, 2024

Is your feature request related to a problem? Please describe.
There is currently no way for a ChatGenerator to connect directly to AnswerBuilder(), despite it providing functionality similar to a regular Generator.

Describe the solution you'd like
We can extract the text from a ChatMessage and use it as the str that AnswerBuilder's code expects.
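
For example (a minimal sketch; this assumes the Haystack 2.x ChatMessage dataclass, where the reply text is stored in the content attribute):

from haystack.dataclasses import ChatMessage

# Normalize a reply to str before AnswerBuilder's existing logic runs.
reply = ChatMessage.from_assistant("Paris is the capital of France.")
text = reply.content if isinstance(reply, ChatMessage) else reply
# text is now the plain str that AnswerBuilder already knows how to handle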


Additional context
ChatGenerators that rely on non-OpenAI models require a specific chat format to be applied in order to properly differentiate between system/user/assistant messages. In regular generators this is done manually (and is error-prone). Here is an example pulled from a Haystack notebook:

prompt_template = """
<|begin_of_text|><|start_header_id|>user<|end_header_id|>


Using the information contained in the context, give a comprehensive answer to the question.
If the answer cannot be deduced from the context, do not give an answer.

Context:
  {% for doc in documents %}
  {{ doc.content }} URL:{{ doc.meta['url'] }}
  {% endfor %};
  Question: {{query}}<|eot_id|>

<|start_header_id|>assistant<|end_header_id|>


"""
prompt_builder = PromptBuilder(template=prompt_template)

This can be accomplished by using a ChatGenerator as follows:

system_message = ChatMessage.from_system(
    """
    Read the context provided and answer the question if possible. If you cannot form an answer from the context, reply with "Nah".
    Context:
    {% for doc in documents %}
    {{ doc.content }}
    {% endfor %}
    """
)
user_message = ChatMessage.from_user("query: {{query}}")
assistant_message = ChatMessage.from_assistant("Answer: ")

We can then feed the ChatGenerator's output into an AnswerBuilder to complete the pipeline, ensuring that the proper chat template is applied, as sketched below.
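
For illustration, a rough sketch of the pipeline wiring this would enable. OpenAIChatGenerator stands in for any ChatGenerator, the component names are placeholders, and the final connect call is exactly what this feature request asks for (today llm.replies is List[ChatMessage] while answer_builder.replies expects List[str]):

from haystack import Pipeline
from haystack.components.builders import AnswerBuilder, ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator

pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder(template=[system_message, user_message, assistant_message]))
pipe.add_component("llm", OpenAIChatGenerator())
pipe.add_component("answer_builder", AnswerBuilder())

pipe.connect("prompt_builder.prompt", "llm.messages")
# The connection below is the feature request: it currently fails the type check
# because replies arrive as ChatMessage objects rather than plain strings.
pipe.connect("llm.replies", "answer_builder.replies")

# documents and query are supplied at run time, e.g.:
# pipe.run(data={"prompt_builder": {"documents": docs, "query": q}, "answer_builder": {"query": q}})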

lbux commented Jun 10, 2024

I'd also like to add that I have implemented this locally for my hybrid pipeline that utilizes LlamaCppChatGenerator and the new ChatPromptBuilder. Everything seems to work. It's just lacking the .meta support, but that can be added at a different time if needed.

anakin87 (Member) commented

@lbux What approach did you use? Did you create a custom component?

lbux commented Jun 18, 2024

@anakin87 There is no need to write an additional component. A ChatMessage already contains the necessary information, and supporting it would only require 3-4 new lines in AnswerBuilder. My quick-and-dirty implementation allows either ChatMessages or str (the default) as input. If a str is passed in, we keep the same logic that already exists in AnswerBuilder. If a ChatMessage is passed in, we extract the message string and reuse the existing str logic.
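
Something along these lines (a sketch of the idea only, not the actual patch; _unwrap_replies is a hypothetical helper name, and the Union typing and content attribute access are assumptions about the change):

from typing import List, Union
from haystack.dataclasses import ChatMessage

def _unwrap_replies(replies: List[Union[str, ChatMessage]]) -> List[str]:
    # Hypothetical helper inside AnswerBuilder: ChatMessage replies get their
    # text extracted, while plain str replies pass through unchanged to the
    # existing str-based logic.
    return [reply.content if isinstance(reply, ChatMessage) else reply for reply in replies]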
