Page Not Found for MultiQueryRetriever Documentation #1533
Comments
Hey @mirzasreza, nice to see you around here again! Hope you're doing well. 🚀 Thank you for bringing this to our attention. The link to the MultiQueryRetriever documentation is indeed broken. I checked the langflow codebase and found the reference to this documentation in the MultiQueryRetriever component. While I couldn't find a specific reason why the documentation link is broken, or any recent commits or changes to the URL structure that could have caused this, the component's own code documents how it is configured and used.

I hope this information is helpful in the absence of the specific documentation page. If you need further clarification, please let me know.
@dosu-bot can you suggest an example workflow for MultiQueryRetriever?
Sure, I can provide a practical example of how to use the `MultiQueryRetrieverComponent`. Here's an example code snippet demonstrating these steps:

```python
from langchain.llms import YourCustomLLM  # assuming you have a custom LLM implementation
from langchain.retrievers import YourVectorRetriever  # assuming you have a vector database retriever
from langflow.components.retrievers.MultiQueryRetriever import MultiQueryRetrieverComponent

# Initialize your LLM and retriever
llm = YourCustomLLM(...)
retriever = YourVectorRetriever(...)

# Initialize the MultiQueryRetrieverComponent
multi_query_retriever_component = MultiQueryRetrieverComponent()

# Optionally, define a custom prompt
custom_prompt = {
    "input_variables": ["question"],
    "template": "As an AI, generate 5 different interpretations of the question: {question}",
    "template_format": "f-string",
}

# Build the MultiQueryRetriever with the default prompt
multi_query_retriever_default = multi_query_retriever_component.build(llm=llm, retriever=retriever)

# Or, build the MultiQueryRetriever with a custom prompt
multi_query_retriever_custom = multi_query_retriever_component.build(
    llm=llm, retriever=retriever, prompt=custom_prompt
)

# Either retriever can now be used to generate multiple versions of a user
# question and retrieve relevant documents from the vector database.
```
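As an aside, the `"f-string"` template format means the template is filled with plain Python string formatting. A minimal, standalone illustration (no langflow required; the `render_prompt` helper is hypothetical, for demonstration only):

```python
# Illustrative only: how an "f-string"-format template is rendered.
# The template text mirrors the custom_prompt defined above.
template = "As an AI, generate 5 different interpretations of the question: {question}"

def render_prompt(template: str, **variables: str) -> str:
    """Fill an f-string-style template with the given input variables."""
    return template.format(**variables)

prompt_text = render_prompt(template, question="What is a vector database?")
print(prompt_text)
# → As an AI, generate 5 different interpretations of the question: What is a vector database?
```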
This example demonstrates how to initialize the component, optionally pass a custom prompt, and build a retriever that generates multiple versions of a user question and retrieves relevant documents from the vector database.

The exact structure and format of the default prompt template is:

```json
{
    "input_variables": ["question"],
    "input_types": {},
    "output_parser": null,
    "partial_variables": {},
    "template": "You are an AI language model assistant. Your task is \nto generate 3 different versions of the given user \nquestion to retrieve relevant documents from a vector database. \nBy generating multiple perspectives on the user question, \nyour goal is to help the user overcome some of the limitations \nof distance-based similarity search. Provide these alternative \nquestions separated by newlines. Original question: {question}",
    "template_format": "f-string",
    "validate_template": false,
    "_type": "prompt"
}
```

The exact implementation of the `build` method is:

```python
def build(
    self,
    llm: BaseLLM,
    retriever: BaseRetriever,
    prompt: Optional[PromptTemplate] = None,
    parser_key: str = "lines",
) -> Union[Callable, MultiQueryRetriever]:
    if not prompt:
        return MultiQueryRetriever.from_llm(llm=llm, retriever=retriever, parser_key=parser_key)
    else:
        return MultiQueryRetriever.from_llm(llm=llm, retriever=retriever, prompt=prompt, parser_key=parser_key)
```

You can find more details in the source code of the component.
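To make the retrieval flow concrete without pulling in langflow or a real LLM, here is a minimal, self-contained sketch of what a multi-query retriever does: generate several rewrites of the question, run each against the underlying retriever, and return the de-duplicated union of results. All names below are illustrative stand-ins, not langflow or langchain APIs:

```python
from typing import Callable, List

def multi_query_retrieve(
    question: str,
    generate_queries: Callable[[str], List[str]],  # stand-in for the LLM query generator
    retrieve: Callable[[str], List[str]],          # stand-in for the vector-store retriever
) -> List[str]:
    """Return the de-duplicated union of documents over all generated queries."""
    seen = set()
    results: List[str] = []
    for query in generate_queries(question):
        for doc in retrieve(query):
            if doc not in seen:
                seen.add(doc)
                results.append(doc)
    return results

# Toy stand-ins for the LLM and the vector store:
def fake_generate(q: str) -> List[str]:
    return [q, f"alternative phrasing of: {q}"]

def fake_retrieve(q: str) -> List[str]:
    return ["doc-A"] if "alternative" in q else ["doc-A", "doc-B"]

docs = multi_query_retrieve("what is a retriever?", fake_generate, fake_retrieve)
print(docs)  # → ['doc-A', 'doc-B']
```

The de-duplication step is the point of the pattern: the query variants overlap in what they retrieve, and merging their results helps compensate for the limitations of a single distance-based similarity search.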
Describe the bug
While browsing the documentation for MultiQueryRetriever, I found that the link below is broken and the page is not found.
https://python.langchain.com/docs/modules/data_connection/retrievers/how_to/MultiQueryRetriever