Commit

Fixed no chunks crash bug
patw committed Nov 1, 2023
1 parent 009788a commit 85b225a
Showing 1 changed file with 1 addition and 1 deletion.
app.py: 1 addition, 1 deletion
@@ -147,7 +147,7 @@ def get_rag(question, search_k, search_score_cut, llm_prompt, llm_system, llm_te
     # Oh no! We have no chunks. Just return a generic "we can't help you"
     # Score cut offs really help prevent LLM abuse. This is your first guardrail.
     if answers == "":
-        return {"input": "no chunks found", "output": "No data was found to answer this question"}
+        return {"input": "no chunks found", "output": "No data was found to answer this question", "chunks": {}}

     # Replace the template tokens with the question and the answers
     prompt = llm_prompt.replace("%q%", question)
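The one-line fix keeps the early-return dictionary shape-compatible with the normal return path. A minimal sketch of the crash it prevents, assuming a hypothetical caller (`render_response` and the simplified `get_rag_*` stubs below are illustrations, not code from this repository):

```python
def get_rag_before_fix():
    # Early "no chunks" return as it looked before this commit:
    # note the missing "chunks" key.
    return {"input": "no chunks found",
            "output": "No data was found to answer this question"}

def get_rag_after_fix():
    # Early return with the empty "chunks" key added by this commit,
    # matching the keys the normal code path returns.
    return {"input": "no chunks found",
            "output": "No data was found to answer this question",
            "chunks": {}}

def render_response(result):
    # Hypothetical caller that reads result["chunks"] unconditionally.
    return f"{result['output']} ({len(result['chunks'])} chunks)"

try:
    render_response(get_rag_before_fix())
except KeyError as e:
    print("before fix:", e)   # KeyError: 'chunks'

print("after fix:", render_response(get_rag_after_fix()))
```

Returning the same set of keys from every branch means downstream code can index the response without defensive `dict.get` calls on each field.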
