# LLM endpoint chat

Load a PDF file and ask questions about it via llama_index, LangChain, and an LLM endpoint.

## Instructions

- Install the requirements:

  ```shell
  pip install -r requirements.txt -U
  ```

- Run the `chat_main.py` script to chat with the hosted LLM endpoint:

  ```shell
  python3 chat_main.py
  ```
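A chat script like this typically appends each user turn to the conversation history and posts it to the hosted endpoint. The following is only a minimal sketch of that loop, not the repo's actual code; the OpenAI-style message schema, the `model` name, and the endpoint URL are all assumptions:

```python
import json
import urllib.request


def build_chat_payload(history, user_msg, model="local-model"):
    """Append the new user turn and build an OpenAI-style chat payload.
    The message schema and model name are assumptions about the endpoint."""
    messages = history + [{"role": "user", "content": user_msg}]
    return {"model": model, "messages": messages, "temperature": 0.7}


def send_chat(endpoint, payload):
    """POST the payload to the hosted endpoint (hypothetical URL)."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Build a payload for a single turn; sending it requires a live endpoint.
    payload = build_chat_payload([], "Hello!")
    print(payload["messages"][-1]["content"])
```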

or

- Select a file from the menu, or replace the default `file.pdf` with the PDF you want to use.
- Run the `pdf_qa_main.py` script to ask questions about your PDF via llama_index, LangChain, and the hosted endpoint:

  ```shell
  python3 pdf_qa_main.py
  ```

- Ask any question about the content of the PDF.
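Under the hood, a PDF Q&A flow like this splits the document into chunks, retrieves the chunks most relevant to the question, and passes them to the LLM. The toy sketch below illustrates only the retrieval idea, using keyword overlap as a crude stand-in for llama_index's embedding-based similarity; the chunk size and scoring are illustrative assumptions:

```python
def chunk_text(text, size=200):
    """Split document text into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]


def top_chunk(chunks, question):
    """Return the chunk sharing the most words with the question
    (a crude stand-in for embedding similarity search)."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))
```

In the real pipeline, the selected chunks are sent to the hosted endpoint along with the question, so the model answers from the PDF's content rather than from memory alone.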


## Credits

This work was inspired by the chatPDF repo.