A code sample that shows how to use 🦜️🔗 LangChain, 🦙 llama_index and a hosted LLM endpoint to do a standard chat or Q&A about a PDF document

ishanshah9/octoml-llm-qa

LLM endpoint chat

Load a PDF file and ask questions about it via llama_index, LangChain and an LLM endpoint

Instructions

  • Install the requirements:
pip install -r requirements.txt -U
  • Run the chat_main.py script to chat with the hosted LLM endpoint:
python3 chat_main.py

or

  • Select a file from the menu, or replace the default file.pdf with the PDF you want to use.
  • Run the pdf_qa_main.py script to ask questions about your PDF file via llama_index, LangChain and the hosted endpoint:
python3 pdf_qa_main.py
  • Ask any question about the content of the PDF.
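The PDF Q&A flow above follows the usual retrieval pattern: the document text is split into chunks, the chunks most relevant to the question are retrieved, and those chunks are sent to the hosted LLM endpoint as context for the answer. A minimal stdlib-only sketch of that data flow (simple word-overlap scoring stands in for the embedding search llama_index actually performs; all names here are illustrative, not taken from the repo):

```python
# Illustrative sketch of the retrieval step behind a PDF Q&A pipeline.
# Real pipelines (e.g. llama_index) rank chunks by embedding similarity;
# word-overlap scoring is used here so the sketch runs with the stdlib only.

def chunk_text(text, chunk_size=40):
    """Split extracted PDF text into overlapping word chunks."""
    words = text.split()
    step = chunk_size // 2  # 50% overlap so an answer isn't cut at a boundary
    return [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), step)]

def score(chunk, question):
    """Count question words present in the chunk (stand-in for embeddings)."""
    q_words = set(question.lower().split())
    return len(q_words & set(chunk.lower().split()))

def build_prompt(text, question, top_k=2):
    """Pick the top-k chunks and wrap them in a Q&A prompt for the endpoint."""
    chunks = chunk_text(text)
    best = sorted(chunks, key=lambda c: score(c, question), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The resulting prompt string is what gets POSTed to the hosted LLM endpoint.
```

In the actual scripts, llama_index handles the chunking and retrieval and LangChain wraps the hosted endpoint as an LLM; this sketch only shows the shape of the data flow, not the repo's implementation.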


Credits:

This work was inspired by the chatPDF repo.
