# Chat with your PDF documents - optionally with a local LLM

## Installation

```bash
pip install -r requirements.txt
pip install -U pydantic==1.10.9
```

## Run it

```bash
streamlit run chat.py
```

## Running a local LLM

The easiest way to run a local LLM is to use LM Studio: https://lmstudio.ai/

The LLM I use in my conference talks (it works fine on a MacBook Pro M1 Max with 64 GB of RAM):
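LM Studio exposes an OpenAI-compatible server (by default at `http://localhost:1234/v1`), so the chat code can be pointed at it instead of the OpenAI API. Below is a minimal sketch using the `openai` Python client; the port is LM Studio's default and `"local-model"` is a placeholder name, not something taken from this repo:

```python
# Minimal sketch: point an OpenAI-compatible client at LM Studio's local server.
# Assumptions: LM Studio's server is running on its default port (1234) and a
# model is already loaded; "local-model" is a placeholder identifier.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default endpoint
    api_key="lm-studio",  # any non-empty string; the local server ignores it
)

response = client.chat.completions.create(
    model="local-model",  # LM Studio serves whichever model is currently loaded
    messages=[
        {"role": "system", "content": "You answer questions about the loaded PDF."},
        {"role": "user", "content": "Summarize the document in two sentences."},
    ],
)
print(response.choices[0].message.content)
```

The same idea applies to a LangChain-based RAG chain: swapping the base URL of the chat model is usually all that's needed to go from the hosted API to the local one.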
