Chat locally using leading open models built by the community, optimized and accelerated by NVIDIA's enterprise-ready inference runtime. (Updated Jul 25, 2024 · Python)
Scalable implementation of semantic search and an LLM-powered chatbot for an online store.
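To illustrate the semantic-search half of such a store chatbot, here is a minimal sketch. It is not the repository's implementation: real systems use a neural embedding model and a vector index, whereas this stand-in embeds product descriptions as bag-of-words vectors and ranks them by cosine similarity. The product list and function names are hypothetical.

```python
import math
from collections import Counter

def vectorize(text):
    # Toy stand-in for an embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical product catalog for the example.
products = [
    "wireless noise cancelling headphones",
    "stainless steel kitchen knife set",
    "bluetooth portable speaker",
]

def search(query, top_k=2):
    # Rank products by similarity to the query and return the best matches.
    qv = vectorize(query)
    ranked = sorted(products, key=lambda p: cosine(qv, vectorize(p)), reverse=True)
    return ranked[:top_k]

print(search("wireless speaker"))
```

In a production setup the `vectorize` step would be replaced by a sentence-embedding model, and the retrieved products would be passed to the LLM as context for the chatbot's answer.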