mistral
Here are 349 public repositories matching this topic...
A WebUI for Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Updated Jul 19, 2024 - Python
🤖 The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many more model architectures, and can generate text, audio, video, and images, with voice-cloning capabilities.
Updated Jul 20, 2024 - C++
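Because LocalAI exposes the same REST surface as OpenAI, pointing an existing client at it is mostly a matter of swapping the base URL. A minimal stdlib-only sketch; the port and model name are assumptions and depend on your LocalAI setup:

```python
import json
from urllib import request

# Assumed local endpoint; LocalAI serves an OpenAI-style REST API,
# so the only change from a stock OpenAI client is the base URL.
BASE_URL = "http://localhost:8080/v1"  # port is an assumption

def chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = chat_payload("mistral-7b", "Say hello.")
req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# request.urlopen(req)  # uncomment with a LocalAI server running
```

The same payload shape works against any of the OpenAI-compatible servers in this list.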
Low-code framework for building custom LLMs, neural networks, and other AI models
Updated Jul 11, 2024 - Python
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Updated Jul 17, 2024 - Python
Firefly: a training toolkit for large language models, supporting Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models.
Updated Jul 16, 2024 - Python
Generative AI suite powered by state-of-the-art models, providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, developer presets, and much more. Deploy on-prem or in the cloud.
Updated Jul 20, 2024 - TypeScript
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
Updated Jul 19, 2024 - Python
Enchanted is an iOS and macOS app for chatting with private, self-hosted language models such as Llama 2, Mistral, or Vicuna via Ollama.
Updated Jul 6, 2024 - Swift
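Apps like Enchanted (and Maid, further down) talk to a local Ollama server over its HTTP API. A minimal sketch of the same call from Python, assuming Ollama is listening on its default port 11434 and a `mistral` model has already been pulled:

```python
import json
from urllib import request

# Ollama's default local endpoint (assumption: server started with
# `ollama serve`; available model names depend on what you've pulled).
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

body = json.dumps(generate_request("mistral", "Why is the sky blue?")).encode()
req = request.Request(OLLAMA_URL, data=body,
                      headers={"Content-Type": "application/json"})
# resp = request.urlopen(req)          # uncomment with a live Ollama server
# print(json.load(resp)["response"])
```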
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
Updated Mar 31, 2024 - Python
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Updated Jul 11, 2024 - Rust
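A language server such as LSP-AI speaks JSON-RPC framed with a Content-Length header over stdin/stdout. A sketch of framing the `initialize` request every LSP client sends first (the params shown are a minimal illustrative subset, not LSP-AI-specific):

```python
import json

def lsp_frame(method: str, params: dict, msg_id: int = 1) -> bytes:
    """Frame a JSON-RPC request the way LSP transports it:
    a Content-Length header, a blank line, then the JSON body."""
    body = json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params}).encode()
    return b"Content-Length: %d\r\n\r\n%s" % (len(body), body)

# The first message an LSP client sends to any server.
msg = lsp_frame("initialize",
                {"processId": None, "rootUri": None, "capabilities": {}})
```

The framed bytes would be written to the server's stdin; responses come back on stdout with the same framing.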
Lightweight inference library for ONNX files, written in C++. It can run SDXL on a Raspberry Pi Zero 2, and Mistral 7B on desktops and servers.
Updated Jun 19, 2024 - C++
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
Updated Jun 20, 2024 - Python
[Knowledge Editing] [ACL 2024] An easy-to-use knowledge-editing framework for LLMs.
Updated Jul 19, 2024 - Jupyter Notebook
Create chatbots with ease
Updated Jul 19, 2024 - TypeScript
Python SDK for agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks, such as CrewAI, LangChain, and AutoGen.
Updated Jul 20, 2024 - Python
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
Updated Jul 18, 2024 - Dart