- amidiro.ai
- Aachen, Germany
- https://amidiro.ai
Stars
Language: Python · Sorted by: Most stars
The most powerful and modular Stable Diffusion GUI, API, and backend with a graph/nodes interface.
A Gradio web UI for Large Language Models.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Streamlit — A faster way to build and share data apps.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
Open-Sora: Democratizing Efficient Video Production for All
Data validation using Python type hints
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your d…
Convert PDF to markdown quickly with high accuracy
Finetune Llama 3, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it combines the best of RNNs and transformers: great performance, fast inference,…
The Triton Inference Server provides an optimized cloud and edge inference solution.
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Flax is a neural network library for JAX that is designed for flexibility.
Streamlined interface for generating images with AI in Krita. Inpaint and outpaint with optional text prompt, no tweaking required.
Tools for merging pretrained large language models.
Foundational model for human-like, expressive TTS
Code examples and resources for DBRX, a large language model developed by Databricks
Code for the paper "Evaluating Large Language Models Trained on Code"
Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch
A principled instruction benchmark on formulating effective queries and prompts for large language models (LLMs). Our paper: https://arxiv.org/abs/2312.16171
Common interface for interacting with AI agents. The protocol is tech stack agnostic - you can use it with any framework for building agents.
Utilities for decoding deep representations (like sentence embeddings) back to text
[ACL 2024] Official PyTorch code for extracting features and training downstream models with emotion2vec: Self-Supervised Pre-Training for Speech Emotion Representation