This repo contains a variety of standalone examples using the MLX framework.
The MNIST example is a good starting point to learn how to use MLX.
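To get a feel for the API before diving into the examples, here is a minimal sketch of MLX's lazy array operations (a hypothetical snippet, not taken from any example in this repo; it assumes MLX is installed, e.g. via `pip install mlx`):

```python
import mlx.core as mx

# Arrays live in unified memory and operations are evaluated lazily.
a = mx.array([1.0, 2.0, 3.0])
b = mx.array([4.0, 5.0, 6.0])
c = a * 2 + b      # builds a lazy computation graph
mx.eval(c)         # forces evaluation of the graph
print(c)           # array([6, 9, 12], dtype=float32)
```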
Other useful examples include:
- Transformer language model training.
- Large-scale text generation with LLaMA, Mistral, or Phi.
- Mixture-of-experts (MoE) language model with Mixtral 8x7B.
- Parameter-efficient fine-tuning with LoRA.
- Generating images with Stable Diffusion.
- Speech recognition with OpenAI's Whisper.
- Bidirectional language understanding with BERT.
- Semi-supervised learning on graph-structured data with GCN.
Check out the contribution guidelines for more information on contributing to this repo.