Stars
Transformers with Arbitrarily Large Context
A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal AI, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
Hamilton helps data scientists and engineers define testable, modular, self-documenting dataflows that encode lineage, tracing, and metadata. Runs and scales everywhere Python does.
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Attempt at reproducing an SGNN's projection layer, but with word n-grams instead of skip-grams. Paper and more: https://aclweb.org/anthology/D18-1105
Mesh TensorFlow: Model Parallelism Made Easier
Code for the AAAI 2018 publication "SEE: Towards Semi-Supervised End-to-End Scene Text Recognition"
🔥🔥An MXNet implementation of DenseNet (with BC structure)🔥🔥
Sparse matrix-matrix multiplication on CPU+GPU systems.
Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with a Dynamic, Mutation-aware Dataflow Dependency Scheduler; for Python, R, Julia, Scala, Go, JavaScript and more
LowRankModels.jl is a Julia package for modeling and fitting generalized low rank models.