ronghanghu
- Meta AI
- Menlo Park, CA
- https://ronghanghu.com/
Stars
A PyTorch implementation of Connected Components Labeling
[CVPR 2024 Highlight] GLEE: General Object Foundation Model for Images and Videos at Scale
Monocular Depth Estimation Toolbox based on MMSegmentation.
Kernl lets you run PyTorch transformer models several times faster on GPU with a single line of code, and is designed to be easily hackable.
Model parallel transformers in JAX and Haiku
Flax is a neural network library for JAX that is designed for flexibility.
Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more.
Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimentation and parallelization, and has demonstrated industry lead…
2nd solution of ICDAR 2021 Competition on Scientific Literature Parsing, Task B.
Making large AI models cheaper, faster and more accessible
JAX - A curated list of resources https://github.com/google/jax
ConvMAE: Masked Convolution Meets Masked Autoencoders
A paper list of some recent Transformer-based CV works.
torch-optimizer: a collection of optimizers for PyTorch
Scalable PaLM implementation in PyTorch
TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale.
Code & Models for 3DETR - an End-to-end transformer model for 3D object detection
functorch is JAX-like composable function transforms for PyTorch.
uploadcare/pillow-simd (forked from python-pillow/Pillow): The friendly PIL fork
PyTorch implementation of MAE https://arxiv.org/abs/2111.06377
Ongoing research training transformer models at scale
Dense Contrastive Learning (DenseCL) for self-supervised representation learning, CVPR 2021 Oral.