Flops counter for convolutional networks in pytorch framework (Python; updated May 3, 2024)
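A FLOPs counter for convolutional networks typically works by applying the standard analytic formula per layer. The sketch below is illustrative only, not that repository's implementation; the function name and the doubling convention (1 multiply-accumulate = 2 FLOPs) are assumptions.

```python
# Minimal sketch of per-layer conv FLOPs counting (illustrative, not the
# listed library's code). For one Conv2d layer:
#   MACs = (Cin / groups) * Kh * Kw * Cout * Hout * Wout
# and FLOPs = 2 * MACs under the common "multiply + add" convention.

def conv2d_flops(c_in, c_out, kernel, out_h, out_w, groups=1):
    """FLOPs for one Conv2d forward pass (bias ignored for brevity)."""
    kh, kw = kernel
    macs = (c_in // groups) * kh * kw * c_out * out_h * out_w
    return 2 * macs  # count multiplies and adds separately

# Example: a 7x7 conv, stride 2, on a 3x224x224 input (output 64x112x112)
flops = conv2d_flops(c_in=3, c_out=64, kernel=(7, 7), out_h=112, out_w=112)
print(f"{flops / 1e9:.2f} GFLOPs")  # → 0.24 GFLOPs
```

Hook-based counters in PyTorch do the same arithmetic at runtime, reading the kernel size and output shape from each module during a forward pass.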
[NeurIPS 2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
solo-learn: a library of self-supervised methods for visual representation learning powered by PyTorch Lightning
[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
The Enterprise-Grade Production-Ready Multi-Agent Orchestration Framework. Join our community: https://discord.com/servers/agora-999382051935506503
💭 Aspect-Based-Sentiment-Analysis: Transformer & Explainable ML (TensorFlow)
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (Pytorch and Tensorflow)
Punctuation Restoration using Transformer Models for High- and Low-Resource Languages
The official code repo of "HTS-AT: A Hierarchical Token-Semantic Audio Transformer for Sound Classification and Detection"
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
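Converting an abstractive summarization dataset to the extractive task is commonly done by greedy oracle selection: repeatedly pick the source sentence that most improves overlap with the reference summary. The sketch below uses plain word-overlap F1 as a stand-in for ROUGE; it is a hypothetical illustration of the general technique, not the listed repository's code.

```python
# Sketch of abstractive-to-extractive conversion via greedy oracle
# selection (illustrative; word-overlap F1 stands in for ROUGE).

def overlap_f1(candidate_words, reference_words):
    """F1 over unique-word overlap between candidate and reference."""
    if not candidate_words or not reference_words:
        return 0.0
    cand, ref = set(candidate_words), set(reference_words)
    common = len(cand & ref)
    p, r = common / len(cand), common / len(ref)
    return 2 * p * r / (p + r) if p + r else 0.0

def greedy_oracle(sentences, summary, max_picks=3):
    """Indices of source sentences forming the extractive oracle labels."""
    ref = summary.lower().split()
    picked, current = [], []
    for _ in range(max_picks):
        base = overlap_f1(current, ref)
        best_i, best_gain = None, 0.0
        for i, sent in enumerate(sentences):
            if i in picked:
                continue
            gain = overlap_f1(current + sent.lower().split(), ref) - base
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_i is None:  # no remaining sentence improves the score
            break
        picked.append(best_i)
        current += sentences[best_i].lower().split()
    return sorted(picked)
```

The indices returned become binary "include this sentence" labels for training an extractive model.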
[BMVC 2022] You Only Need 90K Parameters to Adapt Light: A Light Weight Transformer for Image Enhancement and Exposure Correction. SOTA for low-light enhancement at 0.004 seconds per image; try it for pre-processing.
The official code repo for "Zero-shot Audio Source Separation through Query-based Learning from Weakly-labeled Data", in AAAI 2022
GRIT: Faster and Better Image-captioning Transformer (ECCV 2022)
Tk-Instruct is a Transformer model that is tuned to solve many NLP tasks by following instructions.
Scikit-learn friendly library to interpret and prompt-engineer text datasets using large language models.
Lottery prediction based on transformer / LSTM models in PyTorch
Awesome datasets for Bangla language computing.
ShadowFormer (AAAI2023), Pytorch implementation
A neural spell checker
My implementation of "Algorithm of Thoughts: Enhancing Exploration of Ideas in Large Language Models"