[ICML2024] Unified Training of Universal Time Series Forecasting Transformers
Updated Jul 9, 2024 · Jupyter Notebook
🎉 PILOT: A Pre-trained Model-Based Continual Learning Toolbox
Collection of awesome parameter-efficient fine-tuning resources.
The code repository for "Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning" (CVPR24) in PyTorch.
dis-cyril is an Alexa-like assistant built using pre-trained models and buzin.
Registration of pre-trained models found on Hugging Face using blockchain technology
A concept demonstrating how machine learning (ML) can be integrated into web apps.
Streamlit app that predicts whether a painting is a van Gogh.
Welcome to a diverse collection of hands-on ML, AI, Python, and Data Science projects. Explore and learn through practical applications. Happy coding! 🚀
The "Object Detection and Identification Model" is an AI project built with the pretrained YOLOv3 model in Python, enabling efficient and accurate detection and identification of objects in images.
Representation Abstractions as Incentives for Reinforcement Learning Agents: A Robotic Grasping Case Study
Fine-tuning GPT-2 models with custom text corpora, utilizing Hugging Face's Transformers library and advanced training techniques for sophisticated text generation applications.
Landmark Detection using pre-trained models.