Spelman College, Mathematics
Atlanta (UTC -04:00)
https://www.erdikara.com/
Stars
WebUI for Fine-Tuning and Self-hosting of Open-Source Large Language Models for Coding
Foundational Models for State-of-the-Art Speech and Text Translation
PyTorch-based framework for solving parametric constrained optimization problems, physics-informed system identification, and parametric model predictive control.
All Algorithms implemented in Python
Tools to Transform a Time Series into Features and a Target, a.k.a. Supervised Learning
Chapyter: ChatGPT Code Interpreter in Jupyter Notebooks
[CVPR2024, Highlight] Official code for DragDiffusion
This repository is a curated collection of the most exciting and influential CVPR 2023 papers. 🔥 [Paper + Code]
An open-source framework for training large multimodal models.
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
Enables use of ChatGPT directly from the UI
This repository contains PyTorch implementations of four different models for speech emotion classification.
Book on MATLAB with Python 🐍
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gan…
Next generation FEniCS Form Compiler for finite element forms
Study of optimal control theory based on Suzanne Lenhart's book, with theoretical notes and numerical simulations.
Monolithic Fluid-Structure Interaction (FSI) solver
The standard data-centric AI package for data quality and machine learning with messy, real-world data and labels.
NMA deep learning course
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities