- Michigan State University
- https://cse.msu.edu/~wenhongz/
- https://orcid.org/0000-0003-0775-8538
- in/hongzhi-wen-68a15513a
Stars
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Must-read Papers on Knowledge Editing for Large Language Models.
Simple, minimal implementation of the Mamba SSM in one file of PyTorch.
The repository for the code of the UltraFastBERT paper
Chat Templates for 🤗 HuggingFace Large Language Models
Multiple NVIDIA GPUs or Apple Silicon for Large Language Model Inference?
1 minute of voice data is enough to train a good TTS model (few-shot voice cloning).
vits2 backbone with multilingual-bert
Fine-tune the Whisper speech recognition model to support training without timestamp data, training with timestamp data, and training without speech data. Accelerate inference and support Web deplo…
Accompanying repositories for our paper on graph foundation models
Python package implementing our method MatchCLOT for multimodal single-cell data integration
This repository contains a collection of papers and resources on Reasoning in Large Language Models.
Implementation of E(n)-Equivariant Graph Neural Networks, in PyTorch
Implementation of E(n)-Transformer, which incorporates attention mechanisms into Welling's E(n)-Equivariant Graph Neural Network
Implementation of Perceiver AR, DeepMind's new long-context attention network based on the Perceiver architecture, in PyTorch
scDiff: A General Single-Cell Analysis Framework via Conditional Diffusion Generative Models
Single-Cell Multimodal Prediction via Transformer
Official repo for CellPLM: Pre-training of Cell Language Model Beyond Single Cells.
[COLM 2024] LoraHub: Efficient Cross-Task Generalization via Dynamic LoRA Composition
Code for fine-tuning the Platypus family of LLMs using LoRA
A high-throughput and memory-efficient inference and serving engine for LLMs
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
[EMNLP 2023] The CoT Collection: Improving Zero-shot and Few-shot Learning of Language Models via Chain-of-Thought Fine-Tuning
Dropbox Uploader is a Bash script for uploading, downloading, listing, or deleting files on Dropbox, an online file sharing, synchronization, and backup service.
DecryptLogin: APIs for logging in to some websites using requests.
Reverse engineered API of Microsoft's Bing Chat AI