Stars
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
The Ethereum Improvement Proposal repository
A framework for few-shot evaluation of language models.
StyleTTS 2: Towards Human-Level Text-to-Speech through Style Diffusion and Adversarial Training with Large Speech Language Models
Run Mixtral-8x7B models in Colab or consumer desktops
Measuring Massive Multitask Language Understanding | ICLR 2021
Landmark Attention: Random-Access Infinite Context Length for Transformers
[ACL'24] Data and code for L-Eval, a comprehensive long context language models evaluation benchmark
Codes for the paper "∞Bench: Extending Long Context Evaluation Beyond 100K Tokens": https://arxiv.org/abs/2402.13718
Dataset and code for the EMNLP 2020 paper "HybridQA: A Dataset of Multi-Hop Question Answering over Tabular and Textual Data"
Positional Skip-wise Training for Efficient Context Window Extension of LLMs to Extreme Lengths (ICLR 2024)
[ACL 2024] Benchmarking the Hallucination of Chinese Large Language Models via Unconstrained Generation