Achazwl/README.md
  • 🌱 I'm a PhD student at Tsinghua University, THUNLP lab.
  • 🔭 I'm currently working on efficient/low-resource methods for NLP, pre-trained language models, and parameter-efficient tuning.
  • ⚡ I'm one of the maintainers of the following open-source projects: BMTrain, ModelCenter, OpenPrompt, and OpenDelta.



Pinned

  1. OpenBMB/BMTrain

    Efficient Training (including pre-training and fine-tuning) for Big Models

    Python · 541 stars · 74 forks

  2. OpenBMB/ModelCenter

    Efficient, Low-Resource, Distributed transformer implementation based on BMTrain

    Python · 230 stars · 28 forks

  3. thunlp/OpenPrompt

    An Open-Source Framework for Prompt-Learning

    Python · 4.3k stars · 441 forks

  4. thunlp/OpenDelta

    A plug-and-play library for parameter-efficient tuning (Delta Tuning)

    Python · 970 stars · 78 forks

  5. OpenBMB/MiniCPM

    MiniCPM-2B: An end-side LLM outperforming Llama2-13B

    Python · 4.7k stars · 334 forks

  6. thunlp/Ouroboros

    Ouroboros: Speculative Decoding with Large Model Enhanced Drafting

    Python · 60 stars · 8 forks