YJiangcm
- The Hong Kong University of Science and Technology
- Hong Kong SAR, China
- https://yjiangcm.github.io/
- @Yuxin_Jiang_
- https://scholar.google.com/citations?user=QnfcEEcAAAAJ
Stars
User-friendly Desktop Client App for AI Models/LLMs (GPT, Claude, Gemini, Ollama...)
Instruct-tune LLaMA on consumer hardware
Code and documentation to train Stanford's Alpaca models, and generate the data.
Dataset and code for “Going on a vacation” takes longer than “Going for a walk”: A Study of Temporal Commonsense Understanding, EMNLP 2019.
PyTorch implementation of SimSiam: https://arxiv.org/abs/2011.10566
Code for our paper "WordNet-Enhanced Dual Multi-head Co-Attention for Reading Comprehension of Abstract Meaning" in SemEval 2021
Code for "Global and Local Hierarchy-aware Contrastive Framework for Hierarchical Implicit Discourse Relation Recognition (ACL 2023)"
Exploiting Global and Local Hierarchies for Hierarchical Text Classification
This repository implements a prompt tuning model for hierarchical text classification. This work has been accepted as the long paper "HPT: Hierarchy-aware Prompt Tuning for Hierarchical Text Classi…
PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
Code for paper: "Coherent Hierarchical Multi-Label Classification Networks"
DAMO-ConvAI: the official codebase for Alibaba DAMO Conversational AI.
Hierarchy-Aware Global Model for Hierarchical Text Classification
Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features".
A Python interface to the Penn Discourse Treebank 2
ICML 2022: Learning Iterative Reasoning through Energy Minimization
A curated summary of the knowledge a natural language processing (NLP) engineer needs to accumulate, including interview questions, fundamentals, and engineering skills, to strengthen core competitiveness.
Pytorch version of BERT-whitening
Quality Controlled Paraphrase Generation (ACL 2022)
Source code of the paper "Do Syntax Trees Help Pre-trained Transformers Extract Information?" (EACL 2021)
Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
🚀 State-of-the-art parsers for natural language.
Code for 'A Label Dependence-aware Sequence Generation Model for Multi-level Implicit Discourse Relation Recognition (AAAI 2022)'
This repository implements a contrastive learning model for hierarchical text classification. This work has been accepted as the long paper "Incorporating Hierarchy into Text Encoder: a Contrastive…
The official code for PRIMERA: Pyramid-based Masked Sentence Pre-training for Multi-document Summarization