
Global and Local Hierarchy-aware Contrastive Framework for Hierarchical Implicit Discourse Relation Recognition (ACL 2023)

arXiv preprint: https://arxiv.org/abs/2211.13873

Requirements

  1. Install PyTorch by following the instructions from the official website.

  2. Install torch_geometric by following the instructions from the official website.

  3. Run the following command to install the remaining dependencies:

pip install -r requirements.txt
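
For reference, on a typical CUDA-enabled Linux machine the whole environment setup reduces to something like the commands below. This is only a sketch: the exact PyTorch and torch_geometric wheels depend on your CUDA version, so please follow the official installation pages linked above.

pip install torch
pip install torch_geometric
pip install -r requirements.txt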

Data Preparation before Training

  1. Download the PDTB 2.0 dataset and put it under /raw/
  2. Run the following command for data preprocessing:
python3 preprocess.py

(Note: PDTB 3.0 can be downloaded from https://catalog.ldc.upenn.edu/LDC2019T05. You can adapt preprocess.py to PDTB 3.0 with minor modifications.)
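
For reference, before running preprocess.py the repository is expected to look roughly like the sketch below; the file names inside /raw/ depend on your LDC distribution and are only placeholders, not part of this repository.

GOLF_for_IDRR/
├── raw/            # place the downloaded PDTB 2.0 (or 3.0) files here
├── preprocess.py
└── run.py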

Train, Evaluate, and Test

Run the following command for training, evaluation, and testing:

python3 run.py

(Our code runs on a single NVIDIA GeForce RTX 3090.)
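
If your machine has more than one GPU, you can pin the job to a single card in the standard way, for example (the device index 0 is only an example):

CUDA_VISIBLE_DEVICES=0 python3 run.py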

Citation

If you find this work helpful, please cite our paper as follows:

@inproceedings{jiang-etal-2023-global,
    title = "Global and Local Hierarchy-aware Contrastive Framework for Implicit Discourse Relation Recognition",
    author = "Jiang, Yuxin  and
      Zhang, Linhan  and
      Wang, Wei",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.findings-acl.510",
    pages = "8048--8064",
}
