ProbeGrammarRobustness

Source code for our ACL2020 paper: On the Robustness of Language Encoders against Grammatical Errors

Dependencies

Python >= 3.5

Download and install berkeleyparser.

Install the Python requirements from the requirements file: pip install -r requirements.txt

Preparation

Download datasets

The General Language Understanding Evaluation (GLUE) benchmark evaluates models' natural language understanding ability. We use several GLUE tasks as downstream tasks.

Follow the instructions in this repo to download the GLUE benchmark and unpack it to your $data_dir.
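Once unpacked, a quick sanity check like the following can confirm the layout under $data_dir (the task list and the default directory name below are assumptions, not settings required by this repository):

import os

# Sanity check: confirm the GLUE task folders exist under $data_dir.
# The task names listed here are assumptions; edit them to match the tasks you run.
data_dir = os.environ.get("data_dir", "glue_data")
for task in ["CoLA", "SST-2", "MRPC", "QNLI", "MNLI"]:
    status = "found" if os.path.isdir(os.path.join(data_dir, task)) else "missing"
    print(task, status)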

We collect error statistics from the CoNLL-2014 Shared Task on Grammatical Error Correction.

Follow the instructions on this page to download NUCLE Release 3.2 and the annotated test data.

Remember to change the file paths on lines 13 and 141 of utils/statistic_all.py to your own paths.
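For illustration only, the edits amount to pointing two path variables at your local copies of the data; the variable names below are hypothetical, not the actual contents of utils/statistic_all.py:

# Hypothetical sketch -- edit the real variables in utils/statistic_all.py instead.
nucle_path = "/your/path/to/NUCLE_Release3.2"           # line 13: NUCLE training data
conll_test_path = "/your/path/to/conll14st-test-data"   # line 141: annotated test data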

Download pre-trained models

For experiments with Infersent, download the fastText embeddings and the corresponding pre-trained Infersent model:


curl -Lo crawl-300d-2M.vec.zip https://s3-us-west-1.amazonaws.com/fasttext-vectors/crawl-300d-2M.vec.zip
curl -Lo examples/infersent2.pkl https://dl.fbaipublicfiles.com/senteval/infersent/infersent2.pkl
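
As a minimal sketch of loading these files, assuming the models.py module and the documented infersent2 hyperparameters from facebookresearch/InferSent (these settings are assumptions, not necessarily the configuration used in this repository):

import torch
from models import InferSent  # models.py from facebookresearch/InferSent

# Documented defaults for the infersent2 checkpoint; treat them as assumptions.
params_model = {"bsize": 64, "word_emb_dim": 300, "enc_lstm_dim": 2048,
                "pool_type": "max", "dpout_model": 0.0, "version": 2}
model = InferSent(params_model)
model.load_state_dict(torch.load("examples/infersent2.pkl"))
model.set_w2v_path("crawl-300d-2M.vec")  # unzip crawl-300d-2M.vec.zip first

# Build the vocabulary from your sentences, then encode them.
sentences = ["This is an example sentence."]
model.build_vocab(sentences, tokenize=True)
embeddings = model.encode(sentences, tokenize=True)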

Usage

Downstream task evaluations

Model layer evaluations

BERT masked language model evaluations
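
The exact commands for these evaluations are provided by the repository's scripts. As a rough illustration of what a masked-language-model check looks like with the PyTorch-Transformers library this project builds on (the model name and example sentence are assumptions), one can mask a token and inspect BERT's top predictions for that position:

import torch
from pytorch_transformers import BertTokenizer, BertForMaskedLM

# Illustrative masked-LM probe, not the repository's evaluation script.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

tokens = ["[CLS]"] + tokenizer.tokenize("The children [MASK] playing in the park .") + ["[SEP]"]
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])
mask_index = tokens.index("[MASK]")

with torch.no_grad():
    scores = model(input_ids)[0]  # (1, seq_len, vocab_size)
top_ids = torch.topk(scores[0, mask_index], 5)[1].tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))  # top-5 predictions for the masked slot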

Acknowledgement

Our framework builds on the PyTorch implementations of BERT and RoBERTa from PyTorch-Transformers, Infersent from SentEval, and ELMo from AllenNLP and Jiant.

We also borrowed and adapted code from the following repos: nlp_adversarial_examples, nmt_grammar_noise, interpret_bert.

We would like to thank the authors of these repos for their efforts.

Citation

If you find our work useful, please cite our ACL2020 paper: On the Robustness of Language Encoders against Grammatical Errors


@inproceedings{yin2020robustness,
  author = {Yin, Fan and Long, Quanyu and Meng, Tao and Chang, Kai-Wei},
  title = {On the Robustness of Language Encoders against Grammatical Errors},
  booktitle = {ACL},
  year = {2020}
}
