BERT and RoBERTa models fine-tuned on the MNLI dataset for binary entailment/non-entailment classification, together with an evaluation of how they handle figurative language.

Investigating BERT and RoBERTa Performance on Figurative Language

The fine-tuned checkpoints are available on the Hugging Face Hub:

RoBERTa: https://huggingface.co/an-eve/roberta-base-mnli-2-labels
BERT: https://huggingface.co/an-eve/bert-base-uncased-mnli-2-labels
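
Both checkpoints can be loaded with the Hugging Face transformers library. The sketch below is a minimal, illustrative usage example, not code from this repository: the label names are read from each checkpoint's config (the entailment/non-entailment index order is checkpoint-specific), and the idiom pair is invented for the demonstration.

```python
# Minimal inference sketch for the fine-tuned two-label NLI checkpoints.
# Label names are taken from the model config rather than hardcoded,
# since the label order depends on how the checkpoint was fine-tuned.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "an-eve/roberta-base-mnli-2-labels"  # or "an-eve/bert-base-uncased-mnli-2-labels"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

# An idiomatic premise paired with its literal paraphrase, the kind of
# pair the figurative-language evaluation probes (illustrative example).
premise = "After a long illness, he finally kicked the bucket."
hypothesis = "After a long illness, he finally died."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

prediction = logits.argmax(dim=-1).item()
print(model.config.id2label[prediction])  # e.g. "entailment" or "non-entailment"
```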

References

@inproceedings{stowe-2022,
    title = "IMPLI: Investigating NLI Models' Performance on Figurative Language",
    author = "Stowe, Kevin and Utama, Prasetya Ajie and Gurevych, Iryna",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = "05",
    year = "2022",
    publisher = "Association for Computational Linguistics",
    url = "tbd",
}
