Domain Adaptation of Transformers for English Word Segmentation

Supporting code for the paper "Domain Adaptation of Transformers for English Word Segmentation".

This is a slightly modified version of the original repository, adapted to work with Western languages.

To reproduce our results, use the BERT-Mini model made available at google-research/bert instead of BERT-Base Chinese.
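
As a quick sanity check before running the repository's own scripts, the sketch below shows how a BERT-Mini checkpoint could be loaded for word segmentation framed as character-level token classification. The Hugging Face model id (`google/bert_uncased_L-4_H-256_A-4`, commonly used to distribute BERT-Mini) and the two-label boundary scheme are illustrative assumptions, not part of this repository.

```python
# Minimal sketch, not the repository's training pipeline: load a BERT-Mini
# checkpoint and treat word segmentation as character-level token
# classification. Model id and label scheme are assumptions for illustration.
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "google/bert_uncased_L-4_H-256_A-4"  # BERT-Mini (assumed Hugging Face id)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name,
    num_labels=2,  # e.g. label 1 = "this character starts a new word"
)

# Feed the unsegmented string character by character so each character
# receives its own boundary prediction.
text = "wordsegmentation"
inputs = tokenizer(list(text), is_split_into_words=True, return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # (1, number of sub-tokens incl. [CLS]/[SEP], 2)
```

The classification head above is randomly initialized; the repository's scripts fine-tune it on the segmentation datasets that the paper organizes under a unified format.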

For more information about this project, please refer to the original README.

Citation

@InProceedings{10.1007/978-3-030-61377-8_33,
  author="Rodrigues, Ruan Chaves
  and Rocha, Acquila Santos
  and Inuzuka, Marcelo Akira
  and do Nascimento, Hugo Alexandre Dantas",
  editor="Cerri, Ricardo
  and Prati, Ronaldo C.",
  title="Domain Adaptation of Transformers for English Word Segmentation",
  booktitle="Intelligent Systems",
  year="2020",
  publisher="Springer International Publishing",
  address="Cham",
  pages="483--496",
  abstract="Word segmentation can contribute to improve the results of natural language processing tasks on several problem domains, including social media sentiment analysis, source code summarization and neural machine translation. Taking the English language as a case study, we fine-tune a Transformer architecture which has been trained through the Pre-trained Distillation (PD) algorithm, while comparing it to previous experiments with recurrent neural networks. We organize datasets and resources from multiple application domains under a unified format, and demonstrate that our proposed architecture has competitive performance and superior cross-domain generalization in comparison with previous approaches for word segmentation in Western languages.",
  isbn="978-3-030-61377-8"
}
