Unified Multimodal Transformer (UMT) for Multimodal Named Entity Recognition (MNER)

Two MNER datasets and the code for our ACL 2020 paper: Improving Multimodal Named Entity Recognition via Entity Span Detection with Unified Multimodal Transformer.

Author

Jianfei Yu

[email protected]

July 1, 2020

Data

Requirements

  • PyTorch 1.0.0
  • Python 3.7
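A minimal environment sketch based on the two requirements above; the conda workflow and the plain pip wheel are assumptions, not part of the original instructions, so adjust the PyTorch build to your CUDA setup.

# Assumed setup (not from the original repo); pick the PyTorch build that matches your GPU drivers.
conda create -n umt python=3.7
conda activate umt
pip install torch==1.0.0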

Code Usage

Training for UMT

  • The following command runs the training code, which tunes parameters on the dev set and evaluates on the test set. Note that you can change "CUDA_VISIBLE_DEVICES=2" in the script according to your available GPUs (see the sketch after this list).
sh run_mtmner_crf.sh
  • Our running logs for twitter-2015 and twitter-2017 are provided in the folder "log files". Note that these results are slightly lower than those reported in our paper, since the experiments were run on different servers.
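As an illustrative sketch, training on a different GPU only requires editing the GPU assignment inside the script and re-launching it; the GPU index 0 below is an assumption for illustration, not taken from the repository.

# Illustrative only: replace the CUDA_VISIBLE_DEVICES=2 assignment in
# run_mtmner_crf.sh with the id of a free GPU, e.g. CUDA_VISIBLE_DEVICES=0,
# then launch training as before:
sh run_mtmner_crf.sh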

Acknowledgements

  • By using these two datasets, you agree that you have read and accepted the copyright terms set by Twitter and the original dataset providers.
