

MT-BERT

Table of Contents

About The Project

In MT-BERT we reproduce a neural language understanding model based on the paper by Liu et al. (2019). This model implements a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple NLU tasks. MT-DNN extends the model proposed by Liu et al. (2015) by incorporating a pre-trained bidirectional transformer language model, known as BERT.
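The core idea of MT-DNN is a single shared encoder (BERT) whose output feeds separate task-specific heads. As an illustration only (not the repo's actual code), here is a toy sketch of that architecture in NumPy, with a stand-in "encoder" and two hypothetical GLUE-style heads:

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 8  # toy hidden size; BERT-base actually uses 768

# Toy stand-in for the shared BERT encoder: one linear layer + tanh.
W_shared = rng.normal(size=(HIDDEN, HIDDEN))

def encode(x):
    """Shared representation used by every task (plays the role of BERT)."""
    return np.tanh(x @ W_shared)

# Task-specific output heads, one per NLU task (illustrative GLUE tasks).
heads = {
    "cola": rng.normal(size=(HIDDEN, 2)),   # binary acceptability judgement
    "mnli": rng.normal(size=(HIDDEN, 3)),   # 3-way textual entailment
}

def predict(task, x):
    """Run the shared encoder, then the head of the requested task."""
    return encode(x) @ heads[task]

x = rng.normal(size=(1, HIDDEN))
print(predict("cola", x).shape)  # (1, 2)
print(predict("mnli", x).shape)  # (1, 3)
```

Because the encoder weights are shared, a gradient step on any one task would also improve the representation used by all the others, which is the motivation for multi-task training.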

More details about the project are available in the presentation.

The original implementation is available in the official repo.

Built With

Getting Started

To get a local copy up and running, follow these steps.

Prerequisites

The project provides a Pipfile that can be managed with pipenv. Installing pipenv is strongly encouraged in order to avoid dependency and reproducibility problems.

  • pipenv
pip install pipenv

Installation

  1. Clone the repo
git clone https://gitlab.com/reddeadrecovery/mt-bert
  2. Install Python dependencies
pipenv install

Usage

Here's a brief description of each file in the repo:

  • model.py: model definition
  • task.py: task dataset definition and preprocessing
  • train_glue.py: multi-task training on GLUE
  • fine_tune_task.py: fine-tuning, domain adaptation and single-task training
  • utils.py: utility functions

There is also an executable Jupyter notebook: train.ipynb
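Multi-task training on GLUE (the job of train_glue.py) follows the MT-DNN procedure from Liu et al. (2019): at each epoch, mini-batches from all tasks are pooled and shuffled, so each optimization step updates the shared encoder on a single task's batch. A minimal sketch of that batch-mixing idea, with hypothetical task names (not the repo's actual code):

```python
import random

def mixed_batches(task_batches, seed=0):
    """Build one MT-DNN-style epoch: pool the mini-batches of every task
    and shuffle them, tagging each batch with the task it belongs to."""
    pool = [(task, batch)
            for task, batches in task_batches.items()
            for batch in batches]
    random.Random(seed).shuffle(pool)
    return pool

# Illustrative tasks with pre-built mini-batches (here just batch indices).
batches = {"cola": [0, 1], "sst2": [0, 1, 2]}

for task, batch in mixed_batches(batches):
    # Here one would run a forward/backward pass with the head for `task`.
    print(task, batch)
```

Shuffling across tasks (rather than training tasks sequentially) is what lets the shared encoder benefit from all datasets while avoiding catastrophic forgetting of earlier tasks.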

Authors

Acknowledgments

Machine Learning course held by Professor Paolo Frasconi - Computer Engineering Master's Degree @ University of Florence
