
Research on Tabular Deep Learning Model

This project trains a Tabular Deep Learning model on 8 files at once and stores the experimental results. It additionally uses wandb for experiment tracking.
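As a rough illustration of the wandb usage, here is a minimal logging sketch; the project name, config keys, and metric names below are assumptions, not this repository's actual setup.

import wandb

# Minimal wandb logging sketch; all names here are illustrative assumptions.
run = wandb.init(project="tabular-deep-learning",
                 config={"model": "fttransformer", "data": "microsoft"})
for epoch in range(10):
    val_score = 0.0  # placeholder: in practice, computed by the training loop
    wandb.log({"epoch": epoch, "val_score": val_score})
run.finish()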

For paper implementations, see the section "Papers and projects".

Set up the environment for evaluation

$ cd Researh
$ sh experiment.sh

Default

To run the default configuration: python main.py --action train --model fttransformer --data microsoft --savepath output
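A minimal sketch of the command-line interface implied by the command above, using argparse; the exact flag handling in main.py may differ.

import argparse

# Sketch of the CLI implied by the default command; main.py may define these differently.
parser = argparse.ArgumentParser(description="Tabular deep learning experiments")
parser.add_argument("--action", default="train")          # e.g. train
parser.add_argument("--model", default="fttransformer")   # e.g. fttransformer, resnet
parser.add_argument("--data", default="microsoft")        # dataset folder under Data/
parser.add_argument("--savepath", default="output")       # where results are written
args = parser.parse_args()
print(args)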

Results

Result information is saved to Output/model_name/data/default/info.json.
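A stored run can then be inspected roughly as follows; the path just follows the layout above, and the keys inside info.json are whatever the training run recorded.

import json
from pathlib import Path

# Path follows the Output/model_name/data/default layout described above.
info_path = Path("Output") / "fttransformer" / "microsoft" / "default" / "info.json"
with info_path.open() as f:
    info = json.load(f)
print(info)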

Datasets

We upload the datasets used in the paper with our train/val/test splits here. We do not impose any additional restrictions beyond the original dataset licenses; the data sources are listed in the paper appendix.

You can download and extract the datasets with the following commands:

conda activate tdl
cd $Researh
wget "https://www.dropbox.com/s/o53umyg6mn3zhxy/data.tar.gz?dl=1" -O rtdl_data.tar.gz
tar -xzvf rtdl_data.tar.gz
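After extraction, the splits can be loaded roughly as below; this assumes per-dataset folders of .npy split files as in the original rtdl data release, so the exact file names are assumptions.

import numpy as np
from pathlib import Path

# Assumed layout after extraction; file names follow rtdl-style .npy splits
# and may differ in the actual archive.
data_dir = Path("Data") / "microsoft"
X_train = np.load(data_dir / "N_train.npy")  # numerical features (assumed name)
y_train = np.load(data_dir / "y_train.npy")  # targets (assumed name)
print(X_train.shape, y_train.shape)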

File Structure

├── Data
│   ├── microsoft
│   │     └── ...
│   ├── yahoo
│   │     └── ...
│   └── etc..
├── Output
│   ├── ft-transformer
│   │     ├── microsoft
│   │     │     ├── default
│   │     │     └── ensemble
│   │     └── yahoo
│   │           └── etc..
│   └── resnet..
├── config.yaml     # Model architecture parameters
├── experiment.sh
├── main.py
├── infer.py
├── train.py
├── model.py
├── utils.py
etc..
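config.yaml can be read with a standard YAML loader, roughly as below; the keys shown in the comment are illustrative, not the project's actual schema.

import yaml

# Load model-architecture parameters; the actual keys depend on config.yaml.
with open("config.yaml") as f:
    config = yaml.safe_load(f)
print(config)  # e.g. {"fttransformer": {"n_blocks": 3, "d_token": 192}} (illustrative)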

Papers and projects

Name | Location | Comment
Revisiting Pretraining Objectives for Tabular Deep Learning | link | arXiv 2022
On Embeddings for Numerical Features in Tabular Deep Learning | link | arXiv 2022

How to cite

@article{park2022tabular,
    title   = {Research on Tabular Deep Learning Model},
    author  = {Wongi Park},
    journal = {GitHub},
    url     = {https://github.com/kalelpark/DeepLearning-for-Tabular-Data},
    year    = {2022},
}
