BIT-DA/PaRe
Enhancing Cross-Modal Fine-Tuning with Gradually Intermediate Modality Generation

[ICML 2024] Official Code of Enhancing Cross-Modal Fine-Tuning with Gradually Intermediate Modality Generation

Lincan Cai, Shuang Li, Wenxuan Ma, Jingxuan Kang, Binhui Xie, Zixun Sun, and Chengwei Zhu


Contribution

  • We gradually construct intermediate modalities between the source modality and the target modality, bridging the modality gap.
  • By mixing source-modality data with target-modality data to construct intermediate-modality data, we also alleviate the issue of insufficient data volume in the target modality.
  • We adopt Curriculum Learning, letting the model transition from intermediate-modality data closer to the source modality to data closer to the target modality, enabling a gradual transfer from easy to difficult tasks.
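The mixing-plus-curriculum idea above can be sketched as follows. This is a minimal illustrative interpolation, not PaRe's actual generation procedure (the function names, the convex-combination mixing, and the linear schedule are all assumptions for illustration):

```python
import numpy as np

def make_intermediate_batch(src_batch, tgt_batch, lam):
    """Mix source- and target-modality feature batches into an
    intermediate-modality batch. lam in [0, 1] controls how close the
    result is to the target modality (lam=0 -> pure source,
    lam=1 -> pure target). Simplification for illustration: a convex
    combination stands in for the paper's generation mechanism."""
    return (1.0 - lam) * src_batch + lam * tgt_batch

def curriculum_lambda(epoch, total_epochs):
    """Hypothetical linear curriculum schedule: early epochs stay close
    to the source modality (easy), later epochs approach the target
    modality (hard)."""
    return min(1.0, epoch / max(1, total_epochs - 1))

# Example: lam grows from 0 to 1 over training, so the model sees
# progressively more target-like intermediate data.
src = np.ones((4, 16))   # stand-in for source-modality features
tgt = np.zeros((4, 16))  # stand-in for target-modality features
for epoch in range(3):
    lam = curriculum_lambda(epoch, total_epochs=3)
    mixed = make_intermediate_batch(src, tgt, lam)
```
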

Requirements

  • Please refer to the Requirements section of ORCA.
  • Download the required datasets, and place the precomputed language features text_xs.py and text_ys.py in ./src/datasets when using RoBERTa models for 1D datasets.

Experiment with NAS-Bench-360

To reproduce the NAS-Bench-360 experiments, run the following command:

bash run_PaRe.sh

Citation

If you find this project useful in your research, please consider citing:

@inproceedings{caienhancing,
  title={Enhancing Cross-Modal Fine-Tuning with Gradually Intermediate Modality Generation},
  author={Cai, Lincan and Li, Shuang and Ma, Wenxuan and Kang, Jingxuan and Xie, Binhui and Sun, Zixun and Zhu, Chengwei},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024}
}

Acknowledgements

This project is built on ORCA. We thank the authors for making their source code publicly available.
