The Role of Model Architecture and Scale in Predicting Molecular Properties: Insights from Fine-Tuning RoBERTa, BART, and LLaMA
This code builds on our previous project, "The Role of Model Architecture and Scale in Predicting Molecular Properties: Insights from Fine-Tuning RoBERTa, BART, and LLaMA".
To install requirements:
pip install -r requirements_cuda118.txt
📋 The experiments were done under CUDA 11.8
The dataset files ./dataset_ft/Abraham***_cleared.csv are already preprocessed.
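Since the exact dataset filenames are elided above (the `***` part), here is a minimal sketch for locating and previewing the preprocessed CSV files, assuming they follow that naming pattern; the column layout is not documented here, so the snippet only prints each file's header:

```python
import csv
from glob import glob

# Match the preprocessed Abraham dataset files. The wildcard stands in
# for the elided part of the filename; actual names may differ.
dataset_files = sorted(glob("./dataset_ft/Abraham*_cleared.csv"))

for path in dataset_files:
    with open(path, newline="") as f:
        header = next(csv.reader(f))
        print(path, header)  # inspect column names before training
```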
To train the model(s) in the paper, move to AbraLLaMA
(the main directory) and run:
python run_auto_llama.py
To check the model's metrics, loss, etc., move to AbraLLaMA/evaluations:
metric_1 (RMSE), metric_2/loss (MAE)
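For reference, the two reported quantities can be computed as below. This is a generic sketch of RMSE and MAE over paired predictions and targets, not the project's exact evaluation code:

```python
import math

def rmse(y_true, y_pred):
    # metric_1: root-mean-square error
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def mae(y_true, y_pred):
    # metric_2 / loss: mean absolute error
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
```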
We have used one of the pretrained ChemLLaMA-MTR models from our previous project:
./model_mtr/ChemLlama_Medium_30m_vloss_val_loss=0.029_ep_epoch=04.ckpt
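A hedged sketch for loading this checkpoint. The `.ckpt` extension suggests a PyTorch Lightning checkpoint, but that is an assumption; adapt the loading call to your setup. The guard lets the snippet run even when the file is absent:

```python
import os

CKPT_PATH = "./model_mtr/ChemLlama_Medium_30m_vloss_val_loss=0.029_ep_epoch=04.ckpt"

def load_mtr_state(path):
    """Load the ChemLLaMA-MTR checkpoint if present (assumed to be a
    PyTorch/Lightning .ckpt file); returns None when it is missing."""
    if not os.path.exists(path):
        return None
    import torch  # deferred so the sketch runs without the file installed locally
    ckpt = torch.load(path, map_location="cpu")
    # Lightning checkpoints keep the weights under "state_dict"
    return ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

state = load_mtr_state(CKPT_PATH)
```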
You can also train the AbraLLaMA demo version with Jupyter:
- Open run_demo.ipynb
📋 MIT
Please use this code only for social good and positive impact.