E-TF-MOENAS: Enhanced Training-free Multi-Objective Neural Architecture Search

MIT licensed

Ngoc Hoang Luong, Quan Minh Phan, An Vo, Tan Ngoc Pham, Dzung Tri Bui

Setup

  • Clone the repository.
  • Install the necessary packages:
$ cd E-TF-MOENAS
$ bash install.sh

In our experiments, we do not directly use the API benchmarks published in their original repositories (i.e., NAS-Bench-101, NAS-Bench-201). Instead, we access their databases and log only the necessary information to create smaller databases.

You can compare our databases with the original ones in check_log_database.ipynb.
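To inspect that comparison yourself, you can open the notebook locally, for example with Jupyter (assuming Jupyter is installed in your environment; this README does not state whether install.sh installs it):

$ jupyter notebook check_log_database.ipynb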

Reproducing the results

You can reproduce the results in our article by running the script below:

$ python main.py --problem [NAS101, NAS201-C10] --algorithm <search-strategy> 
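For instance, the following illustrative invocation searches NAS-Bench-201 on CIFAR-10 with the E-TF-MOENAS optimizer (the available values for --algorithm are listed in the table below):

$ python main.py --problem NAS201-C10 --algorithm E-TF-MOENAS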

All problems in our article are bi-objective minimization problems. The objectives are the test error rate and one of the complexity/efficiency metrics (e.g., FLOPs, #params, latency).

To simulate real-world NAS settings (where the test performance cannot be accessed), the search strategies must optimize another metric instead of the test error rate during the search. Below is the list of supported optimizers and the performance metrics they use during the search:

| Search Strategy | Description | NAS-Bench-101 | NAS-Bench-201 |
|---|---|---|---|
| val_error | Optimize the validation error (at epoch 12) | ✔️ | ✔️ |
| val_loss | Optimize the validation loss (at epoch 12) | | ✔️ |
| train_loss | Optimize the training loss (at epoch 12) | | ✔️ |
| synflow | Optimize the Synaptic Flow metric | ✔️ | ✔️ |
| jacov | Optimize the Jacobian Covariance metric | ✔️ | ✔️ |
| snip | Optimize the SNIP metric | ✔️ | ✔️ |
| grad_norm | Optimize the Grad Norm metric | ✔️ | ✔️ |
| grasp | Optimize the GRASP metric | ✔️ | ✔️ |
| fisher | Optimize the Fisher metric | ✔️ | ✔️ |
| E-TF-MOENAS | Optimize Synaptic Flow and Jacobian Covariance | ✔️ | ✔️ |
| E-TF-MOENAS-C | Optimize the sum of Synaptic Flow and Jacobian Covariance | ✔️ | ✔️ |
| Free_MOENAS | Optimize the sum of the three training-free metrics reported in FreeREA | ✔️ | ✔️ |
| MOENAS_PSI | Similar to val_error but performs Pareto Local Search at each generation | ✔️ | ✔️ |
| MOENAS_TF_PSI | Similar to MOENAS_PSI but performs training-free Pareto Local Search | ✔️ | ✔️ |
| ENAS_TFI | Similar to val_error but performs a training-free warm-up stage at the beginning of the search | ✔️ | ✔️ |

Note #1: All variants use NSGA-II as the search optimizer.

Note #2: In our article, we only report the performance of algorithms on CIFAR-10. However, you can also search on CIFAR-100 and ImageNet16-120 (NAS-Bench-201) by changing the value of the --problem hyperparameter.

Search with different hyperparameters

Moreover, you can search with different hyperparameter settings; an example invocation is shown after the lists below.

For environment

  • problem: the benchmark problem; besides NAS101 and NAS201-C10, NAS201-C100 and NAS201-IN16 are supported
  • f0: the complexity/efficiency objective (flops, params, or latency; latency is only supported in NAS-Bench-201)
  • n_run: the number of runs of each algorithm (default: 31)
  • max_eval: the maximum number of evaluations per run (default: 3000)
  • init_seed: the initial random seed (default: 0)
  • res_path: the path for logging results (default: ./exp_res)
  • debug: print the search performance at each generation when set to True (default: False)

For algorithms

  • pop_size: the population size
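For example, the invocation below combines the flags listed above to search NAS-Bench-201 on CIFAR-100 with FLOPs as the complexity objective. It is only an illustrative combination: the pop_size value of 20 is a placeholder (the README does not state its default), and the other values simply repeat the defaults given above.

$ python main.py --problem NAS201-C100 --algorithm E-TF-MOENAS --f0 flops --n_run 31 --pop_size 20 --res_path ./exp_res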

Transferability Evaluation (for NAS-Bench-201 only)

In our article, we evaluate the transferability of the algorithms by taking the architectures found when searching on CIFAR-10 and evaluating them on CIFAR-100 and ImageNet16-120.

Source code for transferability evaluation can be found here.

Visualization and T-test

Source code for results visualization can be found here (for NAS-Bench-101) and here (for NAS-Bench-201).

Supplementary Material

The supplementary material can be found here.

Acknowledgement

Our source code is inspired by:

Citation

If you use our source code, please cite our work as:

N.H. Luong, Q.M. Phan, A. Vo, T.N. Pham, and D.T. Bui, Lightweight multi-objective evolutionary neural architecture search with low-cost proxy metrics,
Information Sciences, https://doi.org/10.1016/j.ins.2023.119856
