
This repository includes code to reproduce the tables in "Loss Landscapes are All You Need: Neural Network Generalization Can Be Explained Without the Implicit Bias of Gradient Descent"


Ping-C/optimizer


Loss Landscapes are All You Need: Neural Network Generalization Can Be Explained Without the Implicit Bias of Gradient Descent

This repository trains a large number of models in parallel with non-gradient-based optimizers.
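To illustrate what "non-gradient-based" means here, below is a minimal sketch (not the repository's actual code) of guess-and-check training: candidate weights are repeatedly sampled at random and kept only if they lower the training loss, so no gradients are ever computed. The 1-D linear model and all names are illustrative.

```python
import random

def loss(w, data):
    # Mean squared error of a 1-D linear model y = w * x.
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def random_search(data, steps=2000, seed=0):
    # Guess-and-check: sample weights uniformly, keep the best so far.
    rng = random.Random(seed)
    best_w = rng.uniform(-5, 5)
    best_loss = loss(best_w, data)
    for _ in range(steps):
        cand = rng.uniform(-5, 5)        # sample fresh weights at random
        cand_loss = loss(cand, data)
        if cand_loss < best_loss:        # accept only improvements
            best_w, best_loss = cand, cand_loss
    return best_w, best_loss

# Toy dataset with ground-truth weight w = 2.
data = [(x, 2.0 * x) for x in range(-5, 6)]
w, final_loss = random_search(data)
print(w, final_loss)
```

Because each model's search is independent, many such runs can be launched in parallel with no shared state beyond where the results are written.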

To set up the environment, use conda: conda env create -f environment.yml
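For completeness, the setup commands look like this (the environment name is hypothetical; check the name field in environment.yml):

```shell
# Create the environment from the spec in the repository root.
conda env create -f environment.yml
# Activate it; the actual name is whatever environment.yml declares.
conda activate optimizer
```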

All scripts for reproducing the tables in the paper "Loss Landscapes are All You Need: Neural Network Generalization Can Be Explained Without the Implicit Bias of Gradient Descent" (ICLR 2023) can be found in ./scripts.

train_distributed.py trains models in parallel on different hosts and then saves the resulting metrics to a single shared SQLite database.
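A sketch of that pattern, assuming a hypothetical schema (the table and column names below are illustrative, not the repository's actual ones): each worker trains its own model and then writes one row of metrics to the shared database. SQLite serializes concurrent writers itself, so hosts need no coordination beyond pointing at the same database file.

```python
import sqlite3

def init_db(path):
    # Create the metrics table once; safe to call from every worker.
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS metrics (
                       model_id INTEGER PRIMARY KEY,
                       host TEXT,
                       train_acc REAL,
                       test_acc REAL)""")
    con.commit()
    con.close()

def record_metrics(path, model_id, host, train_acc, test_acc):
    # timeout lets a worker wait out another worker's write lock.
    con = sqlite3.connect(path, timeout=30)
    with con:  # transaction: commits on success, rolls back on error
        con.execute("INSERT OR REPLACE INTO metrics VALUES (?, ?, ?, ?)",
                    (model_id, host, train_acc, test_acc))
    con.close()

# Two workers on different hosts reporting into the same database file.
init_db("metrics.db")
record_metrics("metrics.db", 0, "host-a", 0.99, 0.93)
record_metrics("metrics.db", 1, "host-b", 0.98, 0.94)

con = sqlite3.connect("metrics.db")
n_rows = con.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
con.close()
print(n_rows)
```

Using the model id as the primary key with INSERT OR REPLACE also makes writes idempotent, so a restarted worker can safely re-report its metrics.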
