
Combining Neural Networks and Symbolic Regression for Analytical Lyapunov Function Discovery

This repository contains the source code for the paper:

Combining Neural Networks and Symbolic Regression for Analytical Lyapunov Function Discovery

Jie Feng, Haohan Zou, and Yuanyuan Shi

Overview

In this work, we propose CoNSAL (Combining Neural networks and Symbolic regression for Analytical Lyapunov function) to construct analytical Lyapunov functions for nonlinear dynamical systems. Compared with prior approaches, our algorithm directly produces an analytical form of the Lyapunov function, with improved interpretability in both the learning process and the final result.


Training Stage: The learner employs an Input Convex Neural Network (ICNN) to minimize the Lyapunov risk and identify a neural Lyapunov function.
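
A minimal ICNN sketch in PyTorch is given below; it assumes softplus activations and non-negative hidden-to-hidden weights enforced by clamping. The class name, layer sizes, and activation choices are illustrative assumptions, not the repository's exact architecture.

# Minimal ICNN sketch (illustrative; layer sizes, activations, and the
# weight-clamping scheme are assumptions, not the repository's exact code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ICNN(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.W0 = nn.Linear(dim, hidden)                 # input layer, unconstrained
        self.Wz = nn.Linear(hidden, hidden, bias=False)  # kept non-negative below
        self.Wx = nn.Linear(dim, hidden)                 # skip connection, unconstrained
        self.out_z = nn.Linear(hidden, 1, bias=False)    # kept non-negative below
        self.out_x = nn.Linear(dim, 1)

    def forward(self, x):
        # Convexity in x holds because the hidden-to-hidden weights are
        # non-negative and softplus is convex and non-decreasing.
        z = F.softplus(self.W0(x))
        z = F.softplus(F.linear(z, self.Wz.weight.clamp(min=0)) + self.Wx(x))
        return F.linear(z, self.out_z.weight.clamp(min=0)) + self.out_x(x)

net = ICNN(dim=2)
V = net(torch.randn(8, 2))   # candidate Lyapunov values for a batch of states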

Visualization of the Lie derivative of the neural Lyapunov function under linear path-following dynamics, updated with the proposed algorithm, showing the results of the first two epochs:


Symbolic Regression Stage: The symbolic regression model from PySR approximates the neural network with analytical formulas.
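
The sketch below illustrates this stage with PySR. The sampling region, operator set, and search budget are assumptions, and `net` refers to the trained network from the sketch above.

# Sketch of fitting PySR to the neural Lyapunov function (settings illustrative).
import numpy as np
import torch
from pysr import PySRRegressor

X = np.random.uniform(-6, 6, size=(2000, 2))     # assumed sampling region
with torch.no_grad():
    y = net(torch.tensor(X, dtype=torch.float32)).squeeze(-1).numpy()

model = PySRRegressor(
    niterations=40,                              # assumed search budget
    binary_operators=["+", "*"],
    unary_operators=["square", "sin", "cos"],
    maxsize=20,
)
model.fit(X, y)
print(model.sympy())   # best analytical approximation of the network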

Verification Stage: The proposed falsifier, based on root finding, checks the Lyapunov conditions on the analytical formula in the specified state space of the dynamics.
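
A hedged sketch of such a falsifier is shown below for a 2-D system, with a placeholder candidate V and placeholder dynamics; it searches for violations of the negativity of the Lie derivative by numerical optimization from random starts. The repository's root-finding falsifier may locate zero crossings differently.

# Falsifier sketch (placeholder V, dynamics, and region; not the repo's exact code).
import numpy as np
import sympy as sp
from scipy.optimize import minimize

x1, x2 = sp.symbols("x1 x2")
V = x1**2 + x2**2                          # placeholder candidate from PySR
f = [x2, -x1 - x2]                         # placeholder dynamics x' = f(x)
Vdot = sum(sp.diff(V, s) * fi for s, fi in zip((x1, x2), f))
Vdot_fn = sp.lambdify((x1, x2), Vdot)

bounds = [(-6.0, 6.0)] * 2                 # assumed verification region
rng = np.random.default_rng(0)
for x0 in rng.uniform(-6, 6, size=(50, 2)):
    # maximize Vdot; a nonzero state with Vdot > 0 violates the conditions
    # (positivity of V can be checked analogously by minimizing V)
    res = minimize(lambda x: -Vdot_fn(*x), x0, bounds=bounds)
    if -res.fun > 1e-6 and np.linalg.norm(res.x) > 1e-3:
        print("counterexample:", res.x)
        break
else:
    print("no counterexample found in the sampled region")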

Visualization of SMT counterexamples, roots, and root-finding counterexamples, using the same checkpoint at epoch 1, for the Lie derivative of the neural Lyapunov function under linear path-following dynamics:


The same number of counterexamples is generated by each method. The zoomed-in region shows the counterexamples from the SMT solvers.

Procedure

A typical procedure is as follows:

  • Define the parameters for the symbolic regression model
  • Define a dynamical system
  • Set the checking conditions in the root-finding falsifier
  • Initialize the neural network with random parameters for neural Lyapunov function training
  • Start training and verifying
  • The procedure stops when no counterexample is found

The training process iteratively updates the parameters by minimizing the Lyapunov risk, a cost function that quantifies the violation of Lyapunov conditions. During the verification phase, counterexamples are periodically identified and added to the training set for subsequent iterations. This method enhances the neural network's ability to generalize and find a valid neural Lyapunov function.
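
A minimal sketch of this loop in PyTorch is given below. The risk form, training region, schedule, and the helper `dynamics` are assumptions for illustration; `net` is the ICNN sketched earlier, and the repository's exact loss weighting and counterexample handling may differ.

# Counterexample-guided training loop sketch (risk form and schedule assumed).
import torch

def dynamics(x):
    # placeholder dynamics x' = f(x); replace with the system of interest
    return torch.stack([x[:, 1], -x[:, 0] - x[:, 1]], dim=1)

def lyapunov_risk(net, x, f):
    # penalize violations of V(x) > 0 and Vdot(x) < 0, and anchor V(0) = 0
    x = x.clone().requires_grad_(True)
    V = net(x)
    gradV = torch.autograd.grad(V.sum(), x, create_graph=True)[0]
    Vdot = (gradV * f(x)).sum(dim=1, keepdim=True)   # Lie derivative along x' = f(x)
    V0 = net(torch.zeros(1, x.shape[1]))
    return (torch.relu(-V) + torch.relu(Vdot)).mean() + V0.pow(2).sum()

net = ICNN(dim=2)                                    # from the sketch above
samples = torch.empty(1000, 2).uniform_(-6, 6)       # assumed training region
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(2000):
    opt.zero_grad()
    loss = lyapunov_risk(net, samples, dynamics)
    loss.backward()
    opt.step()
    # periodically: run the falsifier and append any counterexamples, e.g.
    # samples = torch.cat([samples, counterexamples], dim=0)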

Examples

How to train

# Train on Van Der Pol Oscillator

cd Van_Der_Pol_Oscillator
python Van_Der_Pol_Oscillator_training.py

# Train on Linear Path Following

cd Linear_Path_Following
python Linear_Path_Following_training.py

Citation

@misc{feng2024combiningneuralnetworkssymbolic,
  title={Combining Neural Networks and Symbolic Regression for Analytical Lyapunov Function Discovery},
  author={Jie Feng and Haohan Zou and Yuanyuan Shi},
  year={2024},
  eprint={2406.15675},
  archivePrefix={arXiv},
  primaryClass={eess.SY},
  url={https://arxiv.org/abs/2406.15675},
}
