Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, Shirley Ho

Accepted to the ICLR 2020 Workshop on Deep Differential Equations
- Paper
- Blog
- Self-Contained Tutorial
- Paper example notebook: double pendulum
- Paper example notebook: special relativity
- Paper example notebook: wave equation
**Warning:** To use our implementation with more recent versions of JAX, change `jax.experimental.stax` to `jax.example_libraries.stax` and `jax.experimental.optimizers` to `jax.example_libraries.optimizers`. Please raise an issue if you encounter other deprecated functionality.
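For example, under recent JAX releases (where the `experimental` modules were relocated) the imports would look like this:

```python
# Old location (older JAX versions):
# from jax.experimental import stax, optimizers

# New location (recent JAX versions):
from jax.example_libraries import stax, optimizers
```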
In this project we propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks. In contrast to Hamiltonian Neural Networks, these models do not require canonical coordinates and perform well in situations where generalized momentum is difficult to compute (e.g., the double pendulum). This is particularly appealing for use with a learned latent representation, a case where HNNs struggle. Unlike previous work on learning Lagrangians, LNNs are fully general and extend to non-holonomic systems such as the 1D wave equation.
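The core idea can be sketched in a few lines of JAX: given any (learned or analytic) Lagrangian, the Euler-Lagrange equations can be solved for the accelerations via automatic differentiation, q̈ = (∇²_{q̇q̇}L)⁻¹(∇_q L − (∇²_{qq̇}L) q̇). The snippet below is a minimal illustration using a hand-written harmonic-oscillator Lagrangian in place of a neural network; the function names are hypothetical, not the repo's API.

```python
import jax
import jax.numpy as jnp

def lagrangian(q, q_dot):
    # Toy stand-in for a neural network: unit-mass 1D harmonic oscillator,
    # L = T - V = 0.5 * q_dot^2 - 0.5 * q^2
    return 0.5 * jnp.sum(q_dot ** 2) - 0.5 * jnp.sum(q ** 2)

def accelerations(lagrangian, q, q_dot):
    # Solve the Euler-Lagrange equations for q_ddot:
    #   q_ddot = (d^2L/dq_dot^2)^-1 (dL/dq - (d^2L/dq dq_dot) q_dot)
    grad_q = jax.grad(lagrangian, argnums=0)(q, q_dot)          # dL/dq
    hess_qdot = jax.hessian(lagrangian, argnums=1)(q, q_dot)    # d^2L/dq_dot^2
    mixed = jax.jacfwd(jax.grad(lagrangian, argnums=1), argnums=0)(q, q_dot)
    return jnp.linalg.solve(hess_qdot, grad_q - mixed @ q_dot)

q = jnp.array([1.0])
q_dot = jnp.array([0.0])
print(accelerations(lagrangian, q, q_dot))  # → [-1.], i.e. q_ddot = -q
```

Because every step is differentiable, the mean-squared error between these predicted accelerations and observed ones can be backpropagated through the Euler-Lagrange solve to train the Lagrangian itself.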
| | Neural Networks | Neural ODEs | HNN | DLN (ICLR'19) | LNN (this work) |
|---|---|---|---|---|---|
| Learns dynamics | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| Learns continuous-time dynamics | | ✔️ | ✔️ | ✔️ | ✔️ |
| Learns exact conservation laws | | | ✔️ | ✔️ | ✔️ |
| Learns from arbitrary coordinates | ✔️ | ✔️ | | ✔️ | ✔️ |
| Learns arbitrary Lagrangians | | | | | ✔️ |
This project is written in Python 3. Dependencies:

- JAX
- NumPy
- MoviePy (visualization)
- celluloid (visualization)