Nathan’s paper “Almost Surely Stable Deep Dynamics” has been accepted at NeurIPS 2020 and selected for a spotlight presentation. Congratulations! At NeurIPS 2020, roughly 1,900 papers were accepted out of ~11,000 submissions, and only 385 of those were selected as spotlights or orals.
Almost Surely Stable Deep Dynamics by Nathan P. Lawrence, Philip D. Loewen, Michael G. Forbes, Johan U. Backstrom and R. Bhushan Gopaluni
We introduce a method for learning provably stable deep neural network dynamic models from observed data. Specifically, we consider discrete-time stochastic dynamic models, as they are of particular interest in practical applications such as estimation and control. However, stochasticity and discrete-time dynamics exacerbate the challenge of guaranteeing stability of a neural network model. Our method constrains the dynamic model to be stable with respect to a neural network Lyapunov function. To this end, we propose two approaches: one exploits convexity of the Lyapunov function, while the other enforces stability through an implicit output layer. Numerical results are presented, and the accompanying code is (will be) publicly available.
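To give a flavour of the stability-by-construction idea, here is a minimal numerical sketch: a candidate next state is projected so that a Lyapunov decrease condition always holds. This is an illustration only, not the paper’s construction; it assumes a fixed quadratic Lyapunov function V(x) = ||x||², whereas the paper learns a neural network Lyapunov function, and the names `stable_step` and `beta` are hypothetical.

```python
import numpy as np

def stable_step(f, x, beta=0.99):
    """Illustrative sketch: rescale a candidate next state so that the
    Lyapunov decrease condition V(x_next) <= beta * V(x) always holds,
    using the fixed quadratic Lyapunov function V(x) = ||x||^2
    (the paper instead learns V as a neural network)."""
    x_next = f(x)
    v, v_next = np.dot(x, x), np.dot(x_next, x_next)
    if v_next > beta * v:  # decrease condition violated
        # Scale the candidate back onto the sublevel set {V <= beta * V(x)}.
        x_next = x_next * np.sqrt(beta * v / v_next)
    return x_next

# Example: an unstable linear map becomes stable after the projection.
A = np.array([[1.2, 0.0], [0.0, 0.8]])  # spectral radius > 1, so x -> A @ x diverges
f = lambda x: A @ x
x = np.array([1.0, 1.0])
for _ in range(50):
    x = stable_step(f, x)
# By construction V decays geometrically: V(x_k) <= beta**k * V(x_0).
```

The paper’s implicit-output-layer approach plays a similar role: the projection above is folded into the model itself, so stability is guaranteed for every parameter setting rather than checked after training.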
Read the pre-print: 2020C6_Lawrence_NeurIPS.pdf
Poster available on Figshare: Almost Surely Stable Deep Dynamics Poster