We study conservation laws along the (Euclidean or non-Euclidean) gradient and momentum flows of neural networks.

**Keep the Momentum: Conservation Laws beyond Euclidean Gradient Flows**, accepted at ICML 2024

1/ We define the **concept of conservation laws for momentum flows** and show how to extend the framework of our previous paper (*Abide by the Law and Follow the Flaw: Conservation Laws for Gradient Flows*, oral @NeurIPS23) to non-Euclidean gradient flow (GF) and momentum flow (MF) settings. **In stark contrast to the GF case, conservation laws for MF exhibit temporal dependence.**
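Schematically (the notation below is illustrative, not necessarily the paper's exact formalism), a conservation law is a function that stays constant along every trajectory of the flow, and in the momentum case it may depend explicitly on time and velocity:

```latex
% Illustrative notation (ours, not necessarily the paper's).
% Gradient flow (GF), possibly non-Euclidean via a metric g:
%   \dot{\theta} = -\nabla_g L(\theta)
% A conservation law is a function h(\theta) constant along trajectories:
\frac{\mathrm{d}}{\mathrm{d}t}\, h\big(\theta(t)\big) = 0 .
% Heavy-ball momentum flow (MF) with friction coefficient \gamma:
%   \ddot{\theta} + \gamma\,\dot{\theta} = -\nabla_g L(\theta)
% Here a conservation law may depend explicitly on time t and on the
% velocity \dot{\theta}, in contrast to the GF case:
\frac{\mathrm{d}}{\mathrm{d}t}\, h\big(t, \theta(t), \dot{\theta}(t)\big) = 0 .
```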

2/ We discover **new conservation laws** for linear networks in the Euclidean momentum case, and show that these laws form a complete set. In contrast, **there is no conservation law for ReLU networks in the Euclidean momentum case**.

3/ **In a non-Euclidean context**, such as NMF or ICNNs implemented with two-layer ReLU networks, **we discover new conservation laws for gradient flows** and find none in the momentum case. We also obtain **new conservation laws in the Natural Gradient Flow case**.

4/ We shed light on a quasi-systematic loss of conservation when transitioning from the GF to the MF setting.
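This loss of conservation can be checked numerically on a toy model. The sketch below (ours, not the paper's code; step sizes and the friction coefficient are illustrative choices) uses a two-layer scalar linear "network" f(x) = u·v·x with squared loss, for which u² − v² is a classical conserved quantity of Euclidean gradient flow; integrating the heavy-ball momentum ODE instead makes this quantity drift.

```python
def grad(u, v, y=1.0):
    """Gradients of the loss L(u, v) = (u*v - y)**2 / 2."""
    r = u * v - y          # residual of the scalar linear model
    return r * v, r * u    # dL/du, dL/dv

def simulate(momentum=False, steps=20000, dt=1e-3, gamma=1.0):
    """Euler-integrate GF or heavy-ball MF; return u^2 - v^2 at the end."""
    u, v = 1.5, 0.5
    pu = pv = 0.0          # velocities (only used by the momentum flow)
    for _ in range(steps):
        gu, gv = grad(u, v)
        if momentum:       # heavy-ball ODE: x'' + gamma * x' = -grad L
            pu += dt * (-gamma * pu - gu)
            pv += dt * (-gamma * pv - gv)
            u += dt * pu
            v += dt * pv
        else:              # plain gradient flow: x' = -grad L
            u -= dt * gu
            v -= dt * gv
    return u * u - v * v   # candidate conserved quantity

h0 = 1.5**2 - 0.5**2       # initial value of u^2 - v^2
# Under GF the drift is tiny (only discretization error); under MF the
# same quantity visibly moves away from its initial value.
print(abs(simulate(momentum=False) - h0))
print(abs(simulate(momentum=True) - h0))
```

The qualitative gap between the two printed numbers is the point: the GF law survives discretization to high accuracy, while the momentum flow breaks it by a margin that does not vanish as the step size shrinks.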