The process of machine learning can be considered in two stages: model selection and parameter estimation. In this paper a technique is presented for constructing dynamical systems with desired qualitative properties. The approach is based on the fact that an n-dimensional nonlinear dynamical system can be decomposed into one gradient and (n-1) Hamiltonian systems. Thus, the model selection stage consists of choosing the gradient and Hamiltonian portions appropriately so that a certain behavior is obtainable. To estimate the parameters, a stably convergent learning rule is presented. This algorithm is proven to converge to the desired system trajectory for all initial conditions and system inputs. The technique can be used to design neural network models that are guaranteed to solve the trajectory learning problem.
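To illustrate the gradient-plus-Hamiltonian decomposition in the simplest setting, the following is a minimal sketch (not the paper's algorithm) of a two-dimensional system whose vector field is the sum of a gradient part and a Hamiltonian part. The particular potential V and Hamiltonian H below are illustrative assumptions: the gradient part dissipates V and attracts trajectories to the unit circle, while the Hamiltonian part rotates them along it, producing a stable limit cycle.

```python
import math

def flow(x, y):
    """Vector field = gradient part + Hamiltonian part.

    Gradient part:    -grad V,  with V(x, y) = (x^2 + y^2 - 1)^2,
                      which attracts trajectories to the unit circle.
    Hamiltonian part: J grad H, with H(x, y) = (x^2 + y^2) / 2,
                      which rotates trajectories without changing H's level.
    (V and H are illustrative choices, not taken from the paper.)
    """
    r2 = x * x + y * y
    # -grad V = (-4x(r^2 - 1), -4y(r^2 - 1))
    gx, gy = -4.0 * x * (r2 - 1.0), -4.0 * y * (r2 - 1.0)
    # J grad H = (dH/dy, -dH/dx) = (y, -x)
    hx, hy = y, -x
    return gx + hx, gy + hy

def integrate(x, y, dt=1e-3, steps=20000):
    """Forward-Euler integration of the combined system."""
    for _ in range(steps):
        dx, dy = flow(x, y)
        x, y = x + dt * dx, y + dt * dy
    return x, y

if __name__ == "__main__":
    # Starting well outside the unit circle, the trajectory is pulled
    # onto it by the gradient part and circulates via the Hamiltonian part.
    xf, yf = integrate(2.0, 0.0)
    print(math.hypot(xf, yf))  # radius close to 1
```

Because d(x^2 + y^2)/dt = -8(x^2 + y^2)(x^2 + y^2 - 1) along the flow, the radius converges to 1 from any nonzero initial condition, which is the kind of qualitative property the decomposition lets one build in by construction.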
Advances in Neural Information Processing Systems: 274–280
Abdallah, Chaouki T., James W. Howse, and Gregory L. Heileman. "Gradient and Hamiltonian dynamics applied to learning in neural networks." (2012). https://digitalrepository.unm.edu/ece_fsp/64