On the derivatives of feed-forward neural networks
- Resource Type
- Working Paper
- Authors
- Khalek, Rabah Abdul; Bertone, Valerio
- Subject
- Physics - Computational Physics
- High Energy Physics - Phenomenology
- Language
- English
- Abstract
In this paper we present a C++ implementation of the analytic derivative of a feed-forward neural network with respect to its free parameters for an arbitrary architecture, known as back-propagation. We dub this code NNAD (Neural Network Analytic Derivatives) and interface it with the widely-used ceres-solver minimiser to fit neural networks to pseudodata in two different least-squares problems. The first is a direct fit of Legendre polynomials. The second is a somewhat more involved minimisation problem in which the function to be fitted appears inside an integral. Finally, using a consistent framework, we assess the efficiency of our analytic-derivative formula as compared to the numerical and automatic differentiation provided by ceres-solver. We thus demonstrate the advantage of using NNAD in problems involving both deep and shallow neural networks.