Authors: Stefania Bellavia, Nataša Krejić, Nataša Krklec Jerinkić, Marcos Raydan
The spectral gradient method is known to be a powerful low-cost tool for solving large-scale optimization problems. In this paper, our goal is to exploit its advantages in the stochastic optimization framework, especially in the case of mini-batch subsampling that is often used in big data settings.
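To illustrate the idea of combining a spectral (Barzilai-Borwein) step length with mini-batch subsampling, here is a minimal sketch on a synthetic least-squares problem. This is not the authors' algorithm: the problem, the batch size, the safeguard interval, and the choice of reusing the same batch for both gradients in the step computation are all our own illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, dim = 1000, 5
A = rng.standard_normal((n_samples, dim))
x_true = rng.standard_normal(dim)
b = A @ x_true + 0.01 * rng.standard_normal(n_samples)

def batch_grad(x, idx):
    """Mini-batch gradient of f(x) = (1/2|S|)||A_S x - b_S||^2 over subsample idx."""
    As, bs = A[idx], b[idx]
    return As.T @ (As @ x - bs) / len(idx)

x = np.zeros(dim)
alpha = 1e-3  # conservative initial step; the BB rule takes over after one iteration
for k in range(100):
    idx = rng.choice(n_samples, size=64, replace=False)
    g = batch_grad(x, idx)
    x_new = x - alpha * g
    # Use the same batch for both gradients so y reflects curvature,
    # not sampling noise between two different subsamples.
    y = batch_grad(x_new, idx) - g
    s = x_new - x
    sy = s @ y
    if sy > 1e-12:
        # BB1 spectral step, safeguarded to a bounded interval.
        alpha = min(max((s @ s) / sy, 1e-4), 1e2)
    x = x_new

print(np.linalg.norm(x - x_true))
```

The spectral step approximates an inverse Rayleigh quotient of the (mini-batch) Hessian, which is what makes the method cheap: no Hessian is ever formed, yet the step length adapts to local curvature.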
Authors: Nataša Krejić, Nataša Krklec Jerinkić, Angeles Martinez, Mahsa Yousefi
In this work, we introduce a novel stochastic second-order method, set within a non-monotone trust-region framework, for solving the unconstrained, nonlinear, non-convex optimization problems that arise in training deep neural networks.
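To make the trust-region machinery concrete, below is a minimal sketch of the basic ratio test and radius update on a deterministic quadratic, using a Cauchy-point step. It deliberately omits what the paper adds (stochasticity, a second-order Hessian approximation, and non-monotone acceptance against a reference of recent function values); the quadratic, the thresholds 0.25/0.75, and the update factors are our own illustrative assumptions.

```python
import numpy as np

H = np.diag([1.0, 10.0])          # toy model Hessian (exact here)
c = np.array([1.0, 1.0])

def f(x):    return 0.5 * x @ H @ x - c @ x
def grad(x): return H @ x - c

def cauchy_point(g, B, delta):
    """Minimizer of the quadratic model along -g within the ball ||p|| <= delta."""
    gBg = g @ B @ g
    gn = np.linalg.norm(g)
    tau = 1.0 if gBg <= 0 else min(gn**3 / (delta * gBg), 1.0)
    return -(tau * delta / gn) * g

x, delta = np.zeros(2), 1.0
for k in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = cauchy_point(g, H, delta)
    pred = -(g @ p + 0.5 * p @ H @ p)   # decrease predicted by the model
    ared = f(x) - f(x + p)              # decrease actually achieved
    rho = ared / pred                   # agreement between model and function
    if rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
        delta *= 2.0                    # good step at the boundary: expand region
    elif rho < 0.25:
        delta *= 0.25                   # poor agreement: shrink region
    if rho > 0.1:                       # monotone acceptance test
        x = x + p

print(np.linalg.norm(grad(x)))
```

A non-monotone variant would compare `ared` against a reference value (e.g. the maximum of recent function values) rather than `f(x)` itself, which lets the iterates escape regions where strict descent would stall.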