Title : Adaptive regularization algorithms for Objective-Function-Free Optimization
Abstract :
This work is about the fastest known optimization methods in terms of worst-case complexity that do not require the evaluation of the objective function. Such methods, termed Objective-Function-Free Optimization (OFFO), have gained popularity recently, especially in noisy contexts, including deep learning applications. We present an adaptive regularization OFFO algorithm for unconstrained nonconvex optimization. This algorithm belongs to the family of adaptive regularization methods, which are known for their optimal worst-case complexity results in the conventional framework where the objective function is evaluated. We demonstrate that these outstanding complexity bounds are maintained with the new algorithm, even though it relies on significantly less information. Specifically, we show that, using derivatives of degree one to p, the algorithm finds an 𝛆1-approximate first-order minimizer in at most O(𝛆1^(-(p+1)/p)) iterations, and an (𝛆1, 𝛆2)-approximate second-order minimizer in at most O(max{𝛆1^(-(p+1)/p), 𝛆2^(-(p+1)/(p-1))}) iterations. As a particular case, when using first and second derivatives and applied to functions with Lipschitz continuous Hessian, the algorithm finds an iterate xk at which the gradient’s norm is below 𝛆1 in at most O(𝛆1^(-3/2)) iterations. Numerical experiments demonstrate the excellent performance of this method on noisy problems, highlighting its robustness and efficiency across various scenarios.
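To give a flavour of the OFFO idea, the sketch below shows a minimal first-order (p = 1) variant: an adaptively regularized gradient method that never evaluates the objective f itself, only its gradient. The AdaGrad-like regularization weight is one illustrative choice and the function name `offo_ar1` is invented for this sketch; the actual algorithm presented in the seminar is more general (degree-p models, second-order guarantees) and may use a different weight.

```python
import numpy as np

def offo_ar1(grad, x0, n_iters=500, nu=1e-8):
    """Illustrative first-order OFFO sketch (not the speaker's exact
    method): each step uses only gradient information, with a
    regularization weight that grows with the accumulated squared
    gradient norms, so no objective-function value is ever needed."""
    x = np.asarray(x0, dtype=float)
    acc = nu  # accumulated squared gradient norms (nu avoids division by 0)
    for _ in range(n_iters):
        g = grad(x)
        acc += float(np.dot(g, g))
        w = np.sqrt(acc)   # adaptive regularization weight
        x = x - g / w      # regularized step: no f(x) evaluation anywhere
    return x

# usage: minimize f(x) = ||x||^2 / 2, whose gradient is simply x
x_star = offo_ar1(lambda x: x, np.array([3.0, -2.0]), n_iters=2000)
```

Note that the step never triggers an evaluation of f, which is precisely what makes such methods attractive when function values are noisy or unavailable.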
The seminar will take place in Room S08 at the Faculty of Sciences.