Stochastic Mirror Descent dynamics

Speaker
Mathias Staudigl
Date
18/01/2018, 11:30 - 13:00
Place
Economics building (504), faculty lounge on the first floor
Affiliation
Maastricht University
Abstract

In view of solving convex optimization problems with noisy gradient input, we analyze the asymptotic behavior of gradient-like flows under stochastic disturbances. Specifically, we focus on the widely studied class of mirror descent schemes for convex programs with compact feasible regions, and we examine the dynamics' convergence and concentration properties in the presence of noise. In the vanishing noise limit, we show that the dynamics converge almost surely to the solution set of the underlying problem. Otherwise, when the noise is persistent, we show that the dynamics are concentrated around interior solutions in the long run, and that they converge to boundary solutions that are sufficiently "sharp". Finally, we show that a suitably rectified variant of the method converges irrespective of the magnitude of the noise (or the structure of the underlying convex program), and we derive an explicit estimate for its rate of convergence. Extensions to monotone variational inequalities and non-convex problems are discussed.
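For orientation, the continuous-time dynamics referred to in the abstract can be sketched as follows. This is a minimal formulation in commonly used notation; the symbols f, h, Q and η, as well as the Brownian noise model, are assumptions based on standard presentations of stochastic mirror descent rather than the talk's exact statement:

\[
  dY_t = -\nabla f(X_t)\,dt + \sigma(X_t)\,dW_t,
  \qquad
  X_t = Q(\eta Y_t),
  \qquad
  Q(y) = \operatorname*{arg\,max}_{x \in \mathcal{X}} \bigl\{ \langle y, x \rangle - h(x) \bigr\},
\]

where f is the convex objective, \(\mathcal{X}\) its compact feasible region, h a strongly convex regularizer inducing the mirror map Q, \(W_t\) a Brownian motion modelling the gradient noise with volatility \(\sigma\), and \(\eta > 0\) a sensitivity parameter. The vanishing-noise and persistent-noise regimes discussed in the abstract then correspond, roughly, to \(\sigma(X_t) \to 0\) along trajectories versus \(\sigma\) bounded away from zero.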

Related Papers:

P. Mertikopoulos and M. Staudigl: On the convergence of gradient-like flows with noisy gradient input, https://arxiv.org/abs/1611.06730 (forthcoming in SIOPT).

P. Mertikopoulos and M. Staudigl: Stochastic mirror descent dynamics and their convergence in monotone variational inequalities, https://arxiv.org/abs/1710.01551 (submitted to JOTA).

P. Mertikopoulos and M. Staudigl: Convergence to Nash equilibrium in continuous games with noisy first-order feedback, http://mescal.imag.fr/membres/panayotis.mertikopoulos/files/ContinuousGames-CDC.pdf, in CDC '17: Proceedings of the 56th IEEE Annual Conference on Decision and Control.
