See also my Google Scholar page.
[9] Sadiev, A., Danilova, M., Gorbunov, E., Horvath, S., Gidel, G., Dvurechensky, P., Gasnikov, A. and Richtarik, P., 2023. High-probability bounds for stochastic optimization and variational inequalities: the case of unbounded variance, accepted to ICML 2023.
[8] Gorbunov*, E., Danilova*, M., Dobre*, D., Dvurechenskii, P., Gasnikov, A. and Gidel, G., 2022. Clipped stochastic methods for variational inequalities with heavy-tailed noise, accepted to NeurIPS 2022.
[7] Danilova, M. and Gorbunov, E., 2022. Distributed methods with absolute compression and error compensation, accepted to MOTOR 2022.
[6] Danilova, M., 2022. On the Convergence Analysis of Aggregated Heavy-Ball Method, accepted to MOTOR 2022.
[5] Danilova, M., Dvurechensky, P., Gasnikov, A., Gorbunov, E., Guminov, S., Kamzolov, D. and Shibaev, I., 2022. Recent theoretical advances in non-convex optimization. In High-Dimensional Optimization and Probability: With a View Towards Data Science.
[4] Danilova, M. and Malinovsky, G., 2021. Averaged heavy-ball method. Computer Research and Modeling.
[3] Gorbunov, E., Danilova, M., Shibaev, I., Dvurechensky, P. and Gasnikov, A., 2021. Near-optimal high probability complexity bounds for non-smooth stochastic optimization with heavy-tailed noise.
[2] Gorbunov, E., Danilova, M. and Gasnikov, A., 2020. Stochastic optimization with heavy-tailed noise via accelerated gradient clipping, accepted to NeurIPS 2020.
[1] Danilova, M., Kulakova, A. and Polyak, B., 2020. Non-monotone behavior of the heavy ball method, accepted to the 24th ICDEA.
[10] NeurIPS 2022, New Orleans, USA.
Poster: "Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise" (presented by E. Gorbunov). Links: poster.
[9] AI Journey 2022, Moscow, Russia.
Poster: "Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise".
[8] MOTOR 2022, Petrozavodsk, Russia.
Talk: "On the Convergence Analysis of Aggregated Heavy-Ball Method".
[7] MOTOR 2022, Petrozavodsk, Russia.
Talk: "Distributed methods with absolute compression and error compensation" (presented by E. Gorbunov).
[6] The 64th International MIPT Scientific Conference 2021, Moscow, Russia.
Talk: "Aggregated Momentum Gradient Method".
[5] QIPA 2021, Sochi, Russia.
Talk: "Averaged Heavy-Ball Method".
[4] Optimization without Borders 2021, Sochi, Russia.
Poster: "Stochastic optimization with heavy-tailed noise via accelerated gradient clipping".
[3] NeurIPS 2020, online.
Poster: "Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping". Links: video, poster.
[2] The 24th ICDEA, Dresden, Germany.
Talk: "Non-monotone behavior of the heavy ball method" (presented by A. Kulakova).
[1] Workshop "Optimization algorithms and applications in statistical learning", Grenoble, France.
Talk: "The non-monotonicity effect of accelerated optimization methods".