
Author(s):

Горбунов Э.А., Данилова М.Ю., Гасников А.В.

Number of authors:

3

Publication details

Publication type:

Conference paper

Title:

Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping

Conference:

  • 34th Conference on Neural Information Processing Systems (NeurIPS 2020; held virtually)

Source:

  • Advances in Neural Information Processing Systems 33 (NeurIPS 2020)

City:

  • Virtual

Editors:

  • H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan, H. Lin

Year:

2020

Pages:

1–60

URL:

https://arxiv.org/pdf/2005.10785.pdf
Abstract:

In this paper, we propose a new accelerated stochastic first-order method called clipped-SSTM for smooth convex stochastic optimization with heavy-tailed distributed noise in stochastic gradients and derive the first high-probability complexity bounds for this method, closing the gap in the theory of stochastic optimization with heavy-tailed noise. Our method is based on a special variant of accelerated Stochastic Gradient Descent (SGD) and clipping of stochastic gradients. We extend our method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this case. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without the light-tails assumption on the noise.
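As a concrete illustration of the clipping idea mentioned in the abstract, below is a minimal Python sketch of the standard gradient-clipping operator clip(g, lam) = min(1, lam/||g||) * g applied inside a plain (non-accelerated) SGD loop. This is not the paper's clipped-SSTM, which combines clipping with an accelerated SGD variant; the step size gamma, clipping level lam, and the heavy-tailed toy problem are illustrative assumptions, not values from the paper.

    import numpy as np

    def clip(g, lam):
        # Norm clipping: clip(g, lam) = min(1, lam / ||g||) * g
        norm = np.linalg.norm(g)
        return g if norm <= lam else (lam / norm) * g

    def clipped_sgd(x0, stoch_grad, gamma, lam, n_steps):
        # Plain SGD with clipped stochastic gradients (illustrative;
        # not the accelerated clipped-SSTM analyzed in the paper).
        x = x0
        for _ in range(n_steps):
            x = x - gamma * clip(stoch_grad(x), lam)
        return x

    # Toy example: f(x) = 0.5 * ||x||^2, with the gradient corrupted by
    # heavy-tailed Student-t noise (df=2, infinite variance) — the noise
    # regime the paper targets.
    rng = np.random.default_rng(0)
    stoch_grad = lambda x: x + rng.standard_t(df=2, size=x.shape)
    x_final = clipped_sgd(np.ones(10), stoch_grad, gamma=0.05, lam=1.0, n_steps=5000)

With df=2 the noise has a finite mean but infinite variance, so rare huge stochastic gradients can destabilize unclipped SGD; clipping bounds the size of each step while preserving the update direction.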

Bibliographic reference:

Горбунов Э.А., Данилова М.Ю., Гасников А.В. Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping // Advances in Neural Information Processing Systems 33 (NeurIPS 2020) / Ed. by H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan, H. Lin. Virtual, 2020. Pp. 1–60. https://arxiv.org/pdf/2005.10785.pdf