Author(s): 

Juditsky A.B., Nazin A.V., Tsybakov A.B., Vayatis N.N.

Number of authors: 

4

Publication details

Publication type: 

Conference paper

Title: 

Generalization Error Bounds for Aggregation by Mirror Descent With Averaging

ISBN/ISSN: 

ISBN: 9780262195348

Conference name: 

  • Annual Conference on Neural Information Processing Systems (NIPS, Vancouver, B.C., Canada)

Source name: 

  • Proceedings of the Annual Conference on Neural Information Processing Systems (NIPS, Vancouver, B.C., Canada)

URL: 

http://leon.bottou.org/papers/nips-2005

City: 

  • Cambridge

Publisher: 

  • MIT Press

Year of publication: 

2006

Pages: 

603-610

Abstract

We consider the problem of constructing an aggregated estimator from a finite class of base functions that approximately minimizes a convex risk functional under an $\ell_1$ constraint. For this purpose, we propose a stochastic procedure, mirror descent, which performs gradient descent in the dual space. The generated estimates are additionally averaged in a recursive fashion with specific weights. Mirror descent algorithms have been developed in different contexts and are known to be particularly efficient in high-dimensional problems. Moreover, their implementation is naturally adapted to the online setting. The main result of the paper is an upper bound on the convergence rate for the generalization error.
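
The abstract describes the procedure only at a high level. As a purely illustrative sketch, the Python snippet below implements one standard instance of the idea: entropic mirror descent over the probability simplex (a common way to handle the $\ell_1$ constraint), with recursive averaging of the primal iterates. The function name, the softmax mirror map, the 1/sqrt(t) step sizes, and the uniform averaging weights are all assumptions made for illustration, not the specific schedule analyzed in the paper.

    import numpy as np

    def mirror_descent_with_averaging(grad, M, T, beta=1.0):
        """Illustrative sketch: entropic mirror descent with averaging.

        grad(w, t) should return a stochastic gradient of the convex risk
        at the point w on iteration t. All schedules below are placeholder
        choices, not the ones analyzed in the paper.
        """
        theta = np.zeros(M)            # dual variable (accumulated gradients)
        w_avg = np.full(M, 1.0 / M)    # running average of primal iterates
        for t in range(1, T + 1):
            # Mirror step: map the dual variable back to the simplex via
            # a softmax (entropic mirror map), enforcing the l1 constraint.
            w = np.exp(-beta * theta)
            w /= w.sum()
            # Gradient step taken in the dual space.
            theta += grad(w, t) / np.sqrt(t)   # placeholder step size
            # Recursive averaging: w_avg_t = ((t-1)*w_avg_{t-1} + w_t) / t.
            w_avg += (w - w_avg) / t
        return w_avg

    # Hypothetical usage: aggregate M = 3 base predictions h of a scalar
    # target y under squared loss, observed with noise.
    rng = np.random.default_rng(0)
    h = np.array([0.2, 0.8, 1.5])
    y = 1.0
    grad = lambda w, t: 2.0 * (w @ h - (y + 0.1 * rng.normal())) * h
    w_hat = mirror_descent_with_averaging(grad, M=3, T=5000)

Note that the averaged trajectory, not the final iterate, is returned, matching the recursive averaging step described in the abstract.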

Bibliographic citation: 

Juditsky A.B., Nazin A.V., Tsybakov A.B., Vayatis N.N. Generalization Error Bounds for Aggregation by Mirror Descent With Averaging // Proceedings of the Annual Conference on Neural Information Processing Systems (NIPS, Vancouver, B.C., Canada). Cambridge: MIT Press, 2006. P. 603-610. http://leon.bottou.org/papers/nips-2005