Author(s): 

Балашов М.В., Тремба А.А.

Number of authors: 

2

Publication details

Publication type: 

Article in a journal/collection

Title: 

Error bound conditions and convergence of optimization methods on smooth and proximally smooth manifolds

Electronic publication: 

Yes

ISBN/ISSN: 

1029-4945

DOI: 

10.1080/02331934.2020.1812066

Source: 

  • Optimization: A Journal of Mathematical Programming and Operations Research

City: 

  • London

Publisher: 

  • Taylor & Francis

Year of publication: 

2020

Pages: 

1-20

URL: 

https://www.tandfonline.com/doi/full/10.1080/02331934.2020.1812066

Abstract
We analyse the convergence of the gradient projection algorithm, which is finalized with the Newton method, to a stationary point for the problem of nonconvex constrained optimization min_{x ∈ S} f(x) with a proximally smooth set S = {x ∈ R^n : g(x) = 0}, g : R^n → R^m, and a smooth function f. We propose new error bound (EB) conditions for the gradient projection method which lead to the convergence domain of the Newton method. We prove that these EB conditions are typical for a wide class of optimization problems. It is possible to reach a high convergence rate of the algorithm by switching to the Newton method.
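
The abstract outlines a two-phase scheme: run gradient projection on the manifold S = {x : g(x) = 0} until an error-bound criterion signals that the iterate is inside the convergence domain of the Newton method, then switch. Below is a minimal numerical sketch of such a scheme, not the authors' implementation: the projection onto S is approximated by a few Gauss-Newton corrections, the switch is triggered by the norm of the projected gradient, and the step size t, the tolerances, and the example problem are illustrative choices, not values from the paper.

```python
import numpy as np

def project_onto_S(x, g, Jg, inner_iters=5):
    """Approximate metric projection onto S = {x : g(x) = 0} via Gauss-Newton corrections."""
    y = x.copy()
    for _ in range(inner_iters):
        J = Jg(y)                                        # m x n Jacobian of g at y
        y = y - J.T @ np.linalg.solve(J @ J.T, g(y))     # least-norm correction toward g(y) = 0
    return y

def gradient_projection_with_newton(x0, f_grad, g, Jg, lag_hess,
                                    t=0.1, tol_switch=1e-3, tol=1e-10, max_iter=500):
    """Phase 1: gradient projection on S.  Phase 2: Newton on the KKT system."""
    x = project_onto_S(np.asarray(x0, dtype=float), g, Jg)
    lam = np.zeros(g(x).shape)
    for _ in range(max_iter):
        grad = f_grad(x)
        J = Jg(x)
        lam = -np.linalg.solve(J @ J.T, J @ grad)        # least-squares multiplier estimate
        proj_grad = grad + J.T @ lam                     # component of grad f tangent to S
        if np.linalg.norm(proj_grad) < tol_switch:
            break                                        # assumed error-bound region: switch to Newton
        x = project_onto_S(x - t * grad, g, Jg)          # gradient projection step
    for _ in range(max_iter):
        J = Jg(x)
        F = np.concatenate([f_grad(x) + J.T @ lam, g(x)])  # KKT residual
        if np.linalg.norm(F) < tol:
            break
        H = lag_hess(x, lam)                             # Hessian in x of the Lagrangian f + lam^T g
        K = np.block([[H, J.T], [J, np.zeros((J.shape[0], J.shape[0]))]])
        step = np.linalg.solve(K, -F)                    # Newton step on (x, lam)
        x, lam = x + step[:x.size], lam + step[x.size:]
    return x, lam

# Illustrative use: minimize f(x) = c^T x on the unit sphere g(x) = |x|^2 - 1 = 0.
c = np.array([1.0, 2.0, 2.0])
x_star, lam_star = gradient_projection_with_newton(
    x0=np.array([1.0, 0.0, 0.0]),
    f_grad=lambda x: c,
    g=lambda x: np.array([x @ x - 1.0]),
    Jg=lambda x: 2.0 * x[None, :],
    lag_hess=lambda x, lam: 2.0 * lam[0] * np.eye(x.size),
)
print(x_star)   # approximately -c / |c| = (-1/3, -2/3, -2/3)
```

In the paper the switching moment comes from the proposed EB conditions; the simple projected-gradient threshold above is only a stand-in for that criterion.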

Bibliographic reference: 

Балашов М.В., Тремба А.А. Error bound conditions and convergence of optimization methods on smooth and proximally smooth manifolds // Optimization: A Journal of Mathematical Programming and Operations Research. 2020. P. 1-20. https://www.tandfonline.com/doi/full/10.1080/02331934.2020.1812066