82488

Author(s): 

Kiselev N.S., Grabovoi A.V.

Number of authors: 

2

Publication details

Publication type: 

Article in a journal/collection

Title: 

Unraveling the Hessian: A Key to Smooth Convergence in Loss Function Landscapes

ISBN/ISSN: 

1064-5624

DOI: 

10.1134/S1064562424601987

Source: 

  • Doklady Mathematics

Volume: 

Vol. 110, No. S1

City: 

  • New York

Publisher: 

  • Pleiades Publishing Ltd

Year of publication: 

2024

Pages: 

S49-S61
Abstract
The loss landscape of neural networks is a critical aspect of their training, and understanding its properties is essential for improving their performance. In this paper, we investigate how the loss surface changes when the sample size increases, a previously unexplored issue. We theoretically analyze the convergence of the loss landscape in a fully connected neural network and derive upper bounds for the difference in loss function values when adding a new object to the sample. Our empirical study confirms these results on various datasets, demonstrating the convergence of the loss function surface for image classification tasks. Our findings provide insights into the local geometry of neural loss landscapes and have implications for the development of sample size determination techniques.
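The abstract's central claim, that the loss surface changes less and less as the sample grows, can be sketched numerically. The snippet below is a minimal illustration only: it uses a fixed linear model with a mean-squared-error loss and synthetic data, not the paper's fully connected networks or image datasets. For an empirical mean loss, adding one object changes the value by (l_{n+1} - L_n)/(n+1), so the difference shrinks on the order of 1/n when per-object losses are bounded.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: synthetic regression data and an arbitrary fixed
# parameter vector w (the point on the loss surface we evaluate).
X = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=1000)
w = rng.normal(size=5)

def mean_loss(n: int) -> float:
    """Empirical mean squared error over the first n objects."""
    r = X[:n] @ w - y[:n]
    return float(np.mean(r ** 2))

# Difference in loss value when one object is added to a sample of size n.
# Since L_{n+1} - L_n = (l_{n+1} - L_n) / (n + 1), it decays like O(1/n).
diffs = [abs(mean_loss(n + 1) - mean_loss(n)) for n in range(10, 999)]
```

Here the parameter point is held fixed; the paper's bounds concern the whole landscape, but the same 1/(n+1) averaging mechanism drives the pointwise convergence illustrated above.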

Bibliographic citation: 

Kiselev N.S., Grabovoi A.V. Unraveling the Hessian: A Key to Smooth Convergence in Loss Function Landscapes // Doklady Mathematics. New York: Pleiades Publishing Ltd, 2024. Vol. 110, No. S1. P. S49-S61.