Momentum first-order optimization methods are workhorses for a wide range of optimization
tasks, e.g., the training of deep neural networks. Recently, Lucas et al. (2019) proposed a method called Aggregated Heavy-Ball (AggHB) that uses multiple momentum vectors
corresponding to different momentum parameters and averages these vectors to compute the
update direction at each iteration. Lucas et al. (2019) show that AggHB is more stable than
the classical Heavy-Ball method even with large momentum parameters and performs well in
practice. However, the method was analyzed only for quadratic objectives and for online optimization tasks under the assumption of uniformly bounded gradients, which does not hold for many
practically important problems. In this work, we address this issue and propose the first analysis of AggHB for smooth objective functions in non-convex, convex, and strongly convex cases
without additional restrictive assumptions. Our complexity results match the best-known ones
for the Heavy-Ball method. We also illustrate the efficiency of AggHB numerically on several
non-convex and convex problems.
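
For concreteness, the aggregated update can be sketched as follows; the notation here (momentum parameters $\beta_1,\dots,\beta_K$, stepsize $\gamma$, iterates $x^k$, and momentum vectors $v_i^k$) is illustrative, following Lucas et al. (2019), and is not fixed by the abstract itself:
\[
v_i^{k+1} = \beta_i v_i^k - \nabla f(x^k), \quad i = 1,\dots,K, \qquad
x^{k+1} = x^k + \frac{\gamma}{K}\sum_{i=1}^{K} v_i^{k+1}.
\]
With $K = 1$ this reduces to the classical Heavy-Ball update, while averaging over several $\beta_i$ yields the aggregated direction described above.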