Journal of Applied Mathematics and Physics

Volume 12, Issue 4 (April 2024)

ISSN Print: 2327-4352   ISSN Online: 2327-4379

Google-based Impact Factor: 1.00

Almost Sure Convergence of Proximal Stochastic Accelerated Gradient Methods

PP. 1321-1336
DOI: 10.4236/jamp.2024.124081
Author(s): X. Xiang and H. Xia

ABSTRACT

Proximal gradient descent and its accelerated version are effective methods for minimizing the sum of a smooth and a non-smooth function. When the smooth function can be written as a sum of multiple component functions, the stochastic proximal gradient method performs well, but the convergence behavior of its accelerated version remains poorly understood. This paper proposes a proximal stochastic accelerated gradient (PSAG) method for problems involving a combination of smooth and non-smooth components, where the smooth part is the average of multiple block sums. Moreover, most existing convergence analyses hold only in expectation. To address this, under some mild conditions, we establish almost sure convergence with unbiased gradient estimation in the non-smooth setting. In particular, we show that the minimum of the squared gradient-mapping norm converges to zero with probability one.
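For context, the gradient mapping referenced above is standardly defined, for a step size $\lambda > 0$, by

$$ G_\lambda(x) = \frac{1}{\lambda}\Big( x - \mathrm{prox}_{\lambda g}\big( x - \lambda \nabla f(x) \big) \Big), $$

so the stated result reads $\min_{0 \le k \le K} \| G_\lambda(x_k) \|^2 \to 0$ almost surely as $K \to \infty$. Below is a minimal, hypothetical Python sketch of a generic proximal stochastic accelerated gradient loop of this kind; the authors' exact step sizes, momentum schedule, and assumptions are in the full paper, so the function names (psag_sketch, soft_threshold), the fixed momentum parameter, and the lasso-style test objective are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding), a standard
    # example of a cheap prox for the non-smooth part g.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def psag_sketch(grads, prox, x0, step=1e-2, momentum=0.9, iters=5000, seed=0):
    # Hypothetical loop for minimizing (1/n) sum_i f_i(x) + g(x).
    # grads[i](x) returns grad f_i(x); prox(v, t) returns prox_{t*g}(v).
    rng = np.random.default_rng(seed)
    n = len(grads)
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        # Nesterov-style extrapolation point.
        y = x + momentum * (x - x_prev)
        # Unbiased gradient estimate: one block sampled uniformly.
        i = rng.integers(n)
        g_hat = grads[i](y)
        # Proximal step on the non-smooth part.
        x_prev, x = x, prox(y - step * g_hat, step)
    return x

if __name__ == "__main__":
    # Toy lasso-style instance: f_i(x) = 0.5*(a_i @ x - b_i)^2, g = 0.1*||x||_1.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 10))
    x_true = rng.standard_normal(10) * (rng.random(10) > 0.5)
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    grads = [lambda x, a=A[i], bi=b[i]: (a @ x - bi) * a for i in range(50)]
    prox = lambda v, t: soft_threshold(v, 0.1 * t)
    print(np.round(psag_sketch(grads, prox, np.zeros(10)), 3))
```

With a constant step size and momentum as above, convergence is not guaranteed in general; the sketch only illustrates the structure of the update: extrapolate, sample one block for an unbiased gradient estimate, then apply the proximal operator of the non-smooth part.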

Share and Cite:

Xiang, X. and Xia, H. (2024) Almost Sure Convergence of Proximal Stochastic Accelerated Gradient Methods. Journal of Applied Mathematics and Physics, 12, 1321-1336. doi: 10.4236/jamp.2024.124081.


Copyright © 2025 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.