A review of nonconvex stochastic subgradient descent
Pascal Bianchi  1  
1 : Télécom Paris

The aim of stochastic gradient descent (SGD) and its variants is to approximate a local minimizer of an unknown function, which is revealed along the iterations. This talk reviews convergence results in the case where the function is nonconvex and nondifferentiable. This includes almost-sure convergence, fluctuations, and avoidance of spurious critical points.
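A minimal sketch (not from the talk) may help fix ideas: stochastic subgradient descent on f(x) = |x² − 1|, a function that is nonconvex and nondifferentiable at x = ±1. The objective, step-size schedule, and noise model are illustrative assumptions. Starting at x = 0, a spurious critical point (a local maximum of f), the injected noise lets the iterates escape and approach a minimizer at ±1.

```python
import math
import random

def f(x):
    # nonconvex objective, nondifferentiable at x = +1 and x = -1
    return abs(x * x - 1.0)

def subgrad(x):
    # an element of the Clarke subdifferential of f at x
    s = x * x - 1.0
    if s > 0:
        return 2.0 * x
    if s < 0:
        return -2.0 * x
    return 0.0  # at the kinks, any value in the subdifferential is valid

def stochastic_subgradient_descent(x0, n_iter=50000, seed=0):
    rng = random.Random(seed)
    x = x0
    for k in range(n_iter):
        gamma = 0.1 / math.sqrt(k + 1)          # vanishing step sizes
        g = subgrad(x) + rng.gauss(0.0, 0.1)    # noisy subgradient oracle
        x -= gamma * g
    return x

# x = 0 is a spurious critical point of f (a local maximum);
# the noise drives the iterates away from it toward a minimizer.
x_final = stochastic_subgradient_descent(x0=0.0)
```

Under standard assumptions (vanishing step sizes, bounded noise), the iterates converge almost surely to the critical set of f, and the noise prevents them from stalling at spurious critical points such as x = 0 here.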

