Stochastic optimization plays an important role in solving many problems encountered in machine learning and adaptive processing. In this context, the second-order statistics of the data are often unknown a priori, or their direct computation is too intensive, and they have to be estimated online from the related signals. In the context of batch optimization of an objective function that is the sum of a data fidelity term and a penalization (e.g., a sparsity-promoting function), Majorize-Minimize (MM) subspace methods have recently attracted much interest since they are fast, highly flexible, and effective in ensuring convergence. The goal of this paper is to show how these methods can be successfully extended to the case when the cost function is replaced by a sequence of stochastic approximations of it. Simulation results illustrate the good practical performance of the proposed MM Memory Gradient (3MG) algorithm when applied to 2D filter identification.
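To make the idea concrete, here is a minimal sketch (not the authors' exact algorithm) of a stochastic MM memory-gradient iteration for a least-squares filter-identification problem. It assumes a plain quadratic data-fidelity term with no penalization: at each step, the quadratic majorant built from a mini-batch is minimized within the two-dimensional subspace spanned by the current stochastic gradient and the previous update direction (the "memory"). All variable names (`stochastic_3mg`, `d_prev`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic filter-identification data: y = X @ h_true + noise
n, p = 2000, 16
X = rng.standard_normal((n, p))
h_true = rng.standard_normal(p)
y = X @ h_true + 0.01 * rng.standard_normal(n)

def stochastic_3mg(X, y, n_iter=200, batch=64):
    """Sketch of a stochastic MM memory-gradient (3MG-style) iteration.

    Each step minimizes the mini-batch quadratic majorant of the
    least-squares cost within the subspace spanned by the current
    stochastic gradient and the previous update direction.
    """
    n, p = X.shape
    h = np.zeros(p)
    d_prev = np.zeros(p)                      # memory: previous step h_k - h_{k-1}
    for _ in range(n_iter):
        idx = rng.choice(n, size=batch, replace=False)
        Xb, yb = X[idx], y[idx]
        g = Xb.T @ (Xb @ h - yb) / batch      # stochastic gradient estimate
        D = np.stack([-g, d_prev], axis=1)    # 2D search subspace
        B = Xb.T @ (Xb @ D) / batch           # majorant curvature applied to D
        # Minimize the quadratic majorant over the subspace coefficients u:
        #   (D^T A D) u = -D^T g   (pseudo-inverse handles a rank-deficient D)
        u = np.linalg.lstsq(D.T @ B, -D.T @ g, rcond=None)[0]
        step = D @ u
        h = h + step
        d_prev = step
    return h

h_est = stochastic_3mg(X, y)
print(np.linalg.norm(h_est - h_true))
```

The key design point this illustrates is that the subspace reduces each MM update to solving a tiny (here 2x2) linear system, while the memory direction gives conjugate-gradient-like acceleration.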
from HAL : Dernières publications http://ift.tt/1sXi9MX