Welcome to the homepage of the Mathematical Optimization Group

Department of Mathematics, University of Tübingen, Germany

Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms

M.C. Mukkamala and P. Ochs

Abstract:
Matrix factorization is a popular non-convex objective, for which alternating minimization schemes are predominantly used. These schemes usually suffer from the major drawback that the solution is biased towards one of the optimization variables. A remedy is provided by non-alternating schemes. However, due to the lack of Lipschitz continuity of the gradient in matrix factorization problems, convergence cannot be guaranteed. A recently developed remedy relies on the concept of Bregman distances, which generalize the standard Euclidean distance. We exploit this theory by proposing a novel Bregman distance for matrix factorization problems that, at the same time, allows for simple, closed-form update steps. Consequently, for non-alternating schemes such as the recently introduced Bregman Proximal Gradient (BPG) method and its inertial variant, Convex-Concave Inertial BPG (CoCaIn BPG), convergence of the whole sequence to a stationary point is proved for matrix factorization. In several experiments, we observe superior performance of our non-alternating schemes in terms of speed and objective value at the limit point.
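As an illustration of the closed-form, non-alternating updates the abstract refers to, here is a minimal numerical sketch of a joint Bregman proximal gradient step for f(U, V) = 0.5 ||U Vᵀ − A||². It assumes a quartic-plus-quadratic kernel of the form h(U, V) = (c1/4)(||U||² + ||V||²)² + (c2/2)(||U||² + ||V||²); the constants c1, c2 and the step parameter L below are illustrative assumptions, not necessarily the paper's exact choices.

```python
import numpy as np

def bpg_step(U, V, A, c1, c2, L=1.0):
    """One joint (non-alternating) BPG update for f(U, V) = 0.5 ||U V^T - A||_F^2.

    Kernel (assumed): h(X) = (c1/4) ||X||^4 + (c2/2) ||X||^2 with X = (U, V),
    so grad h(X) = (c1 ||X||^2 + c2) X.
    """
    R = U @ V.T - A                      # residual
    gU, gV = R @ V, R.T @ U              # partial gradients of f
    n2 = np.sum(U * U) + np.sum(V * V)   # ||(U, V)||^2
    # Mirror point p = grad h(U, V) - (1/L) grad f(U, V)
    pU = (c1 * n2 + c2) * U - gU / L
    pV = (c1 * n2 + c2) * V - gV / L
    pn2 = np.sum(pU * pU) + np.sum(pV * pV)
    # Solving grad h(X+) = p gives X+ = s * p, where s > 0 is the unique
    # positive root of the cubic c1 * pn2 * s^3 + c2 * s - 1 = 0; this is the
    # kind of simple, closed-form step the abstract mentions.
    roots = np.roots([c1 * pn2, 0.0, c2, -1.0])
    real = roots.real[np.abs(roots.imag) < 1e-8]
    s = real[real > 0][0]
    return s * pU, s * pV

# Toy run on an exactly rank-2 target (illustrative setup).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 6))
U = 0.1 * rng.standard_normal((6, 2))
V = 0.1 * rng.standard_normal((6, 2))
c1, c2 = 3.0, np.linalg.norm(A)          # assumed relative-smoothness constants

obj = lambda U, V: 0.5 * np.linalg.norm(U @ V.T - A) ** 2
start = obj(U, V)
for _ in range(500):
    U, V = bpg_step(U, V, A, c1, c2)
print(start, obj(U, V))                  # the objective decreases along iterates
```

With a valid relative-smoothness constant, each such step is a descent step on the objective, so the whole sequence decreases monotonically; the inertial variant (CoCaIn BPG) additionally extrapolates between iterates.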
Latest update: 23.05.2019
Citation:
M.C. Mukkamala, P. Ochs:
Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms.
Conference on Neural Information Processing Systems (NeurIPS), 2019.
Bibtex:
@inproceedings{MO19b,
  title        = {Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms},
  author       = {M.C. Mukkamala and P. Ochs},
  year         = {2019},
  booktitle    = {Conference on Neural Information Processing Systems (NeurIPS)},
}


MOP Group
©2017-2022
The author is not responsible for the content of external pages.