Disjunctive Branch-and-Bound for Certifiably Optimal Low-Rank Matrix Completion


arXiv:2305.12292v4
Abstract: Low-rank matrix completion consists of computing a matrix of minimal complexity that recovers a given set of observations as accurately as possible. Unfortunately, existing methods for matrix completion are heuristics that, while highly scalable and often identifying high-quality solutions, do not provide an instance-wise certificate of optimality. We reexamine matrix completion with an optimality-oriented eye. We reformulate low-rank matrix completion problems as convex problems over the non-convex set of projection matrices and implement a disjunctive branch-and-bound scheme that solves them to certifiable optimality. Further, we derive a novel and often near-exact class of convex relaxations by decomposing a low-rank matrix as a sum of rank-one matrices and incentivizing that two-by-two minors in each rank-one matrix have determinant zero. In numerical experiments, our new convex relaxations decrease the optimality gap by two orders of magnitude compared to existing attempts, and our disjunctive branch-and-bound scheme solves $n \times m$ rank-$k$ matrix completion problems to certifiable optimality or near optimality in hours for $\max \{m, n\} \leq 2500$ and $k \leq 5$. Moreover, this reduction in the training error translates into an average $2\%$–$50\%$ reduction in the test set error compared with alternating minimization-based methods.
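The relaxation described above rests on a classical linear-algebra fact: a matrix has rank at most one exactly when every two-by-two minor has determinant zero. The sketch below (a minimal numerical illustration, not the paper's algorithm or code) verifies this on a random rank-one matrix built with NumPy's `outer`, and contrasts it with a rank-two perturbation; the helper `max_abs_2x2_minor` is a name introduced here for illustration.

```python
import numpy as np

def max_abs_2x2_minor(X):
    """Largest |det| over all 2x2 minors of X (brute force, for illustration)."""
    n, m = X.shape
    best = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(m):
                for l in range(k + 1, m):
                    det = X[i, k] * X[j, l] - X[i, l] * X[j, k]
                    best = max(best, abs(det))
    return best

rng = np.random.default_rng(0)
u, v = rng.standard_normal(4), rng.standard_normal(5)
rank_one = np.outer(u, v)  # rank-1: every 2x2 minor vanishes exactly
rank_two = rank_one + np.outer(rng.standard_normal(4), rng.standard_normal(5))

print(max_abs_2x2_minor(rank_one))  # ~0 (up to floating-point rounding)
print(max_abs_2x2_minor(rank_two))  # positive: rank exceeds one
```

In this light, the paper's relaxation can be read as writing the unknown matrix as a sum of $k$ rank-one terms and penalizing the two-by-two minors of each term, rather than constraining the rank of the sum directly.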
