Robust Multiple Model Fitting with Preference Analysis and Low-rank Approximation

L. Magri, A. Fusiello


Overview

This paper tackles the problem of geometric multi-model fitting, which aims at extracting parametric models from unstructured data in order to organize and aggregate visual content into suitable higher-level geometric structures.

Method

The method we present reduces the multi-model fitting task to many easier single robust model estimation problems by combining preference analysis and robust low-rank approximation. Three main steps can be singled out in our approach.

At first, data points are shifted to a conceptual space, where they are framed as a preference matrix. Our conceptual representation makes use of an M-estimator to model the preferences of points; in this way a first protection against outliers is achieved. The preference space is then equipped with a kernel based on the Tanimoto distance, from which an affinity matrix K, measuring the agreement between the preferences of points, is derived.
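The first step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Cauchy weight stands in for the M-estimator (the exact rho function is an assumption), and the affinity is one minus the Tanimoto distance between preference vectors.

```python
import numpy as np

def preference_matrix(residuals, sigma):
    """Map point-to-model residuals to soft preferences in (0, 1] with an
    M-estimator-style weight. A Cauchy weight is used here for illustration;
    the paper's exact choice may differ."""
    return 1.0 / (1.0 + (residuals / sigma) ** 2)

def tanimoto_affinity(P):
    """Affinity K[i, j] = <p_i, p_j> / (|p_i|^2 + |p_j|^2 - <p_i, p_j>),
    i.e. one minus the Tanimoto distance between rows of the preference
    matrix P (points x sampled models)."""
    G = P @ P.T                           # inner products of preference vectors
    sq = np.diag(G)                       # squared norms of the rows
    denom = sq[:, None] + sq[None, :] - G
    # two all-zero preference vectors are conventionally given affinity 1
    return np.divide(G, denom, out=np.ones_like(G), where=denom > 0)
```

The resulting K is symmetric, has unit diagonal, and takes values in [0, 1], so it can serve directly as the affinity matrix of the next step.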
The second step is devoted to robustly segmenting points by exploiting the information encapsulated in K. This stage can be thought of as a sort of "robust spectral clustering". It is well known that spectral clustering produces accurate segmentations in two steps: first data are projected onto the space of the first eigenvectors of the Laplacian matrix, and then k-means is applied. The shortcoming of spectral clustering, however, is that it is not robust to outliers. We propose to follow the same scheme, enforcing robustness by exploiting the low-rank nature of the problem. We decompose the affinity matrix as K = UU' + S.
The matrix S models the sparse preferences expressed by outliers, and is obtained by applying Robust PCA, which replaces the eigen-decomposition step of spectral clustering. The low-rank part of K, representing the similarity between inliers, is then decomposed as UU' by taking advantage of Symmetric NMF, which plays the role of k-means. The obtained matrix U represents a soft segmentation of the data in which outliers are underweighted.
Finally, models are extracted by inspecting the product of the preference matrix with a thresholded U, mimicking the MSAC strategy. The use of robust statistics to adaptively estimate the inlier threshold constitutes a third guard against the presence of outliers.
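The adaptive inlier threshold of this last step can be sketched as below. The normalized MAD is used here as the robust scale estimate and the cutoff factor kappa is an illustrative default; the paper's exact robust estimator may differ.

```python
import numpy as np

def adaptive_inlier_threshold(residuals, kappa=2.5):
    """Estimate an inlier threshold directly from the residuals using a
    robust location/scale pair (median and normalized MAD). Both the
    estimator and the cutoff kappa are illustrative assumptions."""
    med = np.median(residuals)
    scale = 1.4826 * np.median(np.abs(residuals - med))
    return med + kappa * scale

def extract_inliers(residuals, kappa=2.5):
    """MSAC-style selection: keep the points whose residual falls below
    the adaptively estimated threshold."""
    return residuals < adaptive_inlier_threshold(residuals, kappa)
```

Because the scale is estimated robustly, a moderate fraction of gross residuals inflates the threshold only marginally, which is what shields the final model extraction from the remaining outliers.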


Code

MATLAB code available: RPA code
Required third-party code: Inexact Augmented Lagrange Multiplier (ALM) Method, Symmetric Nonnegative Matrix Factorization

Reference paper