IBCO (short for Iterative Bi-Convex Optimization) is an efficient and deterministic consensus maximization method.
Published at ECCV 2018 as an oral presentation.
[Paper] [Oral Slides]
Consensus maximization is an effective tool for robust fitting in computer vision. This repository contains the demo for IBCO and other prevalent consensus maximization methods (see the list below).
If you want to try these methods on your own problem, some personal suggestions are provided at the end of the method list :)
The demo is free for non-commercial academic use. Any commercial use is strictly prohibited without the authors' consent. Please acknowledge the authors by citing:
@inproceedings{cai2018deterministic,
title={Deterministic Consensus Maximization with Biconvex Programming},
author={Cai, Zhipeng and Chin, Tat-Jun and Le, Huu and Suter, David},
booktitle={European Conference on Computer Vision},
pages={699--714},
year={2018},
organization={Springer}
}
in any academic publications that have made use of this package or part of it.
Homepage: https://zhipengcai.github.io/
Email: [email protected]
Do not hesitate to contact the authors if you have any questions or find any bugs :)
Random methods:
Deterministic methods:
- The ADMM-based method (to be added)
(Personal) tips for choosing a method:
- The runtime of random methods grows exponentially with 1) the outlier ratio and 2) the dimension of the problem. If your problem has a moderate outlier ratio (say < 50%) and a low dimension (say within 10), random methods are very efficient and usually provide reasonable results. Try methods 4~6 in this case.
- The runtime of deterministic methods is polynomial in the size of the input data. If your data size is not extremely large, deterministic methods are fast even at high outlier ratios. Try SS (method 8) if you want the highest efficiency. IBCO (method 10) usually returns the best solution among all deterministic methods.
- Using random methods to initialize deterministic methods is an effective strategy. Deterministic methods are sensitive to initialization but usually outperform random methods given a reasonable initial solution, while random methods can return a reasonable solution regardless of initialization. Try Fixing LO-RANSAC + IBCO. IBCO is highly suitable for refining the solutions of other local methods; it is usually capable of increasing the consensus size of RANSAC variants by more than 10%.
- In practice, if you want high efficiency on problems with high outlier ratios or high dimensions, you can terminate the random method early and pass its current best solution to IBCO.
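The random-then-deterministic strategy above can be sketched on a toy 2D line-fitting problem. Everything below (the data, the thresholds, and the refinement step) is made up for illustration; in particular, the greedy least-squares refinement merely stands in for IBCO, which solves a biconvex program instead:

```python
import random

# Toy data (made up for illustration): points near y = 2x + 1 plus gross outliers.
random.seed(0)
points = [(x, 2.0 * x + 1.0 + random.uniform(-0.05, 0.05)) for x in range(50)]
points += [(random.uniform(0, 50), random.uniform(-100.0, 100.0)) for _ in range(30)]
EPS = 0.1  # inlier threshold on the vertical residual

def consensus(a, b):
    """Consensus size (number of inliers) of the line y = a*x + b."""
    return sum(1 for x, y in points if abs(a * x + b - y) <= EPS)

def random_stage(iters=20):
    """Early-terminated random search: fit lines to random point pairs."""
    best = (0, 0.0, 0.0)
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        if x1 == x2:
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        best = max(best, (consensus(a, b), a, b))
    return best

def deterministic_refine(a, b, rounds=10):
    """Greedy local refinement: refit by least squares on the current inliers
    and keep the refit only if it enlarges the consensus. A placeholder for
    IBCO, whose refinement is a biconvex program instead."""
    best_c = consensus(a, b)
    for _ in range(rounds):
        ins = [(x, y) for x, y in points if abs(a * x + b - y) <= EPS]
        n = len(ins)
        sx = sum(x for x, _ in ins)
        sy = sum(y for _, y in ins)
        sxx = sum(x * x for x, _ in ins)
        sxy = sum(x * y for x, y in ins)
        denom = n * sxx - sx * sx
        if denom == 0:
            break
        a2 = (n * sxy - sx * sy) / denom
        b2 = (sy - a2 * sx) / n
        c2 = consensus(a2, b2)
        if c2 <= best_c:
            break
        a, b, best_c = a2, b2, c2
    return best_c, a, b

c_random, a, b = random_stage()
c_refined, a, b = deterministic_refine(a, b)
```

By construction the refinement stage never returns a smaller consensus than its initialization, which mirrors why feeding a random method's best-so-far solution to a deterministic refiner is safe.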
Linear:
- Linear regression
Nonlinear:
- Homography estimation
- Triangulation (to be added)
With non-convex constraints:
- Fundamental matrix estimation (the rank-2 constraint)
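For all of these problem classes, the objective being maximized is the same: the consensus size of a model, i.e. the number of points whose residual falls within a threshold. A minimal sketch for the linear case (the data and threshold here are made up for illustration):

```python
def consensus_size(theta, data, eps):
    """Number of points whose absolute residual under the linear model
    y ~ theta[0] * x + theta[1] is at most eps."""
    a, b = theta
    return sum(1 for x, y in data if abs(a * x + b - y) <= eps)

# Four points close to y = 2x + 1 and one gross outlier.
data = [(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.0), (4, 20.0)]
print(consensus_size((2.0, 1.0), data, eps=0.2))  # → 4
```

Nonlinear problems only change how the residual is computed; the counting objective is unchanged.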
IBCO bisects over all possible consensus sizes and returns the largest one for which the decision version of the consensus maximization problem (referred to as the "decision problem" from here on) is feasible.
The major contribution of IBCO is a "lifted" decision problem, which is equivalent to the original one but bi-convex. In this lifted form, each decision problem can be solved efficiently by standard bi-convex optimization.
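The outer bisection loop can be sketched as follows. Here the biconvex solve of the lifted decision problem is replaced by a toy feasibility oracle (for one fixed model, k points can be covered iff the k-th smallest residual is within ε); in IBCO the oracle is the biconvex program and the model itself is optimized, but the monotone bisection structure is the same:

```python
def max_consensus_by_bisection(feasible, n):
    """Largest k in [0, n] with feasible(k) True, assuming monotonicity:
    if k points can be covered, so can any smaller number."""
    lo, hi = 0, n  # invariant: feasible(lo) holds, everything above hi fails
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if feasible(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo

# Toy oracle: sorted residuals of the data under one fixed model.
residuals = sorted([0.01, 0.03, 0.05, 0.08, 0.5, 1.2, 3.0])
EPS = 0.1
feasible = lambda k: k == 0 or residuals[k - 1] <= EPS
print(max_consensus_by_bisection(feasible, len(residuals)))  # → 4
```

Bisection calls the oracle only O(log n) times, which is why the per-call cost of the decision problem dominates the overall runtime.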
The demo has been tested on Linux (Ubuntu 14.04 LTS, 64-bit). Except for USAC, whose source code is in C++, all methods are implemented in MATLAB 2017a.
Because of the language difference, USAC is currently not included in the main demo, but its source code is provided.
- Clone this repository.
- Run the function "demo()" in MATLAB.
Please refer to the "demo.m" file for detailed code explanations.
This demo uses 'sedumi' as the default linear programming solver, which is used by EP and SS. In our experiments we used Gurobi instead, which is faster than 'sedumi'.
If you have a valid license, you can follow the instructions in "demo.m" to switch from 'sedumi' to Gurobi.