Learning better ways to solve systems of linear equations

eigen least-squares linear-systems normal-equations optimization

I’ve been pulled into a numerical methods vortex for a couple of months now – which may or may not turn out to have been a good thing – and I am learning a lot. In a post on N-view triangulation, there’s a linear least-squares minimization problem that I suggested solving with the normal equations:

This is starting to look like a least-squares problem! OK, rearranging,

\[\mathbf{A}_{1:3} \begin{bmatrix} X_w\\ Y_w\\ Z_w \end{bmatrix} = -\mathbf{A}_{4}\]
\[\begin{bmatrix} X_w\\ Y_w\\ Z_w \end{bmatrix} = -(\mathbf{A}_{1:3}^T\mathbf{A}_{1:3})^{-1}\mathbf{A}_{1:3}^T\mathbf{A}_{4}\]
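The back-substitution above can be sketched directly in code. This is a minimal numpy translation (not the triangulation code from the original post): the matrix `A` here is a hypothetical stand-in for the stacked triangulation constraints, with its first three columns multiplying the world point \((X_w, Y_w, Z_w)\) and the fourth column the constant part.

```python
import numpy as np

# Hypothetical stand-in for the triangulation system: an (n x 4) matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))

A13 = A[:, :3]   # A_{1:3}, the first three columns
A4 = A[:, 3]     # A_4, the fourth column

# Normal-equations solution: x = -(A13^T A13)^{-1} A13^T A4.
# np.linalg.solve avoids forming the explicit inverse.
x = -np.linalg.solve(A13.T @ A13, A13.T @ A4)

# Cross-check against an SVD-based least-squares solve of A13 x = -A4.
x_svd, *_ = np.linalg.lstsq(A13, -A4, rcond=None)
assert np.allclose(x, x_svd)
```

On a well-conditioned system like this random one, the two routes agree; the difference only shows up when the system is nearly rank-deficient, as discussed below.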

I didn’t even know the term normal equations; the \(x^{*} = (A^TA)^{-1}A^Tb\) formulation was simply how solving linear least-squares minimization problems of the form \(Ax=b\) was presented in my optimization class (where we solved these problems by hand during tests). I’ve since found out, not today but over time, that the normal-equations route is the least numerically stable; you’re better off using a decomposition-based method (SVD, for example). This is explained in Appendix 5 of Hartley and Zisserman [MVG2004].
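The instability comes from squaring the condition number: \(\operatorname{cond}(A^TA) = \operatorname{cond}(A)^2\), so forming \(A^TA\) roughly doubles the number of digits lost. A small numpy sketch using the classic Läuchli-style example (my illustration, not from the original post):

```python
import numpy as np

# A nearly rank-deficient Lauchli-style matrix; the exact least-squares
# solution of A x = b is (1, 1), since A @ (1, 1) = b exactly.
eps = 1e-7
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])
b = np.array([2.0, eps, eps])

# Forming A^T A squares the condition number, so the normal equations
# typically lose about twice as many digits as an SVD-based solve.
assert np.linalg.cond(A.T @ A) > np.linalg.cond(A)

x_ne = np.linalg.solve(A.T @ A, A.T @ b)       # normal equations
x_svd, *_ = np.linalg.lstsq(A, b, rcond=None)  # SVD-based

# The SVD route recovers (1, 1) to near machine precision; the
# normal-equations answer is typically off in the first few digits.
assert np.allclose(x_svd, [1.0, 1.0], atol=1e-6)
```

Shrink `eps` toward \(10^{-8}\) and `A.T @ A` becomes exactly singular in double precision, while the SVD-based solver still works.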

When it comes to getting code to work, the Eigen docs have a really good page on solving linear least-squares problems, with the pros and cons of the different methods/implementations. For example:

The three methods discussed on this page are the SVD decomposition, the QR decomposition and normal equations. Of these, the SVD decomposition is generally the most accurate but the slowest, normal equations is the fastest but least accurate, and the QR decomposition is in between.
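For reference, here are the three approaches from that page side by side, sketched in numpy rather than Eigen's C++ API, on a hypothetical well-conditioned overdetermined system:

```python
import numpy as np

# A hypothetical overdetermined system A x = b.
rng = np.random.default_rng(42)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

# 1. SVD (generally the most accurate, but the slowest).
#    numpy's lstsq is SVD-based.
x_svd, *_ = np.linalg.lstsq(A, b, rcond=None)

# 2. QR decomposition (the middle ground): A = QR, then R x = Q^T b,
#    where R is upper triangular.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# 3. Normal equations (fastest, least accurate): A^T A x = A^T b.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# On a well-conditioned problem, all three agree to high precision.
assert np.allclose(x_svd, x_qr) and np.allclose(x_svd, x_ne)
```

The speed/accuracy trade-off only becomes visible as the problem becomes ill-conditioned; for small, well-behaved systems any of the three is fine.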


[MVG2004] Richard Hartley and Andrew Zisserman, Multiple View Geometry in Computer Vision, 2nd edition. 2004. Cambridge University Press, ISBN: 0521540518. link.

© Amy Tabb 2018 - 2023. All rights reserved. The contents of this site reflect my personal perspectives and not those of any other entity.