QR Decomposition

Suppose we have a problem that can be modeled by the system of equations Ax = b with a matrix A and a vector b. We have already shown how to use Gaussian elimination to solve such systems, but I would like to introduce you to another method: QR decomposition. In addition to solving the general system of equations, this method is used in a number of other algorithms, including linear regression analysis (to solve the least squares problem) and algorithms for finding the eigenvalues of a matrix.
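
To see why the factorization is useful for Ax = b: once A = QR, the system becomes QRx = b, and multiplying both sides by Qᵀ gives Rx = Qᵀb. Because R is triangular, that system is solved by simple back substitution; when m > n, the same formula x = R⁻¹Qᵀb yields the least squares solution.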

Generally, when we see a system of equations, the best way to proceed in solving it is Gaussian elimination or LU decomposition. However, there are some matrices, in particular ill-conditioned ones, for which these methods can lead to rounding errors: each elimination step may divide by a small pivot, which magnifies any error already present. In such cases we need a more numerically stable algorithm, and this is where QR decomposition becomes important: it is built from orthogonal transformations, which preserve lengths and therefore do not amplify errors.

Any matrix A of dimension m × n with m ≥ n can be represented as the product of two matrices: an m × m orthogonal matrix Q, i.e. QᵀQ = I, and an m × n matrix of the form [R 0]ᵀ, where R is n × n and upper triangular. We perform this decomposition by first converting the columns of the matrix A into vectors a_1, …, a_n. Then, for each vector a_k, we calculate the vectors u_k and e_k given by

u_k = a_k − Σ_{j=1}^{k−1} proj_{e_j} a_k,   where proj_e a = ⟨e, a⟩ e is the projection of a onto the unit vector e,
and
e_k = u_k / ||u_k||
Then Q = [e_1, …, e_n] (this Gram-Schmidt construction produces the reduced factorization, in which Q is m × n with orthonormal columns and R is n × n) and, writing out the case n = 3,

R =
[ ⟨e_1, a_1⟩   ⟨e_1, a_2⟩   ⟨e_1, a_3⟩ ]
[     0        ⟨e_2, a_2⟩   ⟨e_2, a_3⟩ ]
[     0            0        ⟨e_3, a_3⟩ ]
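
To make the procedure above concrete, here is a minimal sketch of classical Gram-Schmidt in TypeScript (chosen to stay close to the JavaScript program linked below). The helper names (dot, scale, subtract, qrDecompose) are my own, not taken from that program, and the code assumes the columns of A are linearly independent.

```typescript
// Classical Gram-Schmidt QR, following the formulas above.

function dot(x: number[], y: number[]): number {
  return x.reduce((s, xi, i) => s + xi * y[i], 0);
}

function scale(x: number[], c: number): number[] {
  return x.map(xi => xi * c);
}

function subtract(x: number[], y: number[]): number[] {
  return x.map((xi, i) => xi - y[i]);
}

// cols holds the columns a_1..a_n of A. Returns Q as an array of
// orthonormal columns e_1..e_n and R with R[i][j] = <e_i, a_j> for i <= j.
// Assumes the columns are linearly independent, so ||u_k|| > 0.
function qrDecompose(cols: number[][]): { Q: number[][]; R: number[][] } {
  const n = cols.length;
  const Q: number[][] = [];
  const R: number[][] = Array.from({ length: n }, () => new Array(n).fill(0));
  for (let k = 0; k < n; k++) {
    let u = cols[k].slice();                   // u_k starts as a_k
    for (let j = 0; j < k; j++) {
      R[j][k] = dot(Q[j], cols[k]);            // <e_j, a_k>
      u = subtract(u, scale(Q[j], R[j][k]));   // remove proj_{e_j} a_k
    }
    R[k][k] = Math.sqrt(dot(u, u));            // ||u_k||
    Q.push(scale(u, 1 / R[k][k]));             // e_k = u_k / ||u_k||
  }
  return { Q, R };
}
```

One caveat worth knowing: in floating point, this classical form of Gram-Schmidt can itself lose orthogonality, so production code usually computes the same factorization with modified Gram-Schmidt or Householder reflections.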

Here is a link to the JavaScript program I wrote to show how QR Decomposition works.
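
As a usage example, here is a hypothetical solver built on the qrDecompose sketch above: it computes y = Qᵀb and then back-substitutes through the triangular matrix R.

```typescript
// Solve A x = b given the factors from qrDecompose.
// Assumes R is upper triangular with a nonzero diagonal.
function solveQR(Q: number[][], R: number[][], b: number[]): number[] {
  const n = R.length;
  // y = Q^T b: Q is stored column-wise, so y[j] = <e_j, b>.
  const y = Q.map(col => col.reduce((s, ci, i) => s + ci * b[i], 0));
  // Back substitution on R x = y.
  const x = new Array(n).fill(0);
  for (let i = n - 1; i >= 0; i--) {
    let s = y[i];
    for (let j = i + 1; j < n; j++) s -= R[i][j] * x[j];
    x[i] = s / R[i][i];
  }
  return x;
}

// Example: x1 + x2 = 3 and x1 - x2 = 1 have the solution x = [2, 1].
const { Q, R } = qrDecompose([[1, 1], [1, -1]]); // columns of A
console.log(solveQR(Q, R, [3, 1]));              // ~ [2, 1]
```

When A has more rows than columns, the same two steps return the least squares solution rather than an exact one.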


