Suppose we have a problem that can be modeled by the system of equations Ax = b with a matrix A and a vector b. We have already shown how to use Gaussian elimination to solve such systems, but I would like to introduce you to another method – QR decomposition. In addition to solving the general system of equations, this method is used in a number of other algorithms, including linear regression analysis to solve the least squares problem, and in finding the eigenvalues of a matrix.
Generally, when we see a system of equations, the standard way to solve it is Gaussian elimination, or LU decomposition. However, for some matrices – in particular, ill-conditioned ones or those with very small pivot elements – this method can lead to serious rounding errors. In such cases, we need a more stable algorithm, which is where the QR decomposition method becomes important.
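To see the kind of rounding error meant here, consider a minimal sketch (my own illustration, not from the original post): Gaussian elimination *without* row pivoting applied to a system whose first pivot is tiny. The huge multiplier swamps the other entries in floating point, and one component of the solution is lost entirely.

```python
import numpy as np

def gauss_no_pivot(A, b):
    """Naive Gaussian elimination without row pivoting (for illustration only)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]      # huge multiplier when the pivot is tiny
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # back substitution
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# The tiny pivot 1e-20 makes the elimination numerically unstable;
# the true solution is approximately x = [1, 1].
A = np.array([[1e-20, 1.0],
              [1.0,   1.0]])
b = np.array([1.0, 2.0])

x_naive = gauss_no_pivot(A, b)   # the x_1 component is lost to rounding
x_good = np.linalg.solve(A, b)   # a stable solver recovers [1, 1]
print(x_naive, x_good)
```

In practice, partial pivoting fixes this particular example, but QR decomposition avoids the issue altogether because multiplying by an orthogonal matrix does not amplify rounding errors.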
Any matrix A of dimensions m by n with m >= n can be represented as the product A = QR of an m by m orthogonal matrix Q, i.e. Q^{T}Q = I, and an m by n matrix of the form [R 0]^{T}, where R is an n by n upper triangular matrix. To perform this decomposition, we first treat the columns of the matrix A as vectors a_{1}, …, a_{n}. Then, for each vector a_{k}, we calculate the vectors u_{k} and e_{k} given by
u_{k} = a_{k} − Σ_{j = 1}^{k−1} proj_{e_{j}} a_{k}, where proj_{e_{j}} a_{k} = (a_{k} · e_{j}) e_{j}
and
e_{k} = u_{k} / ‖u_{k}‖
Then Q = [e_{1}, …, e_{n}] and
R = Q^{T}A =
[ a_{1}·e_{1}   a_{2}·e_{1}   …   a_{n}·e_{1} ]
[      0        a_{2}·e_{2}   …   a_{n}·e_{2} ]
[      ⋮             ⋮         ⋱        ⋮      ]
[      0             0        …   a_{n}·e_{n} ]
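The Gram–Schmidt steps above can be sketched in code. This is my own minimal illustration (not the JavaScript program linked below), assuming A has full column rank; it builds Q column by column from the e_{k}, fills in R with the dot products a_{k}·e_{j}, and then solves Ax = b via Rx = Q^{T}b and back substitution.

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR decomposition of an m x n matrix A (m >= n, full column rank)
    via the Gram-Schmidt process described above."""
    A = A.astype(float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        u = A[:, k].copy()
        for j in range(k):                  # subtract projections onto e_1 .. e_{k-1}
            R[j, k] = Q[:, j] @ A[:, k]     # r_{jk} = a_k . e_j
            u -= R[j, k] * Q[:, j]
        R[k, k] = np.linalg.norm(u)
        Q[:, k] = u / R[k, k]               # e_k = u_k / ||u_k||
    return Q, R

def solve_qr(A, b):
    """Solve Ax = b for square A using QR: Rx = Q^T b, then back substitution."""
    Q, R = gram_schmidt_qr(A)
    y = Q.T @ b
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

A = np.array([[12., -51.,   4.],
              [ 6., 167., -68.],
              [-4.,  24., -41.]])
b = np.array([1., 2., 3.])

Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))  # True True
x = solve_qr(A, b)
print(np.allclose(A @ x, b))                                   # True
```

Note that back substitution is cheap because R is upper triangular; the only inversion-like work in the whole solve is the transpose of Q.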
Here is a link to the JavaScript program I wrote to show how QR Decomposition works.
Charles,
Could you please elaborate on your comment that rounding errors may occur with Gaussian elimination / LU factorization?
I would be happy to better understand this point.
Thanks
Sebastian