
QR decomposition

The meaning of «QR decomposition»

In linear algebra, a QR decomposition, also known as a QR factorization or QU factorization, is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R. QR decomposition is often used to solve the linear least squares problem and is the basis for a particular eigenvalue algorithm, the QR algorithm.
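For instance, once A = QR is available, the least squares problem of minimizing ‖Ax − b‖ reduces to the triangular system $Rx = Q^{\mathsf{T}}b$. A minimal sketch in Python using NumPy (the matrix A and vector b below are made-up illustration data):

```python
import numpy as np

# Hypothetical overdetermined system: 5 equations, 3 unknowns.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])
b = np.array([1.0, 2.0, 0.0, 1.0, 3.0])

# Reduced QR: Q is 5x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A)

# Least squares solution: solve R x = Q^T b (R is square and invertible
# because A has full column rank).
x = np.linalg.solve(R, Q.T @ b)

# Agrees with NumPy's own least squares solver.
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```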

Any real square matrix A may be decomposed as

$$A = QR,$$

where Q is an orthogonal matrix (its columns are orthogonal unit vectors, meaning $Q^{\mathsf{T}} = Q^{-1}$) and R is an upper triangular matrix (also called a right triangular matrix). If A is invertible, then the factorization is unique if we require the diagonal elements of R to be positive.
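Library routines do not necessarily return this normalized form; as a sketch (assuming NumPy), the signs can be fixed afterwards so that the diagonal of R becomes positive:

```python
import numpy as np

def qr_positive_diagonal(A):
    """QR factorization of a real square invertible A with diag(R) > 0.

    NumPy's qr() may return negative diagonal entries in R; flipping the
    sign of the corresponding columns of Q and rows of R gives the unique
    factorization described above.
    """
    Q, R = np.linalg.qr(A)
    signs = np.sign(np.diag(R))
    signs[signs == 0] = 1.0          # cannot occur if A is invertible
    Q = Q * signs                    # flip the affected columns of Q
    R = signs[:, None] * R           # flip the matching rows of R
    return Q, R

A = np.array([[2.0, 1.0], [1.0, 3.0]])
Q, R = qr_positive_diagonal(A)
assert np.allclose(Q @ R, A) and np.all(np.diag(R) > 0)
```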

If instead A is a complex square matrix, then there is a decomposition A = QR where Q is a unitary matrix (so $Q^{*} = Q^{-1}$).

If A has n linearly independent columns, then the first n columns of Q form an orthonormal basis for the column space of A. More generally, the first k columns of Q form an orthonormal basis for the span of the first k columns of A for any 1 ≤ k ≤ n.[1] The fact that any column k of A only depends on the first k columns of Q is responsible for the triangular form of R.[1]

More generally, we can factor a complex m×n matrix A, with m ≥ n, as the product of an m×m unitary matrix Q and an m×n upper triangular matrix R. As the bottom (m − n) rows of an m×n upper triangular matrix consist entirely of zeroes, it is often useful to partition R, or both R and Q:

$$A = QR = Q\begin{bmatrix} R_1 \\ 0 \end{bmatrix} = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix}\begin{bmatrix} R_1 \\ 0 \end{bmatrix} = Q_1 R_1,$$

where $R_1$ is an n×n upper triangular matrix, 0 is an (m − n)×n zero matrix, $Q_1$ is m×n, $Q_2$ is m×(m − n), and $Q_1$ and $Q_2$ both have orthogonal columns.

Golub & Van Loan (1996, §5.2) call $Q_1 R_1$ the thin QR factorization of A; Trefethen and Bau call this the reduced QR factorization.[1] If A is of full rank n and we require that the diagonal elements of $R_1$ are positive, then $R_1$ and $Q_1$ are unique, but in general $Q_2$ is not. $R_1$ is then equal to the upper triangular factor of the Cholesky decomposition of $A^{*} A$ (= $A^{\mathsf{T}} A$ if A is real).
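A small sketch (again assuming NumPy) of the two shapes, and of the relation between $R_1$ and the Cholesky factor of $A^{\mathsf{T}} A$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))          # m = 6, n = 3, full rank (almost surely)

# Thin/reduced factorization: Q1 is 6x3, R1 is 3x3.
Q1, R1 = np.linalg.qr(A, mode='reduced')

# Full factorization: Q is 6x6 unitary, R is 6x3 with zero bottom rows.
Q, R = np.linalg.qr(A, mode='complete')

assert Q1.shape == (6, 3) and R1.shape == (3, 3)
assert Q.shape == (6, 6) and R.shape == (6, 3)
assert np.allclose(R[3:], 0.0)           # bottom (m - n) rows are zero

# Up to the signs of its rows, R1 is the upper triangular Cholesky
# factor of A^T A, i.e. A^T A = R1^T R1.
L = np.linalg.cholesky(A.T @ A)          # lower triangular, A^T A = L L^T
assert np.allclose(np.abs(R1), np.abs(L.T))
```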

Analogously, we can define QL, RQ, and LQ decompositions, with L being a lower triangular matrix.

There are several methods for actually computing the QR decomposition, such as by means of the Gram–Schmidt process, Householder transformations, or Givens rotations. Each has a number of advantages and disadvantages.

Consider the Gram–Schmidt process applied to the columns of the full column rank matrix $A = \begin{bmatrix}\mathbf{a}_1 & \cdots & \mathbf{a}_n\end{bmatrix}$, with inner product $\langle \mathbf{v}, \mathbf{w} \rangle = \mathbf{v}^{\mathsf{T}} \mathbf{w}$ (or $\langle \mathbf{v}, \mathbf{w} \rangle = \mathbf{v}^{*} \mathbf{w}$ for the complex case).
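As a sketch of how the process yields both factors (written in Python with NumPy for the real case; the function name is just for illustration), each column $\mathbf{a}_k$ is orthogonalized against the previously computed columns of Q, and the coefficients used along the way fill the k-th column of R:

```python
import numpy as np

def gram_schmidt_qr(A):
    """Reduced QR factorization via classical Gram-Schmidt.

    Assumes A (m x n, m >= n) has full column rank; not numerically
    robust for ill-conditioned A (Householder reflections are usually
    preferred in practice).
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        v = A[:, k].copy()
        for i in range(k):
            R[i, k] = Q[:, i] @ A[:, k]     # projection coefficient <q_i, a_k>
            v -= R[i, k] * Q[:, i]          # remove the component along q_i
        R[k, k] = np.linalg.norm(v)
        Q[:, k] = v / R[k, k]               # normalize what is left
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A) and np.allclose(Q.T @ Q, np.eye(2))
```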
