QR Decomposition using Python
Python3
import numpy as np

# Create a numpy array
arr = np.array([[1, 2, 4], [0, 0, 5], [0, 3, 6]])
print(arr)

# Find the QR factorization of the array
q, r = np.linalg.qr(arr)
print('\nQ:\n', q)
print('\nR:\n', r)

# Check that the result is correct
print(np.allclose(arr, np.dot(q, r)))
Output:
[[1 2 4]
[0 0 5]
[0 3 6]]
Q:
[[ 1. 0. 0.]
[ 0. 0. -1.]
[ 0. -1. 0.]]
R:
[[ 1. 2. 4.]
[ 0. -3. -6.]
[ 0. 0. -5.]]
True
Mathematical explanation
Let's understand the QR decomposition process by working through an example.

Suppose we are provided with the matrix A:

A = \begin{bmatrix} 1 & 2 & 4 \\ 0 & 0 & 5 \\ 0 & 3 & 6 \end{bmatrix}

As mentioned in the steps before, we will be using Gram-Schmidt orthogonalization to find the orthonormal vectors q_1, q_2 and q_3.
First, we normalize the first column. Its norm is

\|a_1\| = \sqrt{1^2 + 0^2 + 0^2} = 1

so the first normalized vector is

q_1 = \frac{a_1}{\|a_1\|} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}
Next, the inner product of a_2 and q_1 is \langle a_2, q_1 \rangle = 2, and the projection of the second column onto q_1 is q_1 multiplied by this inner product. Subtracting the projection from a_2 gives the residual:
a_2 - \langle a_2, q_1 \rangle q_1 = \begin{bmatrix} 2 \\ 0 \\ 3 \end{bmatrix} - 2 \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 3 \end{bmatrix}
Now, we normalize the residual (its norm is 3):

q_2 = \frac{1}{3} \begin{bmatrix} 0 \\ 0 \\ 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}
Now, we project a_3 onto q_1 and q_2, with \langle a_3, q_1 \rangle = 4 and \langle a_3, q_2 \rangle = 6, and subtract both projections:

a_3 - \langle a_3, q_1 \rangle q_1 - \langle a_3, q_2 \rangle q_2 = \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix} - 4 \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} - 6 \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 5 \\ 0 \end{bmatrix}
Now, we normalize the residual (its norm is 5):

q_3 = \frac{1}{5} \begin{bmatrix} 0 \\ 5 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}
We have obtained the Q matrix:

Q = \begin{bmatrix} q_1 & q_2 & q_3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}

The corresponding R is an upper triangular matrix built from the norms and inner products computed above:

R = \begin{bmatrix} 1 & 2 & 4 \\ 0 & 3 & 6 \\ 0 & 0 & 5 \end{bmatrix}
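The steps above can be sketched as a small classical Gram-Schmidt routine. This is a minimal illustration of the hand calculation, not a numerically robust implementation, and the function name gram_schmidt_qr is our own:

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR via classical Gram-Schmidt (illustrative, not numerically robust)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # inner product <a_j, q_i>
            v -= R[i, j] * Q[:, i]        # subtract the projection onto q_i
        R[j, j] = np.linalg.norm(v)       # norm of the residual
        Q[:, j] = v / R[j, j]             # normalize the residual
    return Q, R

A = np.array([[1, 2, 4], [0, 0, 5], [0, 3, 6]])
Q, R = gram_schmidt_qr(A)
print(Q)                       # matches the hand-computed Q
print(R)                       # matches the hand-computed R
print(np.allclose(A, Q @ R))   # True
```

Running this on our example reproduces the Q and R derived by hand, which differ from NumPy's output only in signs.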
The hand-calculated factors (Q = [q_1 q_2 q_3], so that A = QR) differ from the values produced by the NumPy package. The reason is described below.
Reason the NumPy results differ from our step-by-step calculation:
The QR decomposition is not unique all the way down to the signs: one can flip the signs of columns in Q as long as the corresponding rows of R have their signs flipped too. Some implementations enforce a positive diagonal in R, but this is just a convention. Since NumPy defers these linear algebra operations to LAPACK, it follows LAPACK's conventions, which do not enforce such a requirement.
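The sign ambiguity can be checked directly: multiplying Q on the right and R on the left by the same diagonal sign matrix leaves the product QR unchanged, because the sign matrix squares to the identity. A minimal sketch:

```python
import numpy as np

arr = np.array([[1, 2, 4], [0, 0, 5], [0, 3, 6]])
q, r = np.linalg.qr(arr)

# Flip the sign of the last two columns of Q and the
# corresponding rows of R; the product is unchanged.
signs = np.diag([1, -1, -1])
q2 = q @ signs        # flip columns of Q
r2 = signs @ r        # flip the matching rows of R

print(np.allclose(arr, q2 @ r2))  # True: still reconstructs A
```

With this particular sign choice, q2 and r2 coincide with the Q and R computed by hand above.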
QR Decomposition in Machine learning
QR decomposition is a way of expressing a matrix as the product of two matrices: Q (an orthogonal matrix) and R (an upper triangular matrix). It is one of several decompositions in linear algebra, and in machine learning it is commonly used as a numerically stable way to solve least-squares problems such as linear regression.
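As one illustration of this use, a least-squares fit can be solved through the QR factors of the design matrix: with A = QR, minimizing \|Ax - b\| reduces to solving the triangular system R x = Q^T b. The data below is made up for illustration:

```python
import numpy as np

# Toy linear-regression data (made up for illustration):
# columns of A are [1, x], fitting y = c0 + c1 * x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y = 1 + 2x
A = np.column_stack([np.ones_like(x), x])

# Solve the least-squares problem via QR: A = QR, so R c = Q^T y
Q, R = np.linalg.qr(A)
coef = np.linalg.solve(R, Q.T @ y)
print(coef)  # approximately [1.0, 2.0]
```

Because Q has orthonormal columns, Q^T Q = I, so this avoids forming the ill-conditioned normal equations A^T A x = A^T b directly.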