Reduced singular value decomposition (SVD) factors a matrix $A$ of rank $r$ as $A = U \Sigma V^T$: a left singular vector matrix $U$, an $r \times r$ diagonal matrix $\Sigma$ of singular values, and a right singular vector matrix $V$.
Both $U$ and $V$ have orthonormal columns, so $U^T U = V^T V = I$.
One way to calculate the reduced SVD is as follows:

$$A = U \Sigma V^T$$

$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma U^T U \Sigma V^T = V \Sigma^2 V^T$$

$$A^T A V = V \Sigma^2$$

Note that the last equation fits the pattern of eigendecomposition, $BX = X\Lambda$, so eigendecomposing $A^T A$ finds $\Sigma^2$ and $V$.

Then, $U$ can be obtained by

$$U = A V \Sigma^{-1}$$
Alternatively,

$$A A^T = (U \Sigma V^T) (U \Sigma V^T)^T = U \Sigma V^T V \Sigma U^T = U \Sigma^2 U^T$$

$$A A^T U = U \Sigma^2$$

The last equation also fits the pattern of eigendecomposition, $BX = X\Lambda$, so eigendecomposing $A A^T$ finds $\Sigma^2$ and $U$.

Then, $V$ can be found by

$$V = A^T U \Sigma^{-1}$$
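As a sanity check, the alternative route can be run with plain sympy on a concrete matrix (this snippet is an illustration, not part of the helper module): eigendecompose $A A^T$ to get $U$ and $\Sigma$, then recover $V$ as $A^T U \Sigma^{-1}$.

```python
import sympy as sp

A = sp.Matrix([[3, 2, 2], [2, 3, -2]])

# Eigendecompose A A^T: eigenvalues are the squared singular values,
# eigenvectors are columns of U. Drop zero eigenvalues for the reduced form.
pairs = []
for eigval, _, vecs in (A @ A.T).eigenvects():
    if eigval != 0:
        for u in vecs:
            pairs.append((eigval, u.normalized()))

# Sort singular values in descending order, as is conventional.
pairs.sort(key=lambda p: p[0], reverse=True)
Sigma = sp.diag(*[sp.sqrt(ev) for ev, _ in pairs])
U = sp.Matrix.hstack(*[u for _, u in pairs])

# V = A^T U Σ⁻¹ from the derivation above.
V = A.T @ U @ Sigma.inv()

assert sp.simplify(U @ Sigma @ V.T - A) == sp.zeros(*A.shape)
```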
For a comparison of full SVD, reduced SVD, and truncated SVD, see https://math.stackexchange.com/questions/2627005/are-reduced-svd-and-truncated-svd-the-same-thing.
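The helper module `svd_reduced_utils` used below is not shown; a minimal sketch of what its `svd` function might look like, assuming it follows the $A^T A$ route derived above (the function name and structure are assumptions):

```python
import sympy as sp


def svd(A):
    """Reduced SVD: return (U, Σ, V) with A == U * Σ * V.T,
    keeping only the nonzero singular values."""
    pairs = []
    # Eigendecompose A^T A: eigenvalues are σ², eigenvectors are columns of V.
    for eigval, _, vecs in (A.T @ A).eigenvects():
        if eigval == 0:
            continue  # drop null-space directions for the reduced form
        # Orthonormalize within each eigenspace (matters for repeated eigenvalues).
        for v in sp.GramSchmidt(vecs, orthonormal=True):
            pairs.append((eigval, v))
    # Sort singular values in descending order, as is conventional.
    pairs.sort(key=lambda p: p[0], reverse=True)
    Sigma = sp.diag(*[sp.sqrt(ev) for ev, _ in pairs])
    V = sp.Matrix.hstack(*[v for _, v in pairs])
    # U = A V Σ⁻¹ from the derivation above.
    U = A @ V @ Sigma.inv()
    return U, Sigma, V
```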
import sympy as sp
import svd_reduced_utils
# An example taken from http://www.d.umn.edu/~mhampton/m4326svd_example.pdf
A = sp.Matrix(
[
[3, 2, 2],
[2, 3, -2],
]
)
A
A.rank()
2
U, Σ, V = svd_reduced_utils.svd(A)
U
Σ
V.T
U @ Σ @ V.T
assert A == U @ Σ @ V.T
A = sp.Matrix(
[
[3, 2],
[2, 3],
[2, -2],
]
)
A
A.rank()
2
U, Σ, V = svd_reduced_utils.svd(A)
U
Σ
V.T
U @ Σ @ V.T
assert A == U @ Σ @ V.T
# An example taken from http://www.d.umn.edu/~mhampton/m4326svd_example.pdf
A = sp.Matrix(
[
[3, 2, 2],
[3, 2, 2],
]
)
A
A.rank()
1
U, Σ, V = svd_reduced_utils.svd(A)
U
Σ
V.T
U @ Σ @ V.T
assert A == U @ Σ @ V.T
A = sp.Matrix(
[
[3, 3],
[2, 2],
[2, 2],
]
)
A
A.rank()
1
U, Σ, V = svd_reduced_utils.svd(A)
U
Σ
V.T
U @ Σ @ V.T
assert A == U @ Σ @ V.T