Notice that if $A = U\Sigma V^H$, then
$$ A^H A = V \Sigma U^H U \Sigma V^H = V \Sigma^2 V^H . $$
That is, to multiply $A^H A x$, you (1) compute $V^H x$ (the $V$ components of $x$), then (2) multiply each component by $\sigma^2$, and finally (3) multiply the coefficients by $V$ and add up. It follows that the right singular vectors (the columns of $V$) are eigenvectors of $A^H A$, with eigenvalues $\lambda = \sigma^2$.
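For example, here is a quick numerical check of this three-step recipe (a minimal sketch with a random matrix B and vector x, not part of the example below; in Julia ≥ 0.7 the svd and norm functions live in the LinearAlgebra standard library):

using LinearAlgebra

B = randn(4,3)                  # a random example matrix
x = randn(3)
Ub, σb, Vb = svd(B)
Vb * (σb.^2 .* (Vb'x)) ≈ B'B*x  # steps (1)-(3) reproduce AᴴA*x: should print true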
Similarly,
$$ A A^H = U \Sigma V^H V \Sigma U^H = U \Sigma^2 U^H , $$
so the left singular vectors (the columns of $U$) are eigenvectors of $A A^H$, with the same eigenvalues $\sigma^2$ (plus additional zero eigenvalues if $A$ has more rows than columns).
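The same check works for $A A^H$ (continuing the sketch above, with a length-4 vector y):

y = randn(4)
Ub * (σb.^2 .* (Ub'y)) ≈ B*B'*y  # should also print true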
We can easily check this:
A = randn(5,3)
5×3 Array{Float64,2}:
  0.202935  -0.810741   0.379812
  0.317852   1.17222    0.0789665
 -1.58283   -0.524304   0.949145
  0.122448   1.57466    0.693527
 -0.496476   1.13621    0.511883
Note that in this case, $A$ is a $5 \times 3$ matrix of rank 3.
U, σ, V = svd(A)
σ.^2 # the σ² values
3-element Array{Float64,1}:
 6.31957
 4.01707
 0.443596
$A^H A$ is a $3 \times 3$ matrix of rank 3 with three nonzero eigenvalues that equal the singular values squared:
eigvals(A'*A) # AᴴA has the same eigenvals!
3-element Array{Float64,1}:
 0.443596
 4.01707
 6.31957
$AA^H$ is a $5 \times 5$ matrix of rank 3 (recall that the ranks of $A$, $AA^H$, and $A^H A$ are all equal!). It has 3 nonzero eigenvalues that equal the $\sigma^2$ values, and 2 zero eigenvalues corresponding to the two-dimensional nullspace $$ N(AA^H) = N(A^H) = C(A)^\perp = C(U)^\perp . $$ That is, the eigenvectors for the zero eigenvalues are those perpendicular to the left singular vectors $U$.
eigvals(A*A') # the same *nonzero* eigenvalues
5-element Array{Float64,1}:
 -1.71137e-16
  8.87578e-16
  0.443596
  4.01707
  6.31957
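We can also verify the nullspace claim numerically. Assuming eigvecs returns the eigenvectors in the same ascending order as the eigenvalues above, the first two columns (for the zero eigenvalues) should be perpendicular to the columns of $U$:

X = eigvecs(A*A')     # columns = eigenvectors, in ascending order of eigenvalue
norm(U' * X[:,1:2])   # ≈ 0: the two λ=0 eigenvectors are ⊥ the columns of U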
We can also check the eigenvectors; e.g. the eigenvectors of $A^H A$ should match $V$:
eigvecs(A'A)
3×3 Array{Float64,2}:
  0.564298  -0.817673    0.113922
 -0.203239  -0.00384465  0.979122
  0.800163   0.57567     0.168353
V
3×3 Array{Float64,2}:
 -0.113922  -0.817673   -0.564298
 -0.979122  -0.00384465  0.203239
 -0.168353   0.57567    -0.800163
Yes, they match, up to a sign flip of each column (each eigenvector is only determined up to an overall ± sign).
(Note that the columns are in reverse order, because svdvals are by default sorted in descending order in Julia, whereas eigvals of Hermitian matrices are sorted in ascending order.)
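To make the comparison explicit, we can reverse the column order and fix each column's sign, after which the two matrices should agree exactly (a small sketch; the dims keyword assumes Julia ≥ 0.7):

W = eigvecs(A'A)[:, end:-1:1]         # reverse the columns into descending-σ order
W .* sign.(sum(W .* V, dims=1)) ≈ V   # flip the sign of each column to match V: true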
This, in principle, finally gives us a way to compute the SVD of a matrix: just find the eigenvectors and eigenvalues of $A^H A$. (Note that $AV = U\Sigma$, so that once you have $V$ and $\Sigma$ you can get $U$.)
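Here is that procedure in code (a sketch that assumes $A$ has full column rank, so that all of the singular values are nonzero; eigen is the Julia ≥ 0.7 name for eig):

λ, W = eigen(A'A)            # eigenvalues and eigenvectors of AᴴA, ascending order
s = sqrt.(reverse(λ))        # singular values, re-sorted into descending order
V2 = reverse(W, dims=2)      # right singular vectors, in the matching order
U2 = A*V2 ./ s'              # AV = UΣ, so U = AVΣ⁻¹: scale column j by 1/σⱼ
U2 * Diagonal(s) * V2' ≈ A   # reassembling the SVD recovers A: should print true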
In practice, computers use a different way to compute the SVD. (The most famous practical method is called "Golub-Kahan bidiagonalization.") In 18.06, we are content to let the svd function be a "black box," much like eig.
The fact that the singular values/vectors are related to eigenvalues of $A^H A$ and $A A^H$ has lots of important applications. Perhaps most famously, it means that the SVD diagonalizes the "covariance matrix" in statistics, which gives rise to the statistical method of principal component analysis (PCA).
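For instance, here is a minimal PCA-via-SVD sketch (with a hypothetical data matrix X whose rows are samples; mean comes from the Statistics standard library):

using Statistics: mean

X = randn(100, 3)            # 100 hypothetical samples of 3 variables
Xc = X .- mean(X, dims=1)    # center the data: subtract each column's mean
Up, σp, Vp = svd(Xc)
# the columns of Vp are the principal directions (the eigenvectors of the
# covariance matrix XcᴴXc/(n-1)), and σ²/(n-1) is the variance along each one:
variances = σp.^2 ./ (size(X,1) - 1)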