If we encounter a zero pivot (or even just a small pivot, on a computer) during Gaussian elimination, we normally swap rows to bring a nonzero pivot up from a subsequent row. However, what if there are no nonzero values below the pivot in that column? This is called a singular matrix: we can still proceed with Gaussian elimination, but we can't get rid of the zero pivot.
If you have $Ax=b$ where $A$ is singular, then there will typically (for most right-hand sides $b$) be no solutions, but there will occasionally (for very special $b$) be infinitely many solutions. (For $2 \times 2$ matrices, solving $Ax=b$ corresponds to finding the intersection of two lines, and a singular case corresponds to two parallel lines — either there are no intersections, or they intersect everywhere.)
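To make this concrete with the smallest possible case (a made-up $2 \times 2$ illustration), consider
$$ \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} . $$The two equations describe the parallel lines $x_1 + 2x_2 = b_1$ and $2x_1 + 4x_2 = b_2$: there is no solution unless $b_2 = 2b_1$, and in that special case every point on the line $x_1 + 2x_2 = b_1$ is a solution.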
For example, consider the following $4 \times 4$ matrix $A=LU$:
$$ \underbrace{\begin{pmatrix} 2 & -1 & 0 & 3 \\ 4 & -1 & 1 & 8 \\ 6 & 1 & 4 & 15 \\ 2 & -1 & 0 & 0 \\ \end{pmatrix}}_A = \underbrace{\begin{pmatrix} 1 & 0 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ 3 & 4 & 1 & 0 \\ 1 & 0 & 2 & 1 \\ \end{pmatrix}}_L \underbrace{\begin{pmatrix} \color{blue}{2} & -1 & 0 & 3 \\ 0 & \color{blue}{1} & 1 & 2 \\ 0 & 0 & \color{red}{0} & \color{blue}{-2} \\ 0 & 0 & 0 & 1 \\ \end{pmatrix}}_U $$In the third column, we got zeros where we were hoping for a pivot. So, we only have three pivots (blue) in this case. Now, suppose we want to solve $Ax=b$. We first solve $Lc=b$ to apply the elimination steps to $b$. This is no problem since $L$ has 1's along the diagonal. Suppose we get $c = (c_1, c_2, c_3, c_4)$. Then we proceed by backsubstitution to solve $Ux = c$, starting with the last row of $U$:
$$ 1 \times x_4 = c_4 \implies x_4 = c_4 \\ \color{red}{0 \times x_3} - 2 \times x_4 = c_3 \implies \mbox{no solution unless } -2 x_4 = -2 c_4 = c_3 $$For very special right-hand sides, where $c_3 = -2c_4$, we can plug in any $x_3$ and get a solution (infinitely many solutions). Otherwise, we get no solutions.
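We can check this numerically. Below is a quick sketch in Julia (the variable names and the choice of $b$ are just for illustration): build $A = LU$, forward-substitute to get $c$, and test the consistency condition $c_3 = -2c_4$.

```julia
L = [1 0 0 0; 2 1 0 0; 3 4 1 0; 1 0 2 1]
U = [2 -1 0 3; 0 1 1 2; 0 0 0 -2; 0 0 0 1]
A = L * U                      # the singular matrix from above

b = [1, 2, 3, 4]               # a "typical" right-hand side
c = L \ b                      # forward substitution gives c = (c1, c2, c3, c4)
c[3] ≈ -2c[4]                  # false ⇒ Ux = c, and hence Ax = b, has no solution

b_special = A * [1, 1, 1, 1]   # a right-hand side that A can actually produce
c = L \ b_special
c[3] ≈ -2c[4]                  # true ⇒ infinitely many solutions x
```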
[1 0 0 0
2 1 0 0
3 4 1 0
1 0 2 1 ] *
[2 -1 0 3
0 1 1 2
0 0 0 -2
0 0 0 1 ]
4×4 Matrix{Int64}:
 2  -1  0   3
 4  -1  1   8
 6   1  4  15
 2  -1  0   0
You may think that singular cases are not very interesting. In reality, exactly singular square matrices never occur by accident. There is always some deep structure of the underlying problem that causes the singularity, and understanding this structure is always interesting.
On the other hand, nearly singular matrices (where the pivots are nonzero but very small) can occur by accident, and dealing with them is often a delicate problem because they are very sensitive to roundoff errors. (We call these matrices ill-conditioned.) But that's mostly not a topic for 18.06.
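To see what "very sensitive" means, here is a tiny made-up illustration: a nearly singular $2 \times 2$ matrix (its second pivot is $10^{-10}$), where changing $b$ by about $10^{-8}$ changes the solution completely.

```julia
A = [1.0 1.0
     1.0 1.0+1e-10]       # nearly singular: the second pivot is 1e-10
A \ [2.0, 2.0]            # ≈ (2, 0)
A \ [2.0, 2.0 + 1e-8]     # ≈ (-98, 100): a tiny change in b, a huge change in x
```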
Singular non-square systems, where you have more equations than unknowns, are very common and important, and they lead to fitting problems in which one minimizes the error in the solution. We will talk more about this soon in 18.06.
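As a tiny preview (getting ahead of ourselves), Julia's backslash already handles this: for a "tall" matrix with more rows than columns, the expression A \ b returns the least-squares solution, the x minimizing the length of the error $Ax - b$. The data points here are made up.

```julia
# Fit a line c + m*t to the points (1,1), (2,2), (3,2):
# three equations in two unknowns, with no exact solution.
A = [1.0 1.0
     1.0 2.0
     1.0 3.0]
b = [1.0, 2.0, 2.0]
x = A \ b          # least-squares solution ≈ (2/3, 1/2); A*x only approximates b
```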
Some matrices are more singular than others. For example, a $4 \times 4$ matrix might have only two pivots:
$$ \underbrace{\begin{pmatrix} 2 & -1 & 0 & 3 \\ 4 & -2 & 1 & 8 \\ 6 & -3 & 4 & 17 \\ 2 & -1 & 0 & 3 \\ \end{pmatrix}}_A = \underbrace{\begin{pmatrix} 1 & 0 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ 3 & 4 & 1 & 0 \\ 1 & 0 & 2 & 1 \\ \end{pmatrix}}_L \underbrace{\begin{pmatrix} \color{blue}{2} & -1 & 0 & 3 \\ 0 & 0 & \color{blue}{1} & 2 \\ 0 & 0 & \color{red}{0} & \color{red}{0} \\ 0 & 0 & 0 & \color{red}{0} \\ \end{pmatrix}}_U $$or one pivot:
$$ \underbrace{\begin{pmatrix} 2 & -1 & 0 & 3 \\ 4 & -2 & 0 & 6 \\ 6 & -3 & 0 & 9 \\ 2 & -1 & 0 & 3 \\ \end{pmatrix}}_A = \underbrace{\begin{pmatrix} 1 & 0 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ 3 & 4 & 1 & 0 \\ 1 & 0 & 2 & 1 \\ \end{pmatrix}}_L \underbrace{\begin{pmatrix} \color{blue}{2} & -1 & 0 & 3 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{pmatrix}}_U $$or zero pivots:
$$ \underbrace{\begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{pmatrix}}_A = \underbrace{\begin{pmatrix} 1 & 0 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ 3 & 4 & 1 & 0 \\ 1 & 0 & 2 & 1 \\ \end{pmatrix}}_L \underbrace{\begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{pmatrix}}_U $$If $A$ is the zero matrix, then $Ax=b$ only has solutions when $b=0$, and then any $x$ is a solution!
Intuitively, having fewer pivots seems "more singular": it requires "more coincidences" in the right-hand side for a solution to exist, and there is a "bigger infinity" of solutions when one does exist. We will quantify these intuitions in 18.06, starting with the notion of the [rank](https://en.wikipedia.org/wiki/Rank_(linear_algebra)) of a matrix.
The rank $r$ of an $m \times n$ matrix $A$ is the number of (nonzero) pivots obtained by elimination (with row swaps if needed).
$r \le m$ and $r \le n$ because you can't have more pivots than you have rows or columns.
The smaller the rank is compared to the size of the matrix, the "more singular" it is. Pretty soon we will understand this better.
[1 0 0 0
2 1 0 0
3 4 1 0
1 0 2 1 ] *
[2 -1 0 3
0 0 1 2
0 0 0 0
0 0 0 0 ]
4×4 Matrix{Int64}:
 2  -1  0   3
 4  -2  1   8
 6  -3  4  17
 2  -1  0   3
[1 0 0 0
2 1 0 0
3 4 1 0
1 0 2 1 ] *
[2 -1 0 3
0 0 0 0
0 0 0 0
0 0 0 0 ]
4×4 Matrix{Int64}:
 2  -1  0  3
 4  -2  0  6
 6  -3  0  9
 2  -1  0  3
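As a sanity check, the rank function from Julia's LinearAlgebra standard library (which computes the rank numerically from the singular value decomposition, a tool we haven't covered yet) agrees with the pivot counts for the example matrices above:

```julia
using LinearAlgebra

A3 = [2 -1 0 3; 4 -1 1 8; 6 1 4 15; 2 -1 0 0]     # three pivots
A2 = [2 -1 0 3; 4 -2 1 8; 6 -3 4 17; 2 -1 0 3]    # two pivots
A1 = [2 -1 0 3; 4 -2 0 6; 6 -3 0 9; 2 -1 0 3]     # one pivot
rank(A3), rank(A2), rank(A1), rank(zeros(4, 4))   # (3, 2, 1, 0)
```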
Note that if we encounter zeros in a column where we were hoping for a pivot, and we can't get a nonzero element by swapping rows, we skip to the next column. The following example is rank 2, not rank 0:
$$ \underbrace{\begin{pmatrix} 0 & -1 & 0 & 3 \\ 0 & -2 & 0 & 8 \\ 0 & -3 & 0 & 17 \\ 0 & -1 & 0 & 3 \\ \end{pmatrix}}_A = \underbrace{\begin{pmatrix} 1 & 0 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ 3 & 4 & 1 & 0 \\ 1 & 0 & 2 & 1 \\ \end{pmatrix}}_L \underbrace{\begin{pmatrix} 0 & \color{blue}{-1} & 0 & 3 \\ 0 & 0 & 0 & \color{blue}{2} \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ \end{pmatrix}}_U $$That is, if we encounter all zeros in a column where we were hoping for a pivot, we skip to the next column for our pivot and continue eliminating below the pivots.
[1 0 0 0
2 1 0 0
3 4 1 0
1 0 2 1 ] *
[0 -1 0 3
0 0 0 2
0 0 0 0
0 0 0 0 ]
4×4 Matrix{Int64}:
 0  -1  0   3
 0  -2  0   8
 0  -3  0  17
 0  -1  0   3
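Here is a minimal sketch, in Julia, of elimination that counts pivots this way (illustrative only, with a made-up function name, not how a real library computes rank): whenever no usable pivot is available at or below the current row, it simply moves on to the next column.

```julia
# Illustrative sketch: count pivots by elimination with row swaps,
# skipping to the next column when no nonzero pivot is available.
function rank_by_elimination(A; tol = 1e-12)
    U = float(copy(A))
    m, n = size(U)
    row = 1                                # row where the next pivot should go
    for col in 1:n
        p = findfirst(i -> abs(U[i, col]) > tol, row:m)
        p === nothing && continue          # column has no usable pivot: skip it
        p += row - 1
        U[[row, p], :] = U[[p, row], :]    # swap the pivot row into place
        for i in row+1:m                   # eliminate below the pivot
            U[i, :] -= (U[i, col] / U[row, col]) * U[row, :]
        end
        row += 1
        row > m && break
    end
    return row - 1                         # number of pivots found
end

rank_by_elimination([0 -1 0 3; 0 -2 0 8; 0 -3 0 17; 0 -1 0 3])   # 2
```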
Much of the material in the second part of 18.06 (somewhat in exam 1, but especially in exam 2) will be focused on how we understand singular and non-square systems of equations.
It turns out that there are lots of interesting things to say and do about systems of equations that may not have solutions. We don't just give up!