The formula for the sample variance, $$s_n^2 = \frac{1}{n-1}\sum_{i=1}^n (x_i-\bar{x})^2,$$ has that funny $n-1$ in the denominator.
The $n-1$ is referred to as Bessel's correction. The usual explanation involves vague terms such as [degrees of freedom](https://en.wikipedia.org/wiki/Degrees_of_freedom_%28statistics%29), which always sounded flaky to me.
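As a concrete illustration (my example, assuming the Statistics standard library in Julia ≥ 1.0), Bessel's correction is exactly what var applies by default:

```julia
using Statistics

x = randn(10)
n = length(x)
ss = sum((x .- mean(x)).^2)           # sum of squared deviations from the sample mean

var(x) ≈ ss / (n - 1)                 # true: the default divides by n - 1
var(x; corrected = false) ≈ ss / n    # true: with the correction off it divides by n
```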
A quick numerical experiment hints at where the $n-1$ comes from. The function below draws $n$ standard normals and returns the squared length of the sample after its mean has been subtracted:

```julia
using LinearAlgebra, Statistics   # norm and mean live in standard libraries on Julia ≥ 1.0

function f(n)
    x = randn(n)              # n independent standard normals
    norm(x .- mean(x))^2      # squared length of the de-meaned sample
end
```
```julia
n = 11
mean([f(n) for i = 1:1_000_000])
# 10.00378620254928     ≈ n - 1

n = 5
mean([f(n) for i = 1:1_000_000])
# 3.9965121482424095    ≈ n - 1
```

In both runs the average squared deviation comes out very nearly $n-1$, not $n$.
randn(n) is an n-vector of independent standard normals.
If Q is any orthogonal matrix, Q*randn(n) is also an n-vector of independent standard normals. There is no mathematical way to distinguish randn(n) from Q*randn(n), because the probability density is proportional to $e^{-\|x\|^2/2}$, i.e., it depends only on the length of x, not its direction.
Also, the expected value of the square of a standard normal, randn()^2, is 1.
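That rotation invariance is easy to poke at numerically. The sketch below (my own check, not part of the argument) builds an orthogonal Q from a QR factorization and estimates the covariance of Q*randn(n), which comes out close to the identity, just as it would for randn(n) itself:

```julia
using LinearAlgebra, Statistics

n = 5
Q = Matrix(qr(randn(n, n)).Q)   # a random orthogonal matrix: Q'Q = I

Y = Q * randn(n, 100_000)       # 100_000 rotated standard-normal vectors, one per column
cov(Y; dims = 2)                # ≈ the identity matrix, the same estimate
                                # you would get from randn(n, 100_000)
```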
Consider the projection matrix $P = I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^T$, where $\mathbf{1}$ is the vector of $n$ ones: the identity with $1/n$ subtracted from every entry. The matrix-vector product $Px$ computes x .- mean(x), i.e., it subtracts the sample mean from every entry of x.
```julia
# example
n = 4
P = Matrix{Int}(I, n, n) .- 1//n    # eye(Int, n) - 1//n in pre-1.0 Julia

# 4×4 Matrix{Rational{Int64}}:
#   3//4  -1//4  -1//4  -1//4
#  -1//4   3//4  -1//4  -1//4
#  -1//4  -1//4   3//4  -1//4
#  -1//4  -1//4  -1//4   3//4
```
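To see the claim about $Px$ in action, here is a one-line check (my own sketch, same standard-library assumptions as above):

```julia
using LinearAlgebra, Statistics

n = 4
P = Matrix{Int}(I, n, n) .- 1//n
x = randn(n)

P * x ≈ x .- mean(x)    # true: P subtracts the sample mean from every entry
```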
If we write the eigendecomposition $P=Q\Lambda Q'$, then $\Lambda$ has one diagonal entry equal to $0$ (say the first) and the rest equal to $1$: $P$ annihilates the all-ones direction, and it leaves everything orthogonal to that direction untouched.
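A numerical check of those eigenvalues, under the same standard-library assumptions (built with floats so the dense symmetric eigensolver applies):

```julia
using LinearAlgebra

n = 4
P = Matrix(I, n, n) .- 1/n    # the same centering matrix, now with Float64 entries
F = eigen(Symmetric(P))

F.values                      # ≈ [0.0, 1.0, 1.0, 1.0] up to rounding
F.vectors[:, 1]               # ≈ ±ones(n)/√n, the all-ones direction
```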