Generative Classification
- [1] You have a machine that measures a property $x$, the "orangeness" of liquids. You wish to discriminate between $C_1 = \text{`Fanta'}$ and $C_2 = \text{`Orangina'}$. It is known that
$$\begin{align*}
p(x|C_1) &= \begin{cases} 10 & 1.0 \leq x \leq 1.1\\
0 & \text{otherwise}
\end{cases}\\
p(x|C_2) &= \begin{cases} 200(x - 1) & 1.0 \leq x \leq 1.1\\
0 & \text{otherwise}
\end{cases}
\end{align*}$$
The prior probabilities $p(C_1) = 0.6$ and $p(C_2) = 0.4$ are also known from experience.
(a) (##) A "Bayes Classifier" is given by
$$ \text{Decision} = \begin{cases} C_1 & \text{if } p(C_1|x)>p(C_2|x) \\
C_2 & \text{otherwise}
\end{cases}
$$
Derive the optimal Bayes classifier.
(b) (###) The probability of making the wrong decision, given $x$, is
$$
p(\text{error}|x)= \begin{cases} p(C_1|x) & \text{if we decide $C_2$}\\
p(C_2|x) & \text{if we decide $C_1$}
\end{cases}
$$
Compute the total error probability $p(\text{error})$ for the Bayes classifier in this example.
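A minimal Python sketch (not part of the exercise) for checking the analytical answers to (a) and (b) numerically; the grid resolution and function names are arbitrary choices.

```python
import numpy as np

# Class-conditional densities and priors from the problem statement.
def p_x_given_C1(x):
    return np.where((x >= 1.0) & (x <= 1.1), 10.0, 0.0)

def p_x_given_C2(x):
    return np.where((x >= 1.0) & (x <= 1.1), 200.0 * (x - 1.0), 0.0)

prior_C1, prior_C2 = 0.6, 0.4

# The Bayes classifier decides C1 wherever p(C1)p(x|C1) > p(C2)p(x|C2),
# so the decision boundary is where the two joint densities cross.
x = np.linspace(1.0, 1.1, 100_001)
joint1 = prior_C1 * p_x_given_C1(x)
joint2 = prior_C2 * p_x_given_C2(x)
boundary = x[joint1 > joint2][-1]  # last grid point where C1 still wins
print(f"decision boundary: x* ~ {boundary:.4f}")

# p(error) = integral over x of min(p(C1)p(x|C1), p(C2)p(x|C2)),
# approximated here by a Riemann sum; compare with your analytical result.
dx = x[1] - x[0]
p_error = np.sum(np.minimum(joint1, joint2)) * dx
print(f"p(error) ~ {p_error:.4f}")
```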
- [2] (#) (see Bishop exercise 4.8): Using Bishop's (4.57) and (4.58), derive the result (4.65) for the posterior class probability in the two-class generative model with Gaussian densities, and verify the results (4.66) and (4.67) for the parameters $w$ and $w_0$.
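The algebra above can be sanity-checked numerically. The sketch below assumes Bishop's results $w = \Sigma^{-1}(\mu_1 - \mu_2)$ and $w_0 = -\frac{1}{2}\mu_1^T \Sigma^{-1}\mu_1 + \frac{1}{2}\mu_2^T \Sigma^{-1}\mu_2 + \ln\frac{p(C_1)}{p(C_2)}$, and compares the sigmoid posterior of (4.65) against a direct Bayes-rule computation; all parameter values are arbitrary test choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Arbitrary test parameters: two Gaussian classes sharing one covariance.
mu1, mu2 = np.array([1.0, 2.0]), np.array([-1.0, 0.5])
A = rng.normal(size=(2, 2))
Sigma = A @ A.T + 2.0 * np.eye(2)  # a random SPD covariance matrix
prior1, prior2 = 0.7, 0.3

# Bishop (4.66) and (4.67): parameters of the linear discriminant.
Sigma_inv = np.linalg.inv(Sigma)
w = Sigma_inv @ (mu1 - mu2)
w0 = (-0.5 * mu1 @ Sigma_inv @ mu1
      + 0.5 * mu2 @ Sigma_inv @ mu2
      + np.log(prior1 / prior2))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Direct Bayes-rule posterior at an arbitrary test point, for comparison.
x = rng.normal(size=2)
num = prior1 * multivariate_normal.pdf(x, mu1, Sigma)
den = num + prior2 * multivariate_normal.pdf(x, mu2, Sigma)

print(sigmoid(w @ x + w0))  # posterior via (4.65)-(4.67)
print(num / den)            # posterior via Bayes rule; the two should match
```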
- [3] (###) (see Bishop exercise 4.9).
- [4] (##) (see Bishop exercise 4.10).