**[1]**(##) We are given an IID data set $D = \{x_1,x_2,\ldots,x_N\}$, where $x_n \in \mathbb{R}^M$. Let's assume that the data were drawn from a multivariate Gaussian (MVG),

$$p(x_n \mid \mu, \Sigma) = \mathcal{N}(x_n \mid \mu, \Sigma) = \frac{1}{\sqrt{(2\pi)^M |\Sigma|}} \exp\left( -\frac{1}{2} (x_n - \mu)^T \Sigma^{-1} (x_n - \mu) \right)\,.$$

(a) Derive the log-likelihood of the parameters for these data.

(b) Derive the maximum likelihood estimates for the mean $\mu$ and variance $\Sigma$ by setting the derivative of the log-likelihood to zero.
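As a numerical sanity check on part (b) (not a substitute for the derivation), the closed-form ML estimates $\hat{\mu} = \frac{1}{N}\sum_n x_n$ and $\hat{\Sigma} = \frac{1}{N}\sum_n (x_n - \hat{\mu})(x_n - \hat{\mu})^T$ can be computed on synthetic data and compared against the generating parameters. The specific values of `mu_true` and `Sigma_true` below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 10_000, 3

# Arbitrary ground-truth parameters for the synthetic data set
mu_true = np.array([1.0, -2.0, 0.5])
Sigma_true = np.array([[2.0, 0.3, 0.0],
                       [0.3, 1.0, 0.2],
                       [0.0, 0.2, 0.5]])
X = rng.multivariate_normal(mu_true, Sigma_true, size=N)  # D = {x_1, ..., x_N}

# ML estimates: sample mean and (biased, 1/N-normalized) sample covariance
mu_ml = X.mean(axis=0)
Sigma_ml = (X - mu_ml).T @ (X - mu_ml) / N

print(mu_ml)     # close to mu_true
print(Sigma_ml)  # close to Sigma_true
```

Note that the ML covariance estimate divides by $N$, not $N-1$; the bias vanishes as $N$ grows.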

**[2]**(#) Briefly explain why the Gaussian distribution is often preferred as a prior distribution over other distributions with the same support.

**[3]**(###) We make $N$ IID observations $D=\{x_1, \dots, x_N\}$ and assume the following model:

$$x_n = A + \epsilon_n\,, \qquad \epsilon_n \sim \mathcal{N}(0, \sigma^2)\,.$$

We assume that $\sigma$ has a known value and are interested in deriving an estimator for $A$.

(a) Derive the Bayesian (posterior) estimate $p(A|D)$.

(b) (##) Derive the Maximum Likelihood estimate for $A$.

(c) Derive the MAP estimate for $A$.

(d) Now assume that we do not know the variance of the noise term. Describe the procedure for Bayesian estimation of both $A$ and $\sigma^2$ (there is no need to work these out to closed-form estimates).
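A small numerical sketch of parts (a)–(c), assuming the model $x_n = A + \epsilon_n$ with $\epsilon_n \sim \mathcal{N}(0,\sigma^2)$ and a conjugate Gaussian prior $A \sim \mathcal{N}(m_0, s_0^2)$. The prior parameters `m0, s0` and the ground truth `A_true` are hypothetical choices for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: x_n = A + eps_n, eps_n ~ N(0, sigma^2), sigma known
A_true, sigma, N = 3.0, 0.5, 50
x = A_true + sigma * rng.standard_normal(N)

# (b) Maximum likelihood estimate: the sample mean
A_ml = x.mean()

# (a) Posterior under a conjugate Gaussian prior A ~ N(m0, s0^2):
#     p(A|D) = N(A | m_N, s_N^2), combining prior and likelihood precisions
m0, s0 = 0.0, 10.0
s_N2 = 1.0 / (1.0 / s0**2 + N / sigma**2)       # posterior variance
m_N = s_N2 * (m0 / s0**2 + x.sum() / sigma**2)  # posterior mean

# (c) For a Gaussian posterior, the MAP estimate equals the posterior mean
A_map = m_N

print(A_ml, m_N, s_N2)
```

With a broad prior ($s_0 \gg \sigma/\sqrt{N}$), the MAP estimate is nearly identical to the ML estimate, as the numbers printed above illustrate.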

**[4]**(##) Prove that a linear transformation $z=Ax+b$ of a Gaussian variable $x \sim \mathcal{N}(x|\mu,\Sigma)$ is Gaussian distributed as

$$p(z) = \mathcal{N}(z \mid A\mu + b,\; A \Sigma A^T)\,.$$
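The claimed mean $A\mu + b$ and covariance $A\Sigma A^T$ can be checked empirically by Monte Carlo sampling (an illustration, not a proof; all parameter values below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample x ~ N(mu, Sigma), transform z = A x + b, and compare the empirical
# mean and covariance of z with A mu + b and A Sigma A^T.
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.4],
                  [0.4, 2.0]])
A = np.array([[2.0, 0.0],
              [1.0, -1.0]])
b = np.array([0.5, 0.5])

x = rng.multivariate_normal(mu, Sigma, size=200_000)
z = x @ A.T + b

print(z.mean(axis=0))           # ≈ A @ mu + b
print(np.cov(z, rowvar=False))  # ≈ A @ Sigma @ A.T
```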

**[5]**(#) Given independent variables $x \sim \mathcal{N}(\mu_x,\sigma_x^2)$ and $y \sim \mathcal{N}(\mu_y,\sigma_y^2)$, what is the PDF for $z = A\cdot(x - y) + b$?
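Since $z$ is a linear combination of independent Gaussians, its mean and variance follow from the rules above; a quick empirical check with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)

# For independent x ~ N(mu_x, s_x^2) and y ~ N(mu_y, s_y^2),
# z = A*(x - y) + b should be N(A*(mu_x - mu_y) + b, A^2*(s_x^2 + s_y^2)).
mu_x, s_x = 2.0, 1.0
mu_y, s_y = -1.0, 0.5
A, b = 3.0, 1.0

x = rng.normal(mu_x, s_x, size=500_000)
y = rng.normal(mu_y, s_y, size=500_000)
z = A * (x - y) + b

print(z.mean())  # ≈ A*(mu_x - mu_y) + b = 10.0
print(z.var())   # ≈ A**2 * (s_x**2 + s_y**2) = 11.25
```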

**[6]**(###) Compute
