# Exercise 4.3

## Classification

In the following tasks, we will repeatedly use some basic functions of the Keras library (e.g., the softmax function or the cross-entropy). To familiarize ourselves with them, we implement the most important ones ourselves in this task.

Suppose we want to classify some data (4 samples) into 3 distinct classes: 0, 1, and 2. We have set up a network with a pre-activation output z in the last layer. Applying softmax will give the final model output.

input X --> some network --> z --> y_model = softmax(z)

We quantify the agreement between truth (y) and model using categorical cross-entropy.

$$J = - \sum_i y_i \, \log\big(y_\mathrm{model}(x_i)\big)$$

In the following, you will implement softmax and the categorical cross-entropy yourself and evaluate them for the given values of z.

In [1]:
import numpy as np

##### Data: 4 samples with the following class labels (input features X irrelevant here)
In [2]:
y_cl = np.array([0, 0, 2, 1])

##### Output of the last network layer before applying softmax
In [3]:
z = np.array([
[4,    5,    1],
[-1,  -2,   -3],
[0.1, 0.2, 0.3],
[-1,  17,    1]
]).astype(np.float32)


Write a function that turns any class labels y_cl into one-hot encodings y.

0 --> (1, 0, 0)

1 --> (0, 1, 0)

2 --> (0, 0, 1)

Make sure that np.shape(y) = (4, 3) for np.shape(y_cl) = (4,).

In [ ]:
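# Possible solution (a sketch, not the only way): index the identity
# matrix with the class labels; the name to_one_hot is our own choice.
def to_one_hot(y_cl, num_classes=3):
    """Convert integer class labels to one-hot vectors of length num_classes."""
    return np.eye(num_classes, dtype=np.float32)[y_cl]

y = to_one_hot(y_cl)
print(y)  # shape (4, 3)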



Write a function that returns the softmax of the input z along the last axis.

In [ ]:
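# Possible solution (a sketch): subtract the per-row maximum before
# exponentiating for numerical stability; this does not change the result.
def softmax(z):
    """Softmax along the last axis."""
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

y_model = softmax(z)
print(y_model)  # each row sums to 1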



Compute the categorical cross-entropy between data and model.

In [ ]:
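# Possible solution (a sketch), using y and y_model from the cells above.
# The formula above sums over samples; note that Keras reports the mean
# over the batch instead, so values can differ by a factor of the batch size.
def crossentropy(y, y_model):
    """Categorical cross-entropy, summed over classes and samples."""
    return -np.sum(y * np.log(y_model))

print(crossentropy(y, y_model))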



Determine which classes are predicted by the model (maximum prediction).

In [ ]:
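# The predicted class is the index of the largest model output per sample.
# Since softmax is monotonic, np.argmax(y_model) equals np.argmax(z).
y_pred = np.argmax(y_model, axis=-1)
print(y_pred)  # compare with the true labels y_cl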