## Week 5: Introduction to Neural Networks

### Perceptron learning rule

This week, we will start working with neural networks. For each of the exercises below, you may use the method of your choice, but you should display the final decision boundary of your classifier.

#### Exercise 1.

As a first exercise, load the binary dataset below and code a few steps of the perceptron learning rule.

In [ ]:
import scipy.io as sio

# Load the two classes from the .mat file provided with the exercise
# (the filename below is assumed; adapt it to the file you were given)
data = sio.loadmat('perceptron_data.mat')
data1 = data['perceptron_data_class1']
data2 = data['perceptron_data_class2']
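A few steps of the perceptron learning rule can be sketched as follows. Since the exercise's `.mat` file is not reproduced here, the sketch uses synthetic stand-in data for `data1`/`data2`; all names and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for data1/data2 loaded from the .mat file
data1 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(20, 2))    # class +1
data2 = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(20, 2))  # class -1

X = np.vstack([data1, data2])
y = np.hstack([np.ones(len(data1)), -np.ones(len(data2))])

# Append a constant 1 so the bias is part of the weight vector:
# the decision boundary is then w @ [x1, x2, 1] = 0
Xb = np.hstack([X, np.ones((len(X), 1))])

w = np.zeros(3)
eta = 1.0  # learning rate

for epoch in range(20):
    mistakes = 0
    for xi, yi in zip(Xb, y):
        if yi * (w @ xi) <= 0:     # point misclassified (or on the boundary)
            w += eta * yi * xi     # perceptron update
            mistakes += 1
    if mistakes == 0:              # every point classified correctly: stop
        break

print("weights:", w)
```

The boundary can then be plotted as the line `w[0]*x1 + w[1]*x2 + w[2] = 0` on top of a scatter plot of the two classes.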



#### Exercise 2.

2a. Load the data below. Using the `neural_network` module from scikit-learn and its `MLPClassifier` model, learn a classifier for this dataset using:

• One hidden layer with a linear activation function and
  • one neuron
  • two neurons
• One hidden layer with a nonlinear activation function (e.g. ReLU or a binary step) and
  • one neuron
  • two neurons

How many neurons and hidden layers do you need to learn the distribution of the data? Do you have an idea why?

Try increasing the number of neurons and hidden layers. Then try different values of the learning rate.

In [ ]:
import scipy.io as sio

# Load the two classes from the .mat file provided with the exercise
# (the filename below is assumed; adapt it to the file you were given)
data = sio.loadmat('neural_net_data.mat')
data1 = data['neural_net_class1']
data2 = data['neural_net_class2']

from sklearn.neural_network import MLPClassifier
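As a starting point, the contrast between linear and ReLU activations can be sketched as below. The real dataset is not reproduced here, so the sketch uses a synthetic XOR-like pattern that no linear boundary can separate; the extra ten-neuron model is added purely for comparison.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in: an XOR-like pattern, not linearly separable
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

# Linear ('identity') activation: the whole network collapses to a linear
# model, no matter how many hidden neurons it has
lin = MLPClassifier(hidden_layer_sizes=(2,), activation='identity',
                    solver='lbfgs', max_iter=2000, random_state=0).fit(X, y)

# ReLU activation with two neurons: nonlinear, but very limited capacity
relu2 = MLPClassifier(hidden_layer_sizes=(2,), activation='relu',
                      solver='lbfgs', max_iter=2000, random_state=0).fit(X, y)

# ReLU activation with ten neurons: enough linear pieces for the four quadrants
relu10 = MLPClassifier(hidden_layer_sizes=(10,), activation='relu',
                       solver='lbfgs', max_iter=2000, random_state=0).fit(X, y)

print(lin.score(X, y), relu2.score(X, y), relu10.score(X, y))
```

Note that scikit-learn's `MLPClassifier` only offers `'identity'`, `'logistic'`, `'tanh'` and `'relu'` activations, so a binary step has to be implemented by hand if you want to try it.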



2b. Keep the dataset from above. Try changing the initialization of the training algorithm. Plot the resulting classifier for a couple of different initializations. What do you see?

Do it for a small network first, then repeat those experiments for larger architectures, i.e. increase the number of neurons and the number of layers. What do you see when you change the initialization?

In [ ]:
# put your code here
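One way to vary the initialization with `MLPClassifier` is through `random_state`, which seeds the random draw of the initial weights. A sketch on synthetic stand-in data (the real dataset is not reproduced here):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)   # synthetic XOR-like stand-in

# Same small architecture, trained from five different random initializations
scores = []
for seed in range(5):
    clf = MLPClassifier(hidden_layer_sizes=(2,), activation='relu',
                        solver='lbfgs', max_iter=2000, random_state=seed)
    clf.fit(X, y)
    scores.append(clf.score(X, y))

# With so few neurons, training can get stuck in poor local optima, so the
# final boundary (and accuracy) may change a lot from one init to the next
print(scores)
```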


#### Exercise 3.

3a. Load the data below. Try to build the best neural network you can for this dataset. Split the data into a training set and a test set and evaluate the models you build. What is the best validation error you can get?

In [ ]:
import scipy.io as sio

# Load the two classes from the .mat file provided with the exercise
# (the filename below is assumed; adapt it to the file you were given)
data = sio.loadmat('neural_net_ex2_data.mat')
data1 = data['neural_net_ex2_class1']
data2 = data['neural_net_ex2_class2']

from sklearn.neural_network import MLPClassifier
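The split-and-evaluate workflow can be sketched as follows, on synthetic stand-in data and with an arbitrary architecture; `hidden_layer_sizes` should be tuned against your own data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in: a circular decision boundary
X = rng.normal(size=(300, 2))
y = (np.linalg.norm(X, axis=1) > 1.2).astype(int)

# Hold out 30% of the points to estimate the generalization error
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(20, 20), activation='relu',
                    solver='lbfgs', max_iter=5000, random_state=0)
clf.fit(X_train, y_train)

test_acc = clf.score(X_test, y_test)
print("test accuracy:", test_acc)   # validation error = 1 - accuracy
```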


3b. With the same dataset, add additional features to your model, e.g. $\sin(x), \sin(y)$ or other monomials. Can you improve your classifier?

In [ ]:
# put your code here
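A sketch of the feature-augmentation idea, again on synthetic stand-in data. The stand-in target is deliberately built from sines so that the augmented features help; on the real dataset the useful features may be different.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in with an oscillatory boundary
X = rng.uniform(-6, 6, size=(300, 2))
y = (np.sin(X[:, 0]) + np.sin(X[:, 1]) > 0).astype(int)

# Augmented representation: raw coordinates plus sin(x), sin(y)
# and the monomial x*y as extra columns
X_aug = np.hstack([X, np.sin(X), (X[:, 0] * X[:, 1])[:, None]])

X_tr, X_te, Xa_tr, Xa_te, y_tr, y_te = train_test_split(
    X, X_aug, y, test_size=0.3, random_state=0)

base = MLPClassifier(hidden_layer_sizes=(10,), solver='lbfgs',
                     max_iter=5000, random_state=0).fit(X_tr, y_tr)
aug = MLPClassifier(hidden_layer_sizes=(10,), solver='lbfgs',
                    max_iter=5000, random_state=0).fit(Xa_tr, y_tr)

# In the augmented space this class is linearly separable
# (sin(x) + sin(y) > 0), so the network barely has to work
print("raw features:", base.score(X_te, y_te))
print("augmented features:", aug.score(Xa_te, y_te))
```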