In the Name of God

Practical Exercise 1: Classification with Fully-Connected Networks on the IRIS Dataset

Problem Statement

In the first session of the workshop we covered classification with fully-connected networks.
It is strongly recommended that you review the following notebooks before attempting this exercise:

04_a Gentle Introduction to Keras - Simple neural network(mlp).ipynb

05_Dropout.ipynb

In that session we worked with image data. In this exercise, however, to show that these techniques also apply to non-image problems, we will use the structured iris dataset. It contains 4 features (the length and width of the sepal and the petal), and the goal is to predict the flower type from among 3 different classes based on these features.

Loading the Required Libraries

The libraries required for this exercise are loaded below.
You can load additional libraries if needed:
In [1]:
import numpy as np
from tensorflow import keras
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.optimizers import Adam
In this exercise we will use the iris dataset.
A description of the dataset is available on its website:

https://archive.ics.uci.edu/ml/datasets/iris

The features and classes of this dataset are as follows:

Attribute Information:

  1. sepal length in cm
  2. sepal width in cm
  3. petal length in cm
  4. petal width in cm

class:

Iris Setosa
Iris Versicolour
Iris Virginica

This dataset is also bundled with the sklearn library.
In the code snippet below, the features are loaded into x and the corresponding labels into y:
In [2]:
iris_data = load_iris() # load the iris dataset
x = iris_data.data
y = iris_data.target.reshape(-1, 1) # Convert data to a single column
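For orientation, the iris dataset contains 150 samples, so x is a 150x4 feature matrix and y a 150x1 column of integer labels; a quick check (a minimal sketch):

print(x.shape)                  # (150, 4): 150 samples, 4 features each
print(y.shape)                  # (150, 1): one integer label (0, 1, or 2) per sample
print(iris_data.target_names)   # the 3 class names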

Question 1:

Our labels are currently numeric.
The values range from 0 to 2, i.e., there are 3 distinct classes.
Convert these labels to one-hot format and store the result back in y; for example, the label 2 becomes the vector [0, 0, 1].
Hint: use the keras.utils.to_categorical function.
In [3]:
y = keras.utils.to_categorical(y)  # one-hot encode: shape (150, 1) -> (150, 3)
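After the conversion, y should have shape (150, 3), with exactly one 1 per row; a quick check:

print(y.shape)   # (150, 3)
print(y[0])      # [1. 0. 0.]: the first sample is a setosa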
Below, the data is split into train and test sets:
In [4]:
# Split the data for training and testing
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.20)
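With test_size=0.20, 30 of the 150 samples are held out for testing, which matches the 30/30 shown in the evaluation output at the end. train_test_split shuffles the data by default; if you also want every class equally represented in both splits, one optional variation (an assumption on our part, not used in the original run) is to stratify on the labels:

# Hypothetical variation: class-balanced split via stratification on the one-hot labels
train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.20, stratify=y)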

Question 2:

Build a network with two hidden layers, each containing 10 neurons with the relu activation function. Also add a Dropout layer with a rate of 0.5 immediately before the final softmax layer (one possible solution is sketched in the cell below).
In [5]:
# Build the model
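# One possible solution sketch; the layer sizes match the summary printed in the next cell.
model = Sequential()
model.add(Dense(10, activation='relu', input_shape=(4,)))  # hidden layer 1: 4 features in, 10 neurons (50 params)
model.add(Dense(10, activation='relu'))                    # hidden layer 2: 10 -> 10 (110 params)
model.add(Dropout(0.5))                                    # randomly drops half the activations during training
model.add(Dense(3, activation='softmax'))                  # output layer: probabilities over the 3 classes (33 params)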
In [6]:
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 10)                50        
_________________________________________________________________
dense_2 (Dense)              (None, 10)                110       
_________________________________________________________________
dropout_1 (Dropout)          (None, 10)                0         
_________________________________________________________________
dense_3 (Dense)              (None, 3)                 33        
=================================================================
Total params: 193
Trainable params: 193
Non-trainable params: 0
_________________________________________________________________
Below, the model is compiled:
In [7]:
# Adam optimizer with a learning rate of 0.001
optimizer = Adam(learning_rate=0.001)
model.compile(optimizer, loss='categorical_crossentropy', metrics=['accuracy'])
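The categorical_crossentropy loss matches the one-hot labels from Question 1 and the softmax output layer: for a true one-hot vector y and predicted probabilities p it computes -sum_i y_i * log(p_i), which is simply the negative log of the probability the model assigns to the correct class. (Had the labels been left as integers 0-2, sparse_categorical_crossentropy would be the equivalent choice.)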

Question 3:

Train the model with batch_size=5 for 200 epochs (a sample call is sketched in the cell below).
Hint: use the model.fit function.
In [8]:
# Train the model
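# One possible solution sketch; verbose=2 reproduces the one-line-per-epoch log below.
model.fit(train_x, train_y, batch_size=5, epochs=200, verbose=2)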
Epoch 1/200
 - 3s - loss: 2.0045 - acc: 0.4333
Epoch 2/200
 - 0s - loss: 1.5714 - acc: 0.4167
Epoch 3/200
 - 0s - loss: 1.5014 - acc: 0.3250

[... epochs 4-198 omitted: training loss falls from ~1.34 to ~0.37 and training accuracy rises from ~0.30 to ~0.89 ...]

Epoch 199/200
 - 0s - loss: 0.3634 - acc: 0.8833
Epoch 200/200
 - 0s - loss: 0.3496 - acc: 0.8833
Out[8]:
<keras.callbacks.History at 0x1f6c588c940>
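The History object returned by model.fit records the per-epoch metrics. If you assign the return value (e.g. history = model.fit(...)), you can plot the training curves; a minimal sketch, assuming matplotlib is available:

import matplotlib.pyplot as plt

plt.plot(history.history['loss'], label='training loss')
plt.plot(history.history['acc'], label='training accuracy')  # the key is 'accuracy' in newer Keras versions
plt.xlabel('epoch')
plt.legend()
plt.show()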

Question 4:

Evaluate the model on the test data.
Hint: use the model.evaluate function.
In [9]:
# Test on unseen data
results = model.evaluate(test_x, test_y)

print('Final test set loss: {:4f}'.format(results[0]))
print('Final test set accuracy: {:4f}'.format(results[1]))
30/30 [==============================] - 0s 2ms/step
Final test set loss: 0.155448
Final test set accuracy: 1.000000
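To classify a new, unseen measurement you can call model.predict and take the argmax of the returned class probabilities; a minimal sketch with a made-up sample:

# Hypothetical measurements: sepal length/width, petal length/width (cm)
sample = np.array([[5.1, 3.5, 1.4, 0.2]])
probs = model.predict(sample)                     # shape (1, 3): softmax probabilities
print(iris_data.target_names[np.argmax(probs)])   # prints the predicted class name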
Introductory Deep Learning Course
Alireza Akhavanpour
Thursdays, Bahman 18 and 25, 1397 (February 7 and 14, 2019)
Class.Vision - AkhavanPour.ir - GitHub