In the Name of God

An Introduction to Convolutional Neural Networks (CNN)

First, we define the network architecture.
Note the conv and pool layers.
Before the first Dense (Fully Connected) layer, Flatten is always called so that the neurons are unrolled into a single vector.
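As a quick illustration (a minimal sketch with a dummy array, not part of the original notebook), Flatten simply unrolls the spatial feature map into one long vector while keeping the batch dimension:

import numpy as np

# Hypothetical feature map: batch of 1, spatial size 3x3, 64 channels
feature_map = np.zeros((1, 3, 3, 64))

# Flatten keeps the batch axis and unrolls the rest: 3 * 3 * 64 = 576
flat = feature_map.reshape(feature_map.shape[0], -1)
print(flat.shape)  # (1, 576)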

Running on Colab

If you are running on Google Colab, uncomment the following lines.
In [ ]:
#!wget https://raw.githubusercontent.com/Alireza-Akhavan/SRU-deeplearning-workshop/master/dataset.py
#!mkdir dataset
#!wget https://github.com/Alireza-Akhavan/SRU-deeplearning-workshop/raw/master/dataset/Data_hoda_full.mat -P dataset
In [3]:
import keras
from keras import layers
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
import numpy as np
from dataset import load_hoda
import matplotlib.pyplot as plt

model = Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu',
                        input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(layers.Dense(10, activation='softmax'))
Using TensorFlow backend.
Let's take a look at the input and output tensor of each layer.
The input image was 28x28x1 (grayscale, so a single channel).
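The spatial sizes in the summary below follow from the usual output-size formulas; here is a minimal sketch (hypothetical helper functions, assuming the 'valid' padding and default strides used above):

def conv_out(n, k=3, stride=1):
    # 'valid' convolution: floor((n - k) / stride) + 1
    return (n - k) // stride + 1

def pool_out(n, k=2, stride=2):
    # max pooling with window k and stride k
    return (n - k) // stride + 1

n = 28
n = conv_out(n)  # 26 -> conv2d_1
n = pool_out(n)  # 13 -> max_pooling2d_1
n = conv_out(n)  # 11 -> conv2d_2
n = pool_out(n)  # 5  -> max_pooling2d_2
n = conv_out(n)  # 3  -> conv2d_3
print(n * n * 64)  # 576 neurons after Flatten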
In [4]:
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 26, 26, 32)        320       
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 13, 13, 32)        0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 11, 11, 64)        18496     
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 5, 5, 64)          0         
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 3, 3, 64)          36928     
_________________________________________________________________
flatten_1 (Flatten)          (None, 576)               0         
_________________________________________________________________
dense_1 (Dense)              (None, 64)                36928     
_________________________________________________________________
dropout_1 (Dropout)          (None, 64)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 10)                650       
=================================================================
Total params: 93,322
Trainable params: 93,322
Non-trainable params: 0
_________________________________________________________________
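As a sanity check, the parameter counts above can be reproduced by hand (a minimal sketch; each layer has one bias per filter or unit):

# Conv2D: (kernel_h * kernel_w * in_channels + 1) * filters
print((3 * 3 * 1 + 1) * 32)    # 320    -> conv2d_1
print((3 * 3 * 32 + 1) * 64)   # 18,496 -> conv2d_2
print((3 * 3 * 64 + 1) * 64)   # 36,928 -> conv2d_3
# Dense: (inputs + 1) * units
print((576 + 1) * 64)          # 36,928 -> dense_1
print((64 + 1) * 10)           # 650    -> dense_2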

A convolutional network and its training, end to end, on the Hoda dataset

The images of the Hoda dataset are flattened into vectors by load_hoda, the function we wrote earlier.
In this call the image width and height are set to 28, so the function returns 784-element vectors.
** Note that before they enter the convolutional network, we reshape the images back to their original 28x28 shape.**
Also, since the images are grayscale, we set the number of image channels to 1.
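A minimal sketch of this reshape (using a dummy array in place of the real load_hoda output):

import numpy as np

# Dummy stand-in for the flattened load_hoda output: 5 samples of 784 pixels
x_flat = np.zeros((5, 784), dtype='float32')

# -1 lets numpy infer the sample count; the trailing 1 is the grayscale channel
x_img = x_flat.reshape(-1, 28, 28, 1)
print(x_img.shape)  # (5, 28, 28, 1)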
In [7]:
# 1. Import libraries and modules
import keras
from keras import layers
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
import numpy as np
from dataset import load_hoda
import matplotlib.pyplot as plt

np.random.seed(123)  # for reproducibility

# Load pre-shuffled HODA data into train and test sets
x_train_original, y_train_original, x_test_original, y_test_original = load_hoda(
                                                                        training_sample_size=3500,
                                                                        test_sample_size=400,size=28)

# 3. Preprocess input data
# 3.1: convert the input data to numpy arrays
x_train = np.array(x_train_original)
x_test = np.array(x_test_original)
# 3.2: normalize pixel values to the range [0, 1]
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255

# Reshape to original image shape (n x 784)  ==> (n x 28 x 28 x 1)
x_train = x_train.reshape(-1,28,28,1)
x_test = x_test.reshape(-1,28,28,1)


# 4. Preprocess class labels
y_train = keras.utils.to_categorical(y_train_original, num_classes=10)
y_test = keras.utils.to_categorical(y_test_original, num_classes=10)


# Split the 400 test samples into validation (first 200) and test (last 200) sets
x_val = x_test[:200]
x_test = x_test[200:]
y_val = y_test[:200]
y_test = y_test[200:]

# 5. Define model architecture
model = Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu',
                        input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(layers.Dense(10, activation='softmax'))


# 6. Compile model
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])


# 7. Fit model on training data
history = model.fit(x_train, y_train,
          epochs=200, batch_size=256, validation_data = (x_val, y_val))
Train on 3500 samples, validate on 200 samples
Epoch 1/200
3500/3500 [==============================] - 1s 358us/step - loss: 2.0753 - acc: 0.3260 - val_loss: 1.3986 - val_acc: 0.7250
Epoch 2/200
3500/3500 [==============================] - 0s 123us/step - loss: 1.2037 - acc: 0.5846 - val_loss: 0.5824 - val_acc: 0.8150
Epoch 3/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.8154 - acc: 0.7123 - val_loss: 0.3902 - val_acc: 0.8550
Epoch 4/200
3500/3500 [==============================] - 1s 231us/step - loss: 0.6353 - acc: 0.7743 - val_loss: 0.2694 - val_acc: 0.9050
Epoch 5/200
3500/3500 [==============================] - 1s 175us/step - loss: 0.4933 - acc: 0.8286 - val_loss: 0.2339 - val_acc: 0.9100
Epoch 6/200
3500/3500 [==============================] - 1s 163us/step - loss: 0.4242 - acc: 0.8551 - val_loss: 0.2204 - val_acc: 0.9350
Epoch 7/200
3500/3500 [==============================] - 1s 167us/step - loss: 0.3744 - acc: 0.8740 - val_loss: 0.1964 - val_acc: 0.9350
Epoch 8/200
3500/3500 [==============================] - 0s 136us/step - loss: 0.3062 - acc: 0.8971 - val_loss: 0.1785 - val_acc: 0.9350
Epoch 9/200
3500/3500 [==============================] - 1s 159us/step - loss: 0.2947 - acc: 0.9046 - val_loss: 0.1695 - val_acc: 0.9350
Epoch 10/200
3500/3500 [==============================] - 0s 136us/step - loss: 0.2569 - acc: 0.9194 - val_loss: 0.1657 - val_acc: 0.9500
Epoch 11/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.2376 - acc: 0.9231 - val_loss: 0.1531 - val_acc: 0.9500
Epoch 12/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.2101 - acc: 0.9289 - val_loss: 0.1403 - val_acc: 0.9600
Epoch 13/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.2005 - acc: 0.9337 - val_loss: 0.1290 - val_acc: 0.9600
Epoch 14/200
3500/3500 [==============================] - 0s 118us/step - loss: 0.2244 - acc: 0.9271 - val_loss: 0.1185 - val_acc: 0.9650
Epoch 15/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.1925 - acc: 0.9403 - val_loss: 0.1215 - val_acc: 0.9650
Epoch 16/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.1830 - acc: 0.9409 - val_loss: 0.1321 - val_acc: 0.9650
Epoch 17/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.1564 - acc: 0.9486 - val_loss: 0.1129 - val_acc: 0.9650
Epoch 18/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.1567 - acc: 0.9514 - val_loss: 0.1066 - val_acc: 0.9700
Epoch 19/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.1437 - acc: 0.9543 - val_loss: 0.1178 - val_acc: 0.9600
Epoch 20/200
3500/3500 [==============================] - 1s 193us/step - loss: 0.1409 - acc: 0.9537 - val_loss: 0.1042 - val_acc: 0.9650
Epoch 21/200
3500/3500 [==============================] - 1s 155us/step - loss: 0.1368 - acc: 0.9566 - val_loss: 0.1029 - val_acc: 0.9600
Epoch 22/200
3500/3500 [==============================] - 1s 143us/step - loss: 0.1155 - acc: 0.9663 - val_loss: 0.1019 - val_acc: 0.9700
Epoch 23/200
3500/3500 [==============================] - 0s 133us/step - loss: 0.1226 - acc: 0.9586 - val_loss: 0.1234 - val_acc: 0.9750
Epoch 24/200
3500/3500 [==============================] - 0s 132us/step - loss: 0.1152 - acc: 0.9574 - val_loss: 0.1076 - val_acc: 0.9650
Epoch 25/200
3500/3500 [==============================] - 0s 134us/step - loss: 0.1018 - acc: 0.9666 - val_loss: 0.1147 - val_acc: 0.9750
Epoch 26/200
3500/3500 [==============================] - 0s 128us/step - loss: 0.1038 - acc: 0.9686 - val_loss: 0.1165 - val_acc: 0.9700
Epoch 27/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0957 - acc: 0.9669 - val_loss: 0.1179 - val_acc: 0.9700
Epoch 28/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0893 - acc: 0.9694 - val_loss: 0.1187 - val_acc: 0.9750
Epoch 29/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0857 - acc: 0.9691 - val_loss: 0.1097 - val_acc: 0.9800
Epoch 30/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0795 - acc: 0.9754 - val_loss: 0.1070 - val_acc: 0.9750
Epoch 31/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0759 - acc: 0.9760 - val_loss: 0.0983 - val_acc: 0.9750
Epoch 32/200
3500/3500 [==============================] - 0s 117us/step - loss: 0.0830 - acc: 0.9709 - val_loss: 0.0969 - val_acc: 0.9850
Epoch 33/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0787 - acc: 0.9714 - val_loss: 0.1184 - val_acc: 0.9700
Epoch 34/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0804 - acc: 0.9720 - val_loss: 0.1285 - val_acc: 0.9750
Epoch 35/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0815 - acc: 0.9706 - val_loss: 0.1288 - val_acc: 0.9800
Epoch 36/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0698 - acc: 0.9763 - val_loss: 0.1085 - val_acc: 0.9750
Epoch 37/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0665 - acc: 0.9771 - val_loss: 0.0895 - val_acc: 0.9800
Epoch 38/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0693 - acc: 0.9771 - val_loss: 0.1152 - val_acc: 0.9750
Epoch 39/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0699 - acc: 0.9757 - val_loss: 0.1296 - val_acc: 0.9750
Epoch 40/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0603 - acc: 0.9806 - val_loss: 0.1439 - val_acc: 0.9800
Epoch 41/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0616 - acc: 0.9786 - val_loss: 0.1343 - val_acc: 0.9750
Epoch 42/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0576 - acc: 0.9800 - val_loss: 0.1329 - val_acc: 0.9800
Epoch 43/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0504 - acc: 0.9811 - val_loss: 0.0930 - val_acc: 0.9850
Epoch 44/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0455 - acc: 0.9829 - val_loss: 0.1175 - val_acc: 0.9800
Epoch 45/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0473 - acc: 0.9846 - val_loss: 0.1063 - val_acc: 0.9850
Epoch 46/200
3500/3500 [==============================] - 0s 118us/step - loss: 0.0500 - acc: 0.9820 - val_loss: 0.0983 - val_acc: 0.9800
Epoch 47/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0461 - acc: 0.9829 - val_loss: 0.1131 - val_acc: 0.9800
Epoch 48/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0524 - acc: 0.9777 - val_loss: 0.1235 - val_acc: 0.9750
Epoch 49/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0508 - acc: 0.9826 - val_loss: 0.1160 - val_acc: 0.9750
Epoch 50/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0504 - acc: 0.9843 - val_loss: 0.1122 - val_acc: 0.9700
Epoch 51/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0445 - acc: 0.9854 - val_loss: 0.1422 - val_acc: 0.9750
Epoch 52/200
3500/3500 [==============================] - 0s 118us/step - loss: 0.0460 - acc: 0.9803 - val_loss: 0.0983 - val_acc: 0.9800
Epoch 53/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0452 - acc: 0.9820 - val_loss: 0.1183 - val_acc: 0.9750
Epoch 54/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0431 - acc: 0.9851 - val_loss: 0.1004 - val_acc: 0.9900
Epoch 55/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0344 - acc: 0.9891 - val_loss: 0.1376 - val_acc: 0.9700
Epoch 56/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0426 - acc: 0.9843 - val_loss: 0.1360 - val_acc: 0.9750
Epoch 57/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0413 - acc: 0.9849 - val_loss: 0.1211 - val_acc: 0.9700
Epoch 58/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0397 - acc: 0.9851 - val_loss: 0.1016 - val_acc: 0.9800
Epoch 59/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0314 - acc: 0.9900 - val_loss: 0.1242 - val_acc: 0.9800
Epoch 60/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0355 - acc: 0.9851 - val_loss: 0.1333 - val_acc: 0.9800
Epoch 61/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0342 - acc: 0.9883 - val_loss: 0.1125 - val_acc: 0.9850
Epoch 62/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0267 - acc: 0.9891 - val_loss: 0.1654 - val_acc: 0.9800
Epoch 63/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0265 - acc: 0.9906 - val_loss: 0.0978 - val_acc: 0.9800
Epoch 64/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0246 - acc: 0.9900 - val_loss: 0.1734 - val_acc: 0.9700
Epoch 65/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0371 - acc: 0.9871 - val_loss: 0.1400 - val_acc: 0.9800
Epoch 66/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0355 - acc: 0.9891 - val_loss: 0.1100 - val_acc: 0.9850
Epoch 67/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0338 - acc: 0.9866 - val_loss: 0.1232 - val_acc: 0.9700
Epoch 68/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0300 - acc: 0.9894 - val_loss: 0.1211 - val_acc: 0.9800
Epoch 69/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0284 - acc: 0.9903 - val_loss: 0.1034 - val_acc: 0.9800
Epoch 70/200
3500/3500 [==============================] - 0s 127us/step - loss: 0.0400 - acc: 0.9869 - val_loss: 0.1226 - val_acc: 0.9800
Epoch 71/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0348 - acc: 0.9869 - val_loss: 0.1385 - val_acc: 0.9750
Epoch 72/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0333 - acc: 0.9869 - val_loss: 0.1099 - val_acc: 0.9850
Epoch 73/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0324 - acc: 0.9886 - val_loss: 0.1174 - val_acc: 0.9850
Epoch 74/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0288 - acc: 0.9900 - val_loss: 0.1725 - val_acc: 0.9800
Epoch 75/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0197 - acc: 0.9934 - val_loss: 0.1549 - val_acc: 0.9800
Epoch 76/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0275 - acc: 0.9894 - val_loss: 0.1217 - val_acc: 0.9750
Epoch 77/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0287 - acc: 0.9877 - val_loss: 0.1105 - val_acc: 0.9800
Epoch 78/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0264 - acc: 0.9906 - val_loss: 0.1320 - val_acc: 0.9800
Epoch 79/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0279 - acc: 0.9894 - val_loss: 0.0662 - val_acc: 0.9900
Epoch 80/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0241 - acc: 0.9920 - val_loss: 0.1212 - val_acc: 0.9800
Epoch 81/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0273 - acc: 0.9914 - val_loss: 0.1132 - val_acc: 0.9800
Epoch 82/200
3500/3500 [==============================] - 0s 118us/step - loss: 0.0251 - acc: 0.9911 - val_loss: 0.1423 - val_acc: 0.9800
Epoch 83/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0244 - acc: 0.9917 - val_loss: 0.1216 - val_acc: 0.9750
Epoch 84/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0258 - acc: 0.9914 - val_loss: 0.1610 - val_acc: 0.9800
Epoch 85/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0238 - acc: 0.9923 - val_loss: 0.0945 - val_acc: 0.9850
Epoch 86/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0244 - acc: 0.9903 - val_loss: 0.0870 - val_acc: 0.9850
Epoch 87/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0171 - acc: 0.9937 - val_loss: 0.1556 - val_acc: 0.9800
Epoch 88/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0218 - acc: 0.9929 - val_loss: 0.1073 - val_acc: 0.9800
Epoch 89/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0177 - acc: 0.9940 - val_loss: 0.1171 - val_acc: 0.9850
Epoch 90/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0216 - acc: 0.9931 - val_loss: 0.1612 - val_acc: 0.9800
Epoch 91/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0241 - acc: 0.9914 - val_loss: 0.1747 - val_acc: 0.9700
Epoch 92/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0161 - acc: 0.9949 - val_loss: 0.1530 - val_acc: 0.9800
Epoch 93/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0249 - acc: 0.9914 - val_loss: 0.1463 - val_acc: 0.9800
Epoch 94/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0216 - acc: 0.9914 - val_loss: 0.1304 - val_acc: 0.9750
Epoch 95/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0242 - acc: 0.9931 - val_loss: 0.1448 - val_acc: 0.9800
Epoch 96/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0240 - acc: 0.9920 - val_loss: 0.0894 - val_acc: 0.9900
Epoch 97/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0232 - acc: 0.9920 - val_loss: 0.1269 - val_acc: 0.9800
Epoch 98/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0210 - acc: 0.9917 - val_loss: 0.1371 - val_acc: 0.9800
Epoch 99/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0205 - acc: 0.9931 - val_loss: 0.1492 - val_acc: 0.9800
Epoch 100/200
3500/3500 [==============================] - 0s 118us/step - loss: 0.0150 - acc: 0.9943 - val_loss: 0.1428 - val_acc: 0.9750
Epoch 101/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0186 - acc: 0.9934 - val_loss: 0.1229 - val_acc: 0.9850
Epoch 102/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0171 - acc: 0.9946 - val_loss: 0.1371 - val_acc: 0.9800
Epoch 103/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0206 - acc: 0.9917 - val_loss: 0.1073 - val_acc: 0.9850
Epoch 104/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0204 - acc: 0.9926 - val_loss: 0.1407 - val_acc: 0.9800
Epoch 105/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0168 - acc: 0.9949 - val_loss: 0.1542 - val_acc: 0.9800
Epoch 106/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0166 - acc: 0.9943 - val_loss: 0.1249 - val_acc: 0.9850
Epoch 107/200
3500/3500 [==============================] - 0s 127us/step - loss: 0.0181 - acc: 0.9929 - val_loss: 0.1230 - val_acc: 0.9850
Epoch 108/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0167 - acc: 0.9934 - val_loss: 0.1177 - val_acc: 0.9850
Epoch 109/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0139 - acc: 0.9943 - val_loss: 0.2271 - val_acc: 0.9750
Epoch 110/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0167 - acc: 0.9940 - val_loss: 0.1298 - val_acc: 0.9850
Epoch 111/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0230 - acc: 0.9906 - val_loss: 0.1069 - val_acc: 0.9850
Epoch 112/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0212 - acc: 0.9931 - val_loss: 0.1270 - val_acc: 0.9750
Epoch 113/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0174 - acc: 0.9929 - val_loss: 0.1433 - val_acc: 0.9800
Epoch 114/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0176 - acc: 0.9937 - val_loss: 0.1635 - val_acc: 0.9750
Epoch 115/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0187 - acc: 0.9914 - val_loss: 0.1946 - val_acc: 0.9800
Epoch 116/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0170 - acc: 0.9940 - val_loss: 0.1307 - val_acc: 0.9800
Epoch 117/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0191 - acc: 0.9931 - val_loss: 0.1708 - val_acc: 0.9750
Epoch 118/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0202 - acc: 0.9929 - val_loss: 0.1585 - val_acc: 0.9750
Epoch 119/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0233 - acc: 0.9906 - val_loss: 0.2191 - val_acc: 0.9800
Epoch 120/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0249 - acc: 0.9906 - val_loss: 0.1280 - val_acc: 0.9800
Epoch 121/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0176 - acc: 0.9934 - val_loss: 0.1379 - val_acc: 0.9750
Epoch 122/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0176 - acc: 0.9926 - val_loss: 0.1707 - val_acc: 0.9650
Epoch 123/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0135 - acc: 0.9969 - val_loss: 0.1797 - val_acc: 0.9800
Epoch 124/200
3500/3500 [==============================] - 0s 119us/step - loss: 0.0134 - acc: 0.9951 - val_loss: 0.1153 - val_acc: 0.9850
Epoch 125/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0194 - acc: 0.9926 - val_loss: 0.1389 - val_acc: 0.9800
Epoch 126/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0150 - acc: 0.9943 - val_loss: 0.1371 - val_acc: 0.9800
Epoch 127/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0119 - acc: 0.9954 - val_loss: 0.1469 - val_acc: 0.9800
Epoch 128/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0123 - acc: 0.9957 - val_loss: 0.1926 - val_acc: 0.9750
Epoch 129/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0113 - acc: 0.9960 - val_loss: 0.1895 - val_acc: 0.9800
Epoch 130/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0101 - acc: 0.9974 - val_loss: 0.1728 - val_acc: 0.9700
Epoch 131/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0122 - acc: 0.9949 - val_loss: 0.1876 - val_acc: 0.9750
Epoch 132/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0174 - acc: 0.9931 - val_loss: 0.1539 - val_acc: 0.9800
Epoch 133/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0155 - acc: 0.9940 - val_loss: 0.1753 - val_acc: 0.9750
Epoch 134/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0187 - acc: 0.9934 - val_loss: 0.1601 - val_acc: 0.9750
Epoch 135/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0150 - acc: 0.9931 - val_loss: 0.1608 - val_acc: 0.9800
Epoch 136/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0134 - acc: 0.9957 - val_loss: 0.1458 - val_acc: 0.9800
Epoch 137/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0153 - acc: 0.9951 - val_loss: 0.1633 - val_acc: 0.9750
Epoch 138/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0109 - acc: 0.9971 - val_loss: 0.1153 - val_acc: 0.9850
Epoch 139/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0127 - acc: 0.9946 - val_loss: 0.1802 - val_acc: 0.9800
Epoch 140/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0127 - acc: 0.9957 - val_loss: 0.1722 - val_acc: 0.9700
Epoch 141/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0179 - acc: 0.9934 - val_loss: 0.1640 - val_acc: 0.9700
Epoch 142/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0123 - acc: 0.9954 - val_loss: 0.1561 - val_acc: 0.9750
Epoch 143/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0203 - acc: 0.9923 - val_loss: 0.1458 - val_acc: 0.9700
Epoch 144/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0147 - acc: 0.9957 - val_loss: 0.1778 - val_acc: 0.9800
Epoch 145/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0164 - acc: 0.9940 - val_loss: 0.1452 - val_acc: 0.9750
Epoch 146/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0142 - acc: 0.9943 - val_loss: 0.1620 - val_acc: 0.9800
Epoch 147/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0145 - acc: 0.9949 - val_loss: 0.1461 - val_acc: 0.9800
Epoch 148/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0131 - acc: 0.9954 - val_loss: 0.1300 - val_acc: 0.9800
Epoch 149/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0135 - acc: 0.9951 - val_loss: 0.1566 - val_acc: 0.9850
Epoch 150/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0106 - acc: 0.9954 - val_loss: 0.1691 - val_acc: 0.9800
Epoch 151/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0150 - acc: 0.9946 - val_loss: 0.1220 - val_acc: 0.9800
Epoch 152/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0120 - acc: 0.9963 - val_loss: 0.1632 - val_acc: 0.9800
Epoch 153/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0135 - acc: 0.9934 - val_loss: 0.1630 - val_acc: 0.9800
Epoch 154/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0116 - acc: 0.9966 - val_loss: 0.1854 - val_acc: 0.9750
Epoch 155/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0113 - acc: 0.9969 - val_loss: 0.1405 - val_acc: 0.9850
Epoch 156/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0136 - acc: 0.9963 - val_loss: 0.1726 - val_acc: 0.9800
Epoch 157/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0120 - acc: 0.9966 - val_loss: 0.1316 - val_acc: 0.9800
Epoch 158/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0128 - acc: 0.9946 - val_loss: 0.1524 - val_acc: 0.9850
Epoch 159/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0132 - acc: 0.9949 - val_loss: 0.2071 - val_acc: 0.9850
Epoch 160/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0091 - acc: 0.9969 - val_loss: 0.1621 - val_acc: 0.9800
Epoch 161/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0120 - acc: 0.9960 - val_loss: 0.1849 - val_acc: 0.9800
Epoch 162/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0098 - acc: 0.9957 - val_loss: 0.1943 - val_acc: 0.9850
Epoch 163/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0141 - acc: 0.9954 - val_loss: 0.1670 - val_acc: 0.9850
Epoch 164/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0135 - acc: 0.9957 - val_loss: 0.1146 - val_acc: 0.9850
Epoch 165/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0120 - acc: 0.9951 - val_loss: 0.1274 - val_acc: 0.9850
Epoch 166/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0159 - acc: 0.9946 - val_loss: 0.2280 - val_acc: 0.9750
Epoch 167/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0122 - acc: 0.9946 - val_loss: 0.1370 - val_acc: 0.9850
Epoch 168/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0075 - acc: 0.9977 - val_loss: 0.1938 - val_acc: 0.9750
Epoch 169/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0103 - acc: 0.9957 - val_loss: 0.1665 - val_acc: 0.9800
Epoch 170/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0144 - acc: 0.9946 - val_loss: 0.1557 - val_acc: 0.9750
Epoch 171/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0110 - acc: 0.9960 - val_loss: 0.0962 - val_acc: 0.9850
Epoch 172/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0120 - acc: 0.9963 - val_loss: 0.1718 - val_acc: 0.9800
Epoch 173/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0106 - acc: 0.9957 - val_loss: 0.1887 - val_acc: 0.9700
Epoch 174/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0116 - acc: 0.9960 - val_loss: 0.1468 - val_acc: 0.9800
Epoch 175/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0120 - acc: 0.9951 - val_loss: 0.1990 - val_acc: 0.9700
Epoch 176/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0132 - acc: 0.9951 - val_loss: 0.2002 - val_acc: 0.9750
Epoch 177/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0087 - acc: 0.9963 - val_loss: 0.1581 - val_acc: 0.9800
Epoch 178/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0071 - acc: 0.9971 - val_loss: 0.1600 - val_acc: 0.9750
Epoch 179/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0074 - acc: 0.9969 - val_loss: 0.1856 - val_acc: 0.9750
Epoch 180/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0095 - acc: 0.9969 - val_loss: 0.2079 - val_acc: 0.9750
Epoch 181/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0189 - acc: 0.9946 - val_loss: 0.1089 - val_acc: 0.9850
Epoch 182/200
3500/3500 [==============================] - 0s 125us/step - loss: 0.0181 - acc: 0.9943 - val_loss: 0.1407 - val_acc: 0.9750
Epoch 183/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0185 - acc: 0.9951 - val_loss: 0.2014 - val_acc: 0.9700
Epoch 184/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0149 - acc: 0.9949 - val_loss: 0.1645 - val_acc: 0.9800
Epoch 185/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0139 - acc: 0.9946 - val_loss: 0.1947 - val_acc: 0.9650
Epoch 186/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0116 - acc: 0.9966 - val_loss: 0.1629 - val_acc: 0.9750
Epoch 187/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0073 - acc: 0.9971 - val_loss: 0.1777 - val_acc: 0.9700
Epoch 188/200
3500/3500 [==============================] - 0s 126us/step - loss: 0.0101 - acc: 0.9966 - val_loss: 0.1115 - val_acc: 0.9800
Epoch 189/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0114 - acc: 0.9960 - val_loss: 0.1550 - val_acc: 0.9800
Epoch 190/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0076 - acc: 0.9974 - val_loss: 0.1601 - val_acc: 0.9750
Epoch 191/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0066 - acc: 0.9974 - val_loss: 0.1710 - val_acc: 0.9850
Epoch 192/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0083 - acc: 0.9966 - val_loss: 0.2047 - val_acc: 0.9800
Epoch 193/200
3500/3500 [==============================] - 0s 122us/step - loss: 0.0080 - acc: 0.9977 - val_loss: 0.1595 - val_acc: 0.9850
Epoch 194/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0070 - acc: 0.9966 - val_loss: 0.1811 - val_acc: 0.9850
Epoch 195/200
3500/3500 [==============================] - 0s 123us/step - loss: 0.0146 - acc: 0.9929 - val_loss: 0.1487 - val_acc: 0.9850
Epoch 196/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0144 - acc: 0.9949 - val_loss: 0.2478 - val_acc: 0.9750
Epoch 197/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0144 - acc: 0.9951 - val_loss: 0.1708 - val_acc: 0.9850
Epoch 198/200
3500/3500 [==============================] - 0s 120us/step - loss: 0.0098 - acc: 0.9963 - val_loss: 0.1833 - val_acc: 0.9800
Epoch 199/200
3500/3500 [==============================] - 0s 124us/step - loss: 0.0070 - acc: 0.9971 - val_loss: 0.1787 - val_acc: 0.9800
Epoch 200/200
3500/3500 [==============================] - 0s 121us/step - loss: 0.0096 - acc: 0.9969 - val_loss: 0.1530 - val_acc: 0.9800
In [6]:
import matplotlib.pyplot as plt
%matplotlib inline
acc = history.history['acc']  # this Keras version logs 'acc'/'val_acc' (see the training output above)
val_acc = history.history['val_acc']
loss = history.history['loss']
val_loss = history.history['val_loss']

epochs = range(len(acc))

plt.plot(epochs, acc, 'bo', label='Training acc')
plt.plot(epochs, val_acc, 'b', label='Validation acc')
plt.title('Training and validation accuracy')
plt.legend()

plt.figure()

plt.plot(epochs, loss, 'bo', label='Training loss')
plt.plot(epochs, val_loss, 'b', label='Validation loss')
plt.title('Training and validation loss')
plt.legend()

plt.show()
Introductory Deep Learning Course
Alireza Akhavan Pour
Thursday, Bahman 18, 1397 (February 7, 2019)
Class.Vision - AkhavanPour.ir - GitHub