In the Name of God

Regression: House Price Estimation from Images with Convolutional Neural Networks

Keras, Regression, and CNNs

  • Removing the fully-connected softmax classifier layer typically used for classification.
  • Replacing it with a fully-connected layer containing a single node and a linear activation function.
  • Training the model with a continuous-value loss function such as mean squared error, mean absolute error, or mean absolute percentage error (see the sketch after this list).
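As a minimal sketch of these three steps (assumed code, not from this notebook), where `base_model` stands for any hypothetical Keras convolutional feature extractor whose softmax head we discard:

from keras.models import Model
from keras.layers import Dense

# `base_model` is a hypothetical convolutional feature extractor whose
# final layer would normally feed a softmax classifier; we replace that
# head with a single linear node for continuous prediction
def to_regressor(base_model):
    x = Dense(1, activation="linear")(base_model.output)  # one continuous output
    model = Model(inputs=base_model.input, outputs=x)
    # a regression loss instead of categorical cross-entropy
    model.compile(loss="mean_absolute_percentage_error", optimizer="adam")
    return model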

Introducing the House Price Estimation Dataset

This dataset was introduced and published in the 2016 paper "House price estimation from visual and textual features". It contains 535 houses; each house comes with four images (frontal, bedroom, kitchen, and bathroom) plus textual attributes (number of bedrooms, number of bathrooms, area, zip code, and price).

https://github.com/emanhamed/Houses-dataset
https://arxiv.org/pdf/1609.08399.pdf

In [1]:
import cv2
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
import os
import glob
import matplotlib.pyplot as plt
from keras.optimizers import Adam
%matplotlib inline
Using TensorFlow backend.
In [2]:
inputPath = "D:/dataset/Houses-dataset/Houses Dataset/HousesInfo.txt"
datasetPath = "D:/dataset/Houses-dataset/Houses Dataset"

cols = ["bedrooms", "bathrooms", "area", "zipcode", "price"]
df = pd.read_csv(inputPath, sep=" ", header=None, names=cols)


zipcodes, counts = np.unique(df["zipcode"], return_counts=True)

# loop over each of the unique zip codes and their corresponding
# count
for (zipcode, count) in zip(zipcodes, counts):
    # the zip code counts in our housing dataset are *extremely*
    # unbalanced (some zip codes have only 1 or 2 houses), so let's
    # sanitize our data by removing any zip code with fewer than
    # 25 houses
    if count < 25:
        idxs = df[df["zipcode"] == zipcode].index
        df.drop(idxs, inplace=True)
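As a quick sanity check (an added suggestion, not part of the original notebook), re-counting the zip codes after the filter should confirm that only well-represented ones remain:

# hypothetical check: every surviving zip code should have at least 25 houses
zipcodes, counts = np.unique(df["zipcode"], return_counts=True)
print(dict(zip(zipcodes, counts)))
assert counts.min() >= 25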
In [ ]:
# initialize our images array (i.e., the house images themselves)
images = []

# loop over the indexes of the houses (the image files are numbered
# from 1, hence the i + 1 below)
for i in df.index.values:
    # find the four images for the house and sort the file paths,
    # ensuring the four are always in the *same order*
    basePath = os.path.sep.join([datasetPath, "{}_*".format(i + 1)])
    housePaths = sorted(list(glob.glob(basePath)))
    # initialize our list of input images along with the output image
    # after *combining* the four input images
    inputImages = []
    outputImage = np.zeros((64, 64, 3), dtype="uint8")

    # loop over the input house paths
    for housePath in housePaths:
        # load the input image, resize it to be 32x32, and then
        # update the list of input images
        image = cv2.imread(housePath)
        image = cv2.resize(image, (32, 32))
        inputImages.append(image)

    # tile the four input images in the output image such that the first
    # image goes in the top-left corner, the second image in the
    # top-right corner, the third image in the bottom-right corner,
    # and the final image in the bottom-left corner
    outputImage[0:32, 0:32] = inputImages[0]
    outputImage[0:32, 32:64] = inputImages[1]
    outputImage[32:64, 32:64] = inputImages[2]
    outputImage[32:64, 0:32] = inputImages[3]

    # add the tiled image to our set of images the network will be
    # trained on
    images.append(outputImage)
images = np.array(images)
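The loop above assumes glob always finds exactly four readable images per house. A hedged, self-contained variant (a hypothetical helper, not from the original notebook) that fails loudly when that assumption breaks:

import glob
import os
import cv2
import numpy as np

def load_house_montage(datasetPath, house_id, size=32):
    # hypothetical helper: load one house's four images and tile them 2x2
    pattern = os.path.sep.join([datasetPath, "{}_*".format(house_id)])
    paths = sorted(glob.glob(pattern))
    if len(paths) != 4:
        raise ValueError("expected 4 images for house {}, found {}".format(
            house_id, len(paths)))
    montage = np.zeros((2 * size, 2 * size, 3), dtype="uint8")
    # slot order: top-left, top-right, bottom-right, bottom-left
    slots = [(0, 0), (0, size), (size, size), (size, 0)]
    for path, (row, col) in zip(paths, slots):
        image = cv2.imread(path)  # returns None on unreadable files
        if image is None:
            raise IOError("could not read {}".format(path))
        montage[row:row + size, col:col + size] = cv2.resize(image, (size, size))
    return montage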
In [ ]:
images.shape
In [11]:
# OpenCV stores images as BGR; reversing the channel axis gives
# matplotlib the RGB order it expects
plt.imshow(images[37][..., ::-1])
Out[11]:
<matplotlib.image.AxesImage at 0x14b7a9a0390>
In [12]:
# scale pixel intensities from [0, 255] to [0, 1]
images = images / 255.0
In [13]:
# partition the data into training and testing splits using 75% of
# the data for training and the remaining 25% for testing
split = train_test_split(df, images, test_size=0.25, random_state=42)
(trainAttrX, testAttrX, trainImagesX, testImagesX) = split
In [14]:
# find the largest house price in the training set and use it to
# scale our house prices to the range [0, 1] (will lead to better
# training and convergence)
maxPrice = trainAttrX["price"].max()
trainY = trainAttrX["price"] / maxPrice
testY = testAttrX["price"] / maxPrice
In [16]:
from keras.models import Model, Input
from keras.layers.convolutional import Conv2D, MaxPooling2D
from keras.layers import Activation, Dense, Flatten, Dropout
from keras.layers.normalization import BatchNormalization

width, height, depth = 64, 64, 3
filters = (16, 32, 64)
# initialize the input shape and channel dimension, assuming
# TensorFlow/channels-last ordering
inputShape = (height, width, depth)
chanDim = -1

# define the model input
inputs = Input(shape=inputShape)

# loop over the number of filters
for (i, f) in enumerate(filters):
    # if this is the first CONV layer then set the input
    # appropriately
    if i == 0:
        x = inputs

    # CONV => RELU => BN => POOL
    x = Conv2D(f, (3, 3), padding="same")(x)
    x = Activation("relu")(x)
    x = BatchNormalization(axis=chanDim)(x)
    x = MaxPooling2D(pool_size=(2, 2))(x)
    
# flatten the volume, then FC => RELU => BN => DROPOUT
x = Flatten()(x)
x = Dense(16)(x)
x = Activation("relu")(x)
x = BatchNormalization(axis=chanDim)(x)
x = Dropout(0.5)(x)

# apply another small FC layer to compress the representation
# before the final regression node
x = Dense(4)(x)
x = Activation("relu")(x)


# the regression head: a single node with a linear activation
x = Dense(1, activation="linear")(x)

# construct the CNN
model = Model(inputs, x)
In [17]:
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 64, 64, 3)         0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 64, 64, 16)        448       
_________________________________________________________________
activation_1 (Activation)    (None, 64, 64, 16)        0         
_________________________________________________________________
batch_normalization_1 (Batch (None, 64, 64, 16)        64        
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 32, 32, 16)        0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 32, 32, 32)        4640      
_________________________________________________________________
activation_2 (Activation)    (None, 32, 32, 32)        0         
_________________________________________________________________
batch_normalization_2 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
activation_3 (Activation)    (None, 16, 16, 64)        0         
_________________________________________________________________
batch_normalization_3 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_1 (Dense)              (None, 16)                65552     
_________________________________________________________________
activation_4 (Activation)    (None, 16)                0         
_________________________________________________________________
batch_normalization_4 (Batch (None, 16)                64        
_________________________________________________________________
dropout_1 (Dropout)          (None, 16)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 68        
_________________________________________________________________
activation_5 (Activation)    (None, 4)                 0         
_________________________________________________________________
dense_3 (Dense)              (None, 1)                 5         
=================================================================
Total params: 89,721
Trainable params: 89,465
Non-trainable params: 256
_________________________________________________________________
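A useful check on the summary (an added illustration): each layer's parameter count follows the standard formulas, (kernel_h * kernel_w * in_channels + 1) * filters for Conv2D and (inputs + 1) * units for Dense:

print((3 * 3 * 3 + 1) * 16)    # conv2d_1 -> 448
print((3 * 3 * 16 + 1) * 32)   # conv2d_2 -> 4640
print((3 * 3 * 32 + 1) * 64)   # conv2d_3 -> 18496
print((4096 + 1) * 16)         # dense_1  -> 65552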
In [18]:
opt = Adam(lr=1e-3, decay=1e-3 / 200)
model.compile(loss="mean_absolute_percentage_error", optimizer=opt)
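Mean absolute percentage error expresses the loss as a percentage of the true (scaled) price, which is why the raw loss values in the log below start in the hundreds or thousands. An illustrative hand computation of the same quantity (assumed values, mirroring what Keras computes up to a numerical-stability epsilon):

# MAPE = 100 * mean(|y_true - y_pred| / |y_true|)
y_true = np.array([0.20, 0.50])   # scaled prices
y_pred = np.array([0.10, 0.75])
print(100.0 * np.mean(np.abs(y_true - y_pred) / y_true))  # 50.0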
In [25]:
model.fit(trainImagesX, trainY, validation_data=(testImagesX, testY), 
          epochs=200, batch_size=8)
Train on 271 samples, validate on 91 samples
Epoch 1/200
271/271 [==============================] - 2s 8ms/step - loss: 1941.2076 - val_loss: 1598.4589
Epoch 2/200
271/271 [==============================] - 1s 2ms/step - loss: 1349.4392 - val_loss: 1729.1964
Epoch 3/200
271/271 [==============================] - 1s 2ms/step - loss: 1041.9592 - val_loss: 1932.3454
... (epochs 4-198 elided: training MAPE falls from roughly 867 to the mid-30s while validation MAPE settles near 50-60, with occasional spikes such as 537.03 at epoch 179) ...
Epoch 199/200
271/271 [==============================] - 1s 2ms/step - loss: 35.6813 - val_loss: 53.8704
Epoch 200/200
271/271 [==============================] - 1s 2ms/step - loss: 36.0265 - val_loss: 53.8835
Out[25]:
<keras.callbacks.History at 0x1c9a2c1ca90>
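Because the targets were scaled by maxPrice, the network's predictions land in the [0, 1] range. A hedged follow-up (not part of the original notebook) to map them back to dollars and report the final percentage error on the test split:

# predict on the held-out images and undo the price scaling
preds = model.predict(testImagesX).flatten()
predPrices = preds * maxPrice
truePrices = testY.values * maxPrice

# mean absolute percentage error in the original dollar scale
percentErr = 100.0 * np.abs(predPrices - truePrices) / truePrices
print("mean absolute percentage error: {:.2f}%".format(percentErr.mean()))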
Introductory Deep Learning Course
Alireza Akhavanpour
Thursday, Bahman 25, 1397 (February 14, 2019)
Class.Vision - AkhavanPour.ir - GitHub