In the Name of God

Regression on Structured Data

Introducing the house price estimation dataset

This dataset was introduced and published in the 2016 paper "House price estimation from visual and textual features".

https://github.com/emanhamed/Houses-dataset
https://arxiv.org/pdf/1609.08399.pdf
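If the dataset is not already on disk, it can be fetched from the GitHub repository above. A minimal sketch (the clone destination is an assumption; adjust the read path in the cell below to match):

# Hypothetical fetch step: clone the dataset repository with git.
# HousesInfo.txt lives under "Houses Dataset/" inside the clone.
import subprocess
subprocess.run(["git", "clone", "https://github.com/emanhamed/Houses-dataset.git"], check=True)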

In [1]:
# import the necessary packages
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam
from sklearn.preprocessing import LabelBinarizer
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
import pandas as pd
import numpy as np
import locale
Using TensorFlow backend.
In [2]:
cols = ["bedrooms", "bathrooms", "area", "zipcode", "price"]
df = pd.read_csv("D:/dataset/Houses-dataset/Houses Dataset/HousesInfo.txt", sep=" ", header=None, names=cols)
In [3]:
df.head()
Out[3]:
bedrooms bathrooms area zipcode price
0 4 4.0 4053 85255 869500.0
1 4 3.0 3343 36372 865200.0
2 3 4.0 3923 85266 889000.0
3 5 5.0 4022 85262 910000.0
4 3 4.0 4116 85266 971226.0
Unique zip codes and their counts
In [5]:
zipcodes, counts = np.unique(df["zipcode"], return_counts=True)
# uncomment to inspect the count for each zip code:
# dict(zip(zipcodes, counts))
In [6]:
df.shape
Out[6]:
(535, 5)
Removing records whose zip code appears fewer than 25 times
In [7]:
# loop over each of the unique zip codes and their corresponding
# count
for (zipcode, count) in zip(zipcodes, counts):
    # the zip code counts for our housing dataset are *extremely*
    # unbalanced (some zip codes have only 1 or 2 houses), so
    # sanitize the data by dropping every zip code with fewer
    # than 25 houses
    if count < 25:
        idxs = df[df["zipcode"] == zipcode].index
        df.drop(idxs, inplace=True)
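The same filtering can be written more compactly with pandas; a sketch equivalent to the loop above (not the notebook's original code):

# Equivalent one-liner: keep only zip codes occurring at least 25 times.
df = df.groupby("zipcode").filter(lambda g: len(g) >= 25)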
In [8]:
df.shape
Out[8]:
(362, 5)
Splitting the data into training and test sets
In [9]:
(train, test) = train_test_split(df, test_size=0.25, random_state=42)
print(train.shape)
print(test.shape)
(271, 5)
(91, 5)

Data preprocessing:

Normalizing (scaling) the data to the range [0, 1]
In [10]:
# find the largest house price in the training set and use it to
# scale our house prices to the range [0, 1] (this will lead to
# better training and convergence)
maxPrice = train["price"].max()
trainY = train["price"] / maxPrice
testY = test["price"] / maxPrice
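Since the targets are divided by maxPrice, everything the model predicts later is also on this [0, 1] scale. A one-line sketch of undoing it (`preds` stands for a future model.predict output, so it is hypothetical at this point):

# Hypothetical: multiply scaled model outputs back into dollars.
# `preds` stands for the array model.predict will return later.
preds_dollars = preds.flatten() * maxPrice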
In [12]:
# initialize the column names of the continuous data
continuous = ["bedrooms", "bathrooms", "area"]

# perform min-max scaling on each continuous feature column,
# bringing it into the range [0, 1]
cs = MinMaxScaler()
trainContinuous = cs.fit_transform(train[continuous])
testContinuous = cs.transform(test[continuous])
C:\Users\alire\Miniconda3\envs\tensorflow\lib\site-packages\sklearn\preprocessing\data.py:323: DataConversionWarning: Data with input dtype int64, float64 were all converted to float64 by MinMaxScaler.
  return self.partial_fit(X, y)
Converting the categorical values to one-hot vectors
In [14]:
# one-hot encode the zip code categorical data (by definition of
# one-hot encoding, all output features are now in the range [0, 1])
zipBinarizer = LabelBinarizer().fit(df["zipcode"])
trainCategorical = zipBinarizer.transform(train["zipcode"])
testCategorical = zipBinarizer.transform(test["zipcode"])
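Note that the binarizer is fit on the full df so the train and test matrices share the same column order; each of the 7 surviving zip codes becomes one binary column. A quick sanity check, assuming the cells above have run:

# One binary column per remaining zip code.
print(zipBinarizer.classes_)   # the 7 zip codes kept after filtering
print(trainCategorical.shape)  # (271, 7)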
Combining the categorical and continuous features
In [16]:
# construct our training and testing data points by concatenating
# the categorical features with the continuous features
trainX = np.hstack([trainCategorical, trainContinuous])
testX = np.hstack([testCategorical, testContinuous])

print(trainX.shape)
print(testX.shape)
(271, 10)
(91, 10)

Defining the model architecture

In [19]:
dim = trainX.shape[1]
# define our MLP network
model = Sequential()
model.add(Dense(8, input_dim=dim, activation="relu"))
model.add(Dense(4, activation="relu"))
model.add(Dense(1, activation="linear"))
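The network is deliberately small: the 10 input features feed a fully connected 8 → 4 → 1 stack, and the linear output node produces the scaled price. A quick way to confirm the shapes and parameter counts (sketch):

# Dense(8): 10*8 + 8 = 88 params; Dense(4): 8*4 + 4 = 36; Dense(1): 4*1 + 1 = 5.
model.summary()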

Compiling the model

In [20]:
opt = Adam(lr=1e-3, decay=1e-3 / 200)
model.compile(loss="mean_absolute_percentage_error", optimizer=opt)
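Mean absolute percentage error measures each error relative to the true value, so a loss of 30 means predictions are off by roughly 30% on average. A minimal NumPy sketch of the same formula (Keras additionally clips the denominator with a small epsilon):

# MAPE: mean(|y_true - y_pred| / |y_true|) * 100
def mape(y_true, y_pred):
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0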

Training the model on the training data

In [21]:
model.fit(trainX, trainY, validation_data=(testX, testY),
    epochs=200, batch_size=8)
Train on 271 samples, validate on 91 samples
Epoch 1/200
271/271 [==============================] - 3s 11ms/step - loss: 274.3629 - val_loss: 109.6800
Epoch 2/200
271/271 [==============================] - 0s 852us/step - loss: 81.8187 - val_loss: 79.1573
Epoch 3/200
271/271 [==============================] - 0s 856us/step - loss: 57.5993 - val_loss: 63.2764
...
Epoch 198/200
271/271 [==============================] - 0s 752us/step - loss: 27.6831 - val_loss: 33.7525
Epoch 199/200
271/271 [==============================] - 0s 690us/step - loss: 28.2086 - val_loss: 35.0329
Epoch 200/200
271/271 [==============================] - 0s 786us/step - loss: 27.7026 - val_loss: 34.3507
Out[21]:
<keras.callbacks.History at 0x214f418deb8>
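The History object above holds the per-epoch loss curves. To plot them, capture the return value of model.fit (e.g. H = model.fit(...)); a sketch assuming matplotlib is installed:

# Plot training vs. validation MAPE over the 200 epochs.
# Assumes H = model.fit(...) was captured in the cell above.
import matplotlib.pyplot as plt
plt.plot(H.history["loss"], label="train loss (MAPE)")
plt.plot(H.history["val_loss"], label="val loss (MAPE)")
plt.xlabel("epoch")
plt.ylabel("mean absolute percentage error")
plt.legend()
plt.show()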
In [30]:
# make predictions on the testing data
preds = model.predict(testX)
 
# compute the difference between the *predicted* house prices and the
# *actual* house prices, then compute the percentage difference and
# the absolute percentage difference
diff = preds.flatten() - testY
percentDiff = (diff / testY) * 100
absPercentDiff = np.abs(percentDiff)
 
# compute the mean and standard deviation of the absolute percentage
# difference
mean = np.mean(absPercentDiff)
std = np.std(absPercentDiff)
 
# finally, show some statistics on our model
locale.setlocale(locale.LC_ALL, "en_US.UTF-8")
print("avg. house price: {}, std house price: {}".format(
    locale.currency(df["price"].mean(), grouping=True),
    locale.currency(df["price"].std(), grouping=True)))
print("mean: {:.2f}%, std: {:.2f}%".format(mean, std))
avg. house price: $533,388.27, std house price: $493,403.08
mean: 34.35%, std: 34.27%
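To see individual predictions in dollars, the scaled outputs can be multiplied back by maxPrice; a sketch using the variables above:

# Print a few predicted vs. actual prices in dollars.
for pred, actual in list(zip(preds.flatten(), testY))[:5]:
    print("predicted: {}, actual: {}".format(
        locale.currency(pred * maxPrice, grouping=True),
        locale.currency(actual * maxPrice, grouping=True)))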
Introductory Deep Learning Course
Alireza Akhavan Pour
Thursday, Bahman 25, 1397 (February 14, 2019)
Class.Vision - AkhavanPour.ir - GitHub