Deep Learning Hello World in Keras Part 3

Assignment 1-c: Deep Learning Hello World! (3-layer MLP + Dropout)

Objective: Improve upon the 3-layer MLP from Part 2 by adding dropout for MNIST classification
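
Dropout is a regularization technique: during training, each unit's activation is zeroed with probability DROPOUT, which keeps units from co-adapting and reduces overfitting; at inference time all units are kept. Below is a minimal, purely illustrative NumPy sketch of inverted dropout (the function name and toy input are made up, not part of the Keras API):

import numpy as np

def dropout_forward(x, rate=0.3, training=True):
    # Inverted dropout: zero a random `rate` fraction of activations and
    # rescale the survivors by 1/(1 - rate) so the expected activation is
    # unchanged and no extra scaling is needed at inference time.
    if not training:
        return x  # inference: identity
    keep_prob = 1.0 - rate
    mask = (np.random.rand(*x.shape) < keep_prob) / keep_prob
    return x * mask

print(dropout_forward(np.ones(10)))  # roughly 3 of the 10 entries become 0

Keras's Dropout layer handles all of this internally; we only pass in the rate.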

Step 1: Take care of the imports, which include numpy along with the Keras datasets, models, layers, optimizers, and utils modules.
If these imports run without errors, your set-up is correct and complete.

from __future__ import print_function
import numpy as np
from keras.datasets import mnist
from keras.models import Sequential
from keras.models import model_from_json
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD
from keras.utils import np_utils

from matplotlib import pyplot as plt
%matplotlib inline

Step 2: Set up some constants to be used in the training/testing of the model
Note: the number of epochs (NB_EPOCH) is increased to 250

NB_EPOCH = 250
BATCH_SIZE = 128
VERBOSE = 1
NB_CLASSES = 10   # number of outputs = number of digits, i.e. 0,1,2,3,4,5,6,7,8,9
OPTIMIZER = SGD() # Stochastic Gradient Descent optimizer
N_HIDDEN = 128
VALIDATION_SPLIT = 0.2 # how much of the TRAIN dataset is reserved for VALIDATION

DROPOUT = 0.3

np.random.seed(1983)  # for reproducibility

Step 3: Load the MNIST dataset, which is shuffled and split between train and test sets:

  • X_train is 60000 rows of 28x28 values
  • X_test is 10000 rows of 28x28 values

(X_train, y_train), (X_test, y_test) = mnist.load_data()
print("First 100 train images:")
for k in range(100):
    plt.subplot(10, 10, k+1)      
    plt.gca().axes.get_yaxis().set_visible(False)
    plt.gca().axes.get_xaxis().set_visible(False)
    plt.imshow(X_train[k])
First 100 train images:

[Figure: 10x10 grid of the first 100 MNIST training images]

Step 4: Preprocess the input data by reshaping it, converting it to float32, and normalizing it to the range [0, 1].

# reshape: flatten each 28x28 image into a 784-dimensional vector
X_train = X_train.reshape(60000, 784)
X_test = X_test.reshape(10000, 784)
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')

# normalize pixel values from [0, 255] to [0, 1]
X_train /= 255
X_test /= 255

print(X_train.shape, 'train samples')
print(X_test.shape, 'test samples')
(60000, 784) train samples
(10000, 784) test samples

Step 5: Convert the class vectors to binary class matrices, i.e. One-Hot Encoding (OHE)

Y_train = np_utils.to_categorical(y_train, NB_CLASSES)
Y_test = np_utils.to_categorical(y_test, NB_CLASSES)
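
As a quick sanity check of what OHE produces, a scalar label becomes a 10-vector with a single 1 at the label's index (for the first MNIST training sample the label happens to be 5):

print(y_train[0])  # 5
print(Y_train[0])  # [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]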

Step 6: Create the model with 3 Dense layers: Input:784 ==> Hidden:128 w/ dropout ==> Hidden:128 w/ dropout ==> Output:10 (with softmax activation)

model = Sequential()
model.add(Dense(N_HIDDEN, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(DROPOUT))
model.add(Dense(N_HIDDEN))
model.add(Activation('relu'))
model.add(Dropout(DROPOUT))
model.add(Dense(NB_CLASSES))
model.add(Activation('softmax'))
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_2 (Dense)              (None, 128)               100480    
_________________________________________________________________
activation_2 (Activation)    (None, 128)               0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 128)               0         
_________________________________________________________________
dense_3 (Dense)              (None, 128)               16512     
_________________________________________________________________
activation_3 (Activation)    (None, 128)               0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 128)               0         
_________________________________________________________________
dense_4 (Dense)              (None, 10)                1290      
_________________________________________________________________
activation_4 (Activation)    (None, 10)                0         
=================================================================
Total params: 118,282
Trainable params: 118,282
Non-trainable params: 0
_________________________________________________________________
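
The parameter counts in the summary are easy to verify by hand: a Dense layer has (inputs x outputs) weights plus one bias per output, while Activation and Dropout layers contribute no parameters:

# Dense parameters = inputs * outputs + biases
print(784 * 128 + 128)         # 100480 (input -> hidden 1)
print(128 * 128 + 128)         # 16512  (hidden 1 -> hidden 2)
print(128 * 10 + 10)           # 1290   (hidden 2 -> output)
print(100480 + 16512 + 1290)   # 118282 total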

Step 7: Compile the model with categorical_crossentropy loss function, SGD optimizer, and accuracy metric

model.compile(loss='categorical_crossentropy',
              optimizer=OPTIMIZER,
              metrics=['accuracy'])
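
For a one-hot target, categorical cross-entropy reduces to the negative log of the probability the softmax assigns to the true class. Below is a minimal NumPy sketch of the loss, purely for intuition (the sample vectors are made up; Keras computes this internally):

import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    # mean over the batch of -sum(y_true * log(y_pred));
    # eps guards against log(0)
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))

y_true = np.array([[0.0, 0.0, 1.0]])  # one-hot: true class is 2
y_pred = np.array([[0.1, 0.2, 0.7]])  # softmax output
print(categorical_crossentropy(y_true, y_pred))  # -log(0.7) ~= 0.357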

Step 8: Perform the training with a batch size of 128, 250 epochs, and 20% of the training data reserved for validation

history = model.fit(X_train, Y_train,
                    batch_size=BATCH_SIZE, epochs=NB_EPOCH,
                    verbose=VERBOSE, validation_split=VALIDATION_SPLIT)
Train on 48000 samples, validate on 12000 samples
Epoch 1/250
48000/48000 [==============================] - 6s - loss: 1.6749 - acc: 0.4718 - val_loss: 0.8680 - val_acc: 0.8195
Epoch 2/250
48000/48000 [==============================] - 3s - loss: 0.9041 - acc: 0.7247 - val_loss: 0.5284 - val_acc: 0.8703
Epoch 3/250
48000/48000 [==============================] - 3s - loss: 0.6895 - acc: 0.7928 - val_loss: 0.4267 - val_acc: 0.8882
Epoch 4/250
48000/48000 [==============================] - 3s - loss: 0.5923 - acc: 0.8208 - val_loss: 0.3753 - val_acc: 0.8961
Epoch 5/250
48000/48000 [==============================] - 3s - loss: 0.5308 - acc: 0.8414 - val_loss: 0.3412 - val_acc: 0.9035
Epoch 6/250
48000/48000 [==============================] - 2s - loss: 0.4902 - acc: 0.8550 - val_loss: 0.3201 - val_acc: 0.9089
Epoch 7/250
48000/48000 [==============================] - 2s - loss: 0.4649 - acc: 0.8622 - val_loss: 0.3026 - val_acc: 0.9119
Epoch 8/250
48000/48000 [==============================] - 2s - loss: 0.4366 - acc: 0.8709 - val_loss: 0.2888 - val_acc: 0.9167
Epoch 9/250
48000/48000 [==============================] - 2s - loss: 0.4156 - acc: 0.8783 - val_loss: 0.2758 - val_acc: 0.9202
Epoch 10/250
48000/48000 [==============================] - 2s - loss: 0.4003 - acc: 0.8814 - val_loss: 0.2654 - val_acc: 0.9227
Epoch 11/250
48000/48000 [==============================] - 2s - loss: 0.3826 - acc: 0.8871 - val_loss: 0.2560 - val_acc: 0.9253
Epoch 12/250
48000/48000 [==============================] - 2s - loss: 0.3693 - acc: 0.8915 - val_loss: 0.2468 - val_acc: 0.9271
Epoch 13/250
48000/48000 [==============================] - 2s - loss: 0.3560 - acc: 0.8951 - val_loss: 0.2385 - val_acc: 0.9303
Epoch 14/250
48000/48000 [==============================] - 2s - loss: 0.3476 - acc: 0.8983 - val_loss: 0.2321 - val_acc: 0.9338
Epoch 15/250
48000/48000 [==============================] - 2s - loss: 0.3325 - acc: 0.9030 - val_loss: 0.2272 - val_acc: 0.9344
Epoch 16/250
48000/48000 [==============================] - 2s - loss: 0.3266 - acc: 0.9054 - val_loss: 0.2201 - val_acc: 0.9364
Epoch 17/250
48000/48000 [==============================] - 2s - loss: 0.3180 - acc: 0.9072 - val_loss: 0.2137 - val_acc: 0.9383
Epoch 18/250
48000/48000 [==============================] - 2s - loss: 0.3099 - acc: 0.9082 - val_loss: 0.2086 - val_acc: 0.9401
Epoch 19/250
48000/48000 [==============================] - 3s - loss: 0.3027 - acc: 0.9102 - val_loss: 0.2043 - val_acc: 0.9410
Epoch 20/250
48000/48000 [==============================] - 3s - loss: 0.2963 - acc: 0.9124 - val_loss: 0.1994 - val_acc: 0.9417
Epoch 21/250
48000/48000 [==============================] - 2s - loss: 0.2892 - acc: 0.9163 - val_loss: 0.1959 - val_acc: 0.9432
Epoch 22/250
48000/48000 [==============================] - 3s - loss: 0.2819 - acc: 0.9177 - val_loss: 0.1911 - val_acc: 0.9449
Epoch 23/250
48000/48000 [==============================] - 3s - loss: 0.2774 - acc: 0.9198 - val_loss: 0.1874 - val_acc: 0.9457
Epoch 24/250
48000/48000 [==============================] - 3s - loss: 0.2720 - acc: 0.9200 - val_loss: 0.1832 - val_acc: 0.9480
Epoch 25/250
48000/48000 [==============================] - 3s - loss: 0.2668 - acc: 0.9217 - val_loss: 0.1803 - val_acc: 0.9485
Epoch 26/250
48000/48000 [==============================] - 3s - loss: 0.2611 - acc: 0.9242 - val_loss: 0.1788 - val_acc: 0.9486
Epoch 27/250
48000/48000 [==============================] - 3s - loss: 0.2575 - acc: 0.9251 - val_loss: 0.1745 - val_acc: 0.9500
Epoch 28/250
48000/48000 [==============================] - 3s - loss: 0.2568 - acc: 0.9256 - val_loss: 0.1724 - val_acc: 0.9506
Epoch 29/250
48000/48000 [==============================] - 3s - loss: 0.2484 - acc: 0.9276 - val_loss: 0.1690 - val_acc: 0.9512
Epoch 30/250
48000/48000 [==============================] - 3s - loss: 0.2435 - acc: 0.9288 - val_loss: 0.1663 - val_acc: 0.9520
Epoch 31/250
48000/48000 [==============================] - 3s - loss: 0.2410 - acc: 0.9296 - val_loss: 0.1633 - val_acc: 0.9533
Epoch 32/250
48000/48000 [==============================] - 3s - loss: 0.2379 - acc: 0.9305 - val_loss: 0.1621 - val_acc: 0.9532
Epoch 33/250
48000/48000 [==============================] - 3s - loss: 0.2324 - acc: 0.9320 - val_loss: 0.1596 - val_acc: 0.9541
Epoch 34/250
48000/48000 [==============================] - 3s - loss: 0.2294 - acc: 0.9321 - val_loss: 0.1566 - val_acc: 0.9549
Epoch 35/250
48000/48000 [==============================] - 3s - loss: 0.2271 - acc: 0.9337 - val_loss: 0.1548 - val_acc: 0.9558
Epoch 36/250
48000/48000 [==============================] - 3s - loss: 0.2236 - acc: 0.9340 - val_loss: 0.1531 - val_acc: 0.9558
Epoch 37/250
48000/48000 [==============================] - 3s - loss: 0.2169 - acc: 0.9361 - val_loss: 0.1507 - val_acc: 0.9567
Epoch 38/250
48000/48000 [==============================] - 3s - loss: 0.2156 - acc: 0.9357 - val_loss: 0.1486 - val_acc: 0.9568
Epoch 39/250
48000/48000 [==============================] - 3s - loss: 0.2143 - acc: 0.9372 - val_loss: 0.1466 - val_acc: 0.9584
Epoch 40/250
48000/48000 [==============================] - 3s - loss: 0.2098 - acc: 0.9381 - val_loss: 0.1466 - val_acc: 0.9573
Epoch 41/250
48000/48000 [==============================] - 3s - loss: 0.2057 - acc: 0.9391 - val_loss: 0.1441 - val_acc: 0.9585
Epoch 42/250
48000/48000 [==============================] - 3s - loss: 0.2055 - acc: 0.9395 - val_loss: 0.1424 - val_acc: 0.9593
Epoch 43/250
48000/48000 [==============================] - 3s - loss: 0.2044 - acc: 0.9395 - val_loss: 0.1406 - val_acc: 0.9593
Epoch 44/250
48000/48000 [==============================] - 3s - loss: 0.1988 - acc: 0.9424 - val_loss: 0.1395 - val_acc: 0.9603
Epoch 45/250
48000/48000 [==============================] - 3s - loss: 0.1979 - acc: 0.9421 - val_loss: 0.1381 - val_acc: 0.9601
Epoch 46/250
48000/48000 [==============================] - 3s - loss: 0.1939 - acc: 0.9416 - val_loss: 0.1358 - val_acc: 0.9609
Epoch 47/250
48000/48000 [==============================] - 3s - loss: 0.1955 - acc: 0.9436 - val_loss: 0.1349 - val_acc: 0.9613
Epoch 48/250
48000/48000 [==============================] - 2s - loss: 0.1907 - acc: 0.9437 - val_loss: 0.1338 - val_acc: 0.9612
Epoch 49/250
48000/48000 [==============================] - 3s - loss: 0.1919 - acc: 0.9425 - val_loss: 0.1324 - val_acc: 0.9619
Epoch 50/250
48000/48000 [==============================] - 2s - loss: 0.1848 - acc: 0.9450 - val_loss: 0.1314 - val_acc: 0.9617
Epoch 51/250
48000/48000 [==============================] - 2s - loss: 0.1839 - acc: 0.9455 - val_loss: 0.1303 - val_acc: 0.9623
Epoch 52/250
48000/48000 [==============================] - 2s - loss: 0.1817 - acc: 0.9460 - val_loss: 0.1289 - val_acc: 0.9630
Epoch 53/250
48000/48000 [==============================] - 2s - loss: 0.1841 - acc: 0.9446 - val_loss: 0.1281 - val_acc: 0.9632
Epoch 54/250
48000/48000 [==============================] - 2s - loss: 0.1810 - acc: 0.9475 - val_loss: 0.1264 - val_acc: 0.9638
Epoch 55/250
48000/48000 [==============================] - 2s - loss: 0.1749 - acc: 0.9489 - val_loss: 0.1266 - val_acc: 0.9635
Epoch 56/250
48000/48000 [==============================] - 2s - loss: 0.1752 - acc: 0.9479 - val_loss: 0.1255 - val_acc: 0.9635
Epoch 57/250
48000/48000 [==============================] - 2s - loss: 0.1746 - acc: 0.9500 - val_loss: 0.1242 - val_acc: 0.9642
Epoch 58/250
48000/48000 [==============================] - 2s - loss: 0.1709 - acc: 0.9500 - val_loss: 0.1230 - val_acc: 0.9645
Epoch 59/250
48000/48000 [==============================] - 2s - loss: 0.1680 - acc: 0.9506 - val_loss: 0.1216 - val_acc: 0.9642
Epoch 60/250
48000/48000 [==============================] - 2s - loss: 0.1660 - acc: 0.9496 - val_loss: 0.1211 - val_acc: 0.9651
Epoch 61/250
48000/48000 [==============================] - 3s - loss: 0.1658 - acc: 0.9506 - val_loss: 0.1205 - val_acc: 0.9650
Epoch 62/250
48000/48000 [==============================] - 3s - loss: 0.1660 - acc: 0.9516 - val_loss: 0.1189 - val_acc: 0.9657
Epoch 63/250
48000/48000 [==============================] - 2s - loss: 0.1638 - acc: 0.9523 - val_loss: 0.1183 - val_acc: 0.9662
Epoch 64/250
48000/48000 [==============================] - 2s - loss: 0.1635 - acc: 0.9517 - val_loss: 0.1183 - val_acc: 0.9658
Epoch 65/250
48000/48000 [==============================] - 2s - loss: 0.1606 - acc: 0.9521 - val_loss: 0.1166 - val_acc: 0.9664
Epoch 66/250
48000/48000 [==============================] - 2s - loss: 0.1590 - acc: 0.9530 - val_loss: 0.1159 - val_acc: 0.9662
Epoch 67/250
48000/48000 [==============================] - 2s - loss: 0.1553 - acc: 0.9541 - val_loss: 0.1159 - val_acc: 0.9662
Epoch 68/250
48000/48000 [==============================] - 2s - loss: 0.1554 - acc: 0.9531 - val_loss: 0.1145 - val_acc: 0.9667
Epoch 69/250
48000/48000 [==============================] - 2s - loss: 0.1550 - acc: 0.9549 - val_loss: 0.1142 - val_acc: 0.9671
Epoch 70/250
48000/48000 [==============================] - 2s - loss: 0.1551 - acc: 0.9547 - val_loss: 0.1130 - val_acc: 0.9676
Epoch 71/250
48000/48000 [==============================] - 3s - loss: 0.1508 - acc: 0.9562 - val_loss: 0.1121 - val_acc: 0.9670
Epoch 72/250
48000/48000 [==============================] - 2s - loss: 0.1505 - acc: 0.9556 - val_loss: 0.1121 - val_acc: 0.9674
Epoch 73/250
48000/48000 [==============================] - 2s - loss: 0.1524 - acc: 0.9545 - val_loss: 0.1113 - val_acc: 0.9673
Epoch 74/250
48000/48000 [==============================] - 2s - loss: 0.1518 - acc: 0.9546 - val_loss: 0.1107 - val_acc: 0.9671
Epoch 75/250
48000/48000 [==============================] - 3s - loss: 0.1482 - acc: 0.9565 - val_loss: 0.1099 - val_acc: 0.9673
Epoch 76/250
48000/48000 [==============================] - 3s - loss: 0.1461 - acc: 0.9571 - val_loss: 0.1093 - val_acc: 0.9674
Epoch 77/250
48000/48000 [==============================] - 3s - loss: 0.1457 - acc: 0.9561 - val_loss: 0.1094 - val_acc: 0.9677
Epoch 78/250
48000/48000 [==============================] - 3s - loss: 0.1450 - acc: 0.9575 - val_loss: 0.1080 - val_acc: 0.9673
Epoch 79/250
48000/48000 [==============================] - 3s - loss: 0.1430 - acc: 0.9577 - val_loss: 0.1078 - val_acc: 0.9683
Epoch 80/250
48000/48000 [==============================] - 3s - loss: 0.1423 - acc: 0.9580 - val_loss: 0.1071 - val_acc: 0.9680
Epoch 81/250
48000/48000 [==============================] - 3s - loss: 0.1435 - acc: 0.9577 - val_loss: 0.1065 - val_acc: 0.9682
Epoch 82/250
48000/48000 [==============================] - 3s - loss: 0.1400 - acc: 0.9589 - val_loss: 0.1061 - val_acc: 0.9684
Epoch 83/250
48000/48000 [==============================] - 3s - loss: 0.1370 - acc: 0.9597 - val_loss: 0.1052 - val_acc: 0.9689
Epoch 84/250
48000/48000 [==============================] - 3s - loss: 0.1362 - acc: 0.9604 - val_loss: 0.1054 - val_acc: 0.9687
Epoch 85/250
48000/48000 [==============================] - 3s - loss: 0.1375 - acc: 0.9596 - val_loss: 0.1048 - val_acc: 0.9687
Epoch 86/250
48000/48000 [==============================] - 3s - loss: 0.1369 - acc: 0.9597 - val_loss: 0.1044 - val_acc: 0.9686
Epoch 87/250
48000/48000 [==============================] - 3s - loss: 0.1342 - acc: 0.9600 - val_loss: 0.1040 - val_acc: 0.9687
Epoch 88/250
48000/48000 [==============================] - 3s - loss: 0.1345 - acc: 0.9595 - val_loss: 0.1034 - val_acc: 0.9689
Epoch 89/250
48000/48000 [==============================] - 3s - loss: 0.1324 - acc: 0.9608 - val_loss: 0.1025 - val_acc: 0.9695
Epoch 90/250
48000/48000 [==============================] - 3s - loss: 0.1324 - acc: 0.9597 - val_loss: 0.1021 - val_acc: 0.9693
Epoch 91/250
48000/48000 [==============================] - 3s - loss: 0.1343 - acc: 0.9605 - val_loss: 0.1019 - val_acc: 0.9696
Epoch 92/250
48000/48000 [==============================] - 3s - loss: 0.1294 - acc: 0.9611 - val_loss: 0.1016 - val_acc: 0.9689
Epoch 93/250
48000/48000 [==============================] - 3s - loss: 0.1276 - acc: 0.9622 - val_loss: 0.1016 - val_acc: 0.9696
Epoch 94/250
48000/48000 [==============================] - 3s - loss: 0.1273 - acc: 0.9630 - val_loss: 0.1005 - val_acc: 0.9705
Epoch 95/250
48000/48000 [==============================] - 3s - loss: 0.1271 - acc: 0.9626 - val_loss: 0.1009 - val_acc: 0.9699
Epoch 96/250
48000/48000 [==============================] - 3s - loss: 0.1251 - acc: 0.9628 - val_loss: 0.1004 - val_acc: 0.9699
Epoch 97/250
48000/48000 [==============================] - 3s - loss: 0.1247 - acc: 0.9631 - val_loss: 0.0997 - val_acc: 0.9706
Epoch 98/250
48000/48000 [==============================] - 3s - loss: 0.1247 - acc: 0.9634 - val_loss: 0.0992 - val_acc: 0.9705
Epoch 99/250
48000/48000 [==============================] - 3s - loss: 0.1264 - acc: 0.9627 - val_loss: 0.0993 - val_acc: 0.9709
Epoch 100/250
48000/48000 [==============================] - 3s - loss: 0.1207 - acc: 0.9639 - val_loss: 0.0982 - val_acc: 0.9707
Epoch 101/250
48000/48000 [==============================] - 3s - loss: 0.1218 - acc: 0.9642 - val_loss: 0.0977 - val_acc: 0.9718
Epoch 102/250
48000/48000 [==============================] - 3s - loss: 0.1213 - acc: 0.9629 - val_loss: 0.0977 - val_acc: 0.9710
Epoch 103/250
48000/48000 [==============================] - 3s - loss: 0.1206 - acc: 0.9637 - val_loss: 0.0966 - val_acc: 0.9710
Epoch 104/250
48000/48000 [==============================] - 3s - loss: 0.1196 - acc: 0.9647 - val_loss: 0.0965 - val_acc: 0.9716
Epoch 105/250
48000/48000 [==============================] - 3s - loss: 0.1198 - acc: 0.9647 - val_loss: 0.0962 - val_acc: 0.9722
Epoch 106/250
48000/48000 [==============================] - 3s - loss: 0.1176 - acc: 0.9656 - val_loss: 0.0960 - val_acc: 0.9707
Epoch 107/250
48000/48000 [==============================] - 3s - loss: 0.1166 - acc: 0.9653 - val_loss: 0.0951 - val_acc: 0.9723
Epoch 108/250
48000/48000 [==============================] - 3s - loss: 0.1177 - acc: 0.9652 - val_loss: 0.0951 - val_acc: 0.9717
Epoch 109/250
48000/48000 [==============================] - 3s - loss: 0.1149 - acc: 0.9651 - val_loss: 0.0949 - val_acc: 0.9717
Epoch 110/250
48000/48000 [==============================] - 3s - loss: 0.1137 - acc: 0.9663 - val_loss: 0.0948 - val_acc: 0.9717
Epoch 111/250
48000/48000 [==============================] - 3s - loss: 0.1166 - acc: 0.9657 - val_loss: 0.0948 - val_acc: 0.9724
Epoch 112/250
48000/48000 [==============================] - 3s - loss: 0.1151 - acc: 0.9651 - val_loss: 0.0941 - val_acc: 0.9717
Epoch 113/250
48000/48000 [==============================] - 3s - loss: 0.1150 - acc: 0.9670 - val_loss: 0.0938 - val_acc: 0.9720
Epoch 114/250
48000/48000 [==============================] - 3s - loss: 0.1130 - acc: 0.9660 - val_loss: 0.0943 - val_acc: 0.9724
Epoch 115/250
48000/48000 [==============================] - 3s - loss: 0.1131 - acc: 0.9663 - val_loss: 0.0938 - val_acc: 0.9721
Epoch 116/250
48000/48000 [==============================] - 3s - loss: 0.1123 - acc: 0.9664 - val_loss: 0.0933 - val_acc: 0.9721
Epoch 117/250
48000/48000 [==============================] - 3s - loss: 0.1120 - acc: 0.9664 - val_loss: 0.0931 - val_acc: 0.9719
Epoch 118/250
48000/48000 [==============================] - 3s - loss: 0.1119 - acc: 0.9668 - val_loss: 0.0927 - val_acc: 0.9718
Epoch 119/250
48000/48000 [==============================] - 3s - loss: 0.1104 - acc: 0.9673 - val_loss: 0.0930 - val_acc: 0.9717
Epoch 120/250
48000/48000 [==============================] - 3s - loss: 0.1054 - acc: 0.9685 - val_loss: 0.0923 - val_acc: 0.9722
Epoch 121/250
48000/48000 [==============================] - 3s - loss: 0.1102 - acc: 0.9665 - val_loss: 0.0917 - val_acc: 0.9728
Epoch 122/250
48000/48000 [==============================] - 3s - loss: 0.1067 - acc: 0.9684 - val_loss: 0.0914 - val_acc: 0.9727
Epoch 123/250
48000/48000 [==============================] - 3s - loss: 0.1101 - acc: 0.9670 - val_loss: 0.0916 - val_acc: 0.9728
Epoch 124/250
48000/48000 [==============================] - 3s - loss: 0.1079 - acc: 0.9677 - val_loss: 0.0913 - val_acc: 0.9730
Epoch 125/250
48000/48000 [==============================] - 3s - loss: 0.1062 - acc: 0.9691 - val_loss: 0.0908 - val_acc: 0.9730
Epoch 126/250
48000/48000 [==============================] - 3s - loss: 0.1036 - acc: 0.9692 - val_loss: 0.0910 - val_acc: 0.9733
Epoch 127/250
48000/48000 [==============================] - 3s - loss: 0.1050 - acc: 0.9693 - val_loss: 0.0901 - val_acc: 0.9727
Epoch 128/250
48000/48000 [==============================] - 3s - loss: 0.1022 - acc: 0.9692 - val_loss: 0.0907 - val_acc: 0.9729
Epoch 129/250
48000/48000 [==============================] - 2s - loss: 0.1051 - acc: 0.9688 - val_loss: 0.0899 - val_acc: 0.9735
Epoch 130/250
48000/48000 [==============================] - 2s - loss: 0.1023 - acc: 0.9696 - val_loss: 0.0904 - val_acc: 0.9735
Epoch 131/250
48000/48000 [==============================] - 2s - loss: 0.1030 - acc: 0.9686 - val_loss: 0.0896 - val_acc: 0.9734
Epoch 132/250
48000/48000 [==============================] - 2s - loss: 0.1032 - acc: 0.9681 - val_loss: 0.0892 - val_acc: 0.9736
Epoch 133/250
48000/48000 [==============================] - 2s - loss: 0.1010 - acc: 0.9697 - val_loss: 0.0897 - val_acc: 0.9731
Epoch 134/250
48000/48000 [==============================] - 2s - loss: 0.0990 - acc: 0.9705 - val_loss: 0.0891 - val_acc: 0.9732
Epoch 135/250
48000/48000 [==============================] - 2s - loss: 0.0997 - acc: 0.9699 - val_loss: 0.0889 - val_acc: 0.9737
Epoch 136/250
48000/48000 [==============================] - 2s - loss: 0.0989 - acc: 0.9694 - val_loss: 0.0885 - val_acc: 0.9742
Epoch 137/250
48000/48000 [==============================] - 2s - loss: 0.1011 - acc: 0.9691 - val_loss: 0.0882 - val_acc: 0.9733
Epoch 138/250
48000/48000 [==============================] - 2s - loss: 0.0996 - acc: 0.9700 - val_loss: 0.0885 - val_acc: 0.9739
Epoch 139/250
48000/48000 [==============================] - 3s - loss: 0.0986 - acc: 0.9709 - val_loss: 0.0882 - val_acc: 0.9737
Epoch 140/250
48000/48000 [==============================] - 3s - loss: 0.0990 - acc: 0.9701 - val_loss: 0.0888 - val_acc: 0.9736
Epoch 141/250
48000/48000 [==============================] - 3s - loss: 0.0983 - acc: 0.9701 - val_loss: 0.0874 - val_acc: 0.9741
Epoch 142/250
48000/48000 [==============================] - 3s - loss: 0.0958 - acc: 0.9710 - val_loss: 0.0878 - val_acc: 0.9738
Epoch 143/250
48000/48000 [==============================] - 2s - loss: 0.0967 - acc: 0.9707 - val_loss: 0.0881 - val_acc: 0.9744
Epoch 144/250
48000/48000 [==============================] - 2s - loss: 0.0957 - acc: 0.9706 - val_loss: 0.0875 - val_acc: 0.9742
Epoch 145/250
48000/48000 [==============================] - 2s - loss: 0.0953 - acc: 0.9709 - val_loss: 0.0870 - val_acc: 0.9746
Epoch 146/250
48000/48000 [==============================] - 3s - loss: 0.0951 - acc: 0.9706 - val_loss: 0.0866 - val_acc: 0.9743
Epoch 147/250
48000/48000 [==============================] - 3s - loss: 0.0959 - acc: 0.9711 - val_loss: 0.0868 - val_acc: 0.9742
Epoch 148/250
48000/48000 [==============================] - 3s - loss: 0.0941 - acc: 0.9713 - val_loss: 0.0865 - val_acc: 0.9742
Epoch 149/250
48000/48000 [==============================] - 3s - loss: 0.0933 - acc: 0.9717 - val_loss: 0.0864 - val_acc: 0.9742
Epoch 150/250
48000/48000 [==============================] - 2s - loss: 0.0933 - acc: 0.9713 - val_loss: 0.0863 - val_acc: 0.9746
Epoch 151/250
48000/48000 [==============================] - 2s - loss: 0.0925 - acc: 0.9712 - val_loss: 0.0858 - val_acc: 0.9747
Epoch 152/250
48000/48000 [==============================] - 3s - loss: 0.0930 - acc: 0.9721 - val_loss: 0.0862 - val_acc: 0.9747
Epoch 153/250
48000/48000 [==============================] - 3s - loss: 0.0896 - acc: 0.9732 - val_loss: 0.0859 - val_acc: 0.9755
Epoch 154/250
48000/48000 [==============================] - 3s - loss: 0.0894 - acc: 0.9733 - val_loss: 0.0854 - val_acc: 0.9750
Epoch 155/250
48000/48000 [==============================] - 3s - loss: 0.0908 - acc: 0.9718 - val_loss: 0.0858 - val_acc: 0.9747
Epoch 156/250
48000/48000 [==============================] - 3s - loss: 0.0920 - acc: 0.9726 - val_loss: 0.0854 - val_acc: 0.9748
Epoch 157/250
48000/48000 [==============================] - 3s - loss: 0.0923 - acc: 0.9711 - val_loss: 0.0856 - val_acc: 0.9747
Epoch 158/250
48000/48000 [==============================] - 3s - loss: 0.0905 - acc: 0.9728 - val_loss: 0.0857 - val_acc: 0.9751
Epoch 159/250
48000/48000 [==============================] - 2s - loss: 0.0876 - acc: 0.9741 - val_loss: 0.0848 - val_acc: 0.9754
Epoch 160/250
48000/48000 [==============================] - 2s - loss: 0.0869 - acc: 0.9739 - val_loss: 0.0851 - val_acc: 0.9755
Epoch 161/250
48000/48000 [==============================] - 3s - loss: 0.0890 - acc: 0.9732 - val_loss: 0.0850 - val_acc: 0.9751
Epoch 162/250
48000/48000 [==============================] - 3s - loss: 0.0904 - acc: 0.9720 - val_loss: 0.0850 - val_acc: 0.9751
Epoch 163/250
48000/48000 [==============================] - 3s - loss: 0.0868 - acc: 0.9738 - val_loss: 0.0849 - val_acc: 0.9750
Epoch 164/250
48000/48000 [==============================] - 3s - loss: 0.0869 - acc: 0.9744 - val_loss: 0.0842 - val_acc: 0.9757
Epoch 165/250
48000/48000 [==============================] - 3s - loss: 0.0890 - acc: 0.9723 - val_loss: 0.0838 - val_acc: 0.9760
Epoch 166/250
48000/48000 [==============================] - 3s - loss: 0.0865 - acc: 0.9727 - val_loss: 0.0840 - val_acc: 0.9757
Epoch 167/250
48000/48000 [==============================] - 3s - loss: 0.0873 - acc: 0.9734 - val_loss: 0.0851 - val_acc: 0.9749
Epoch 168/250
48000/48000 [==============================] - 3s - loss: 0.0863 - acc: 0.9737 - val_loss: 0.0837 - val_acc: 0.9754
Epoch 169/250
48000/48000 [==============================] - 3s - loss: 0.0855 - acc: 0.9741 - val_loss: 0.0839 - val_acc: 0.9755
Epoch 170/250
48000/48000 [==============================] - 3s - loss: 0.0838 - acc: 0.9747 - val_loss: 0.0837 - val_acc: 0.9757
Epoch 171/250
48000/48000 [==============================] - 3s - loss: 0.0847 - acc: 0.9751 - val_loss: 0.0835 - val_acc: 0.9755
Epoch 172/250
48000/48000 [==============================] - 2s - loss: 0.0842 - acc: 0.9747 - val_loss: 0.0831 - val_acc: 0.9760
Epoch 173/250
48000/48000 [==============================] - 3s - loss: 0.0843 - acc: 0.9744 - val_loss: 0.0834 - val_acc: 0.9752
Epoch 174/250
48000/48000 [==============================] - 3s - loss: 0.0836 - acc: 0.9740 - val_loss: 0.0827 - val_acc: 0.9751
Epoch 175/250
48000/48000 [==============================] - 3s - loss: 0.0824 - acc: 0.9749 - val_loss: 0.0830 - val_acc: 0.9759
Epoch 176/250
48000/48000 [==============================] - 3s - loss: 0.0849 - acc: 0.9739 - val_loss: 0.0833 - val_acc: 0.9757
Epoch 177/250
48000/48000 [==============================] - 3s - loss: 0.0838 - acc: 0.9744 - val_loss: 0.0835 - val_acc: 0.9758
Epoch 178/250
48000/48000 [==============================] - 3s - loss: 0.0810 - acc: 0.9754 - val_loss: 0.0837 - val_acc: 0.9753
Epoch 179/250
48000/48000 [==============================] - 3s - loss: 0.0833 - acc: 0.9744 - val_loss: 0.0826 - val_acc: 0.9757
Epoch 180/250
48000/48000 [==============================] - 3s - loss: 0.0800 - acc: 0.9755 - val_loss: 0.0826 - val_acc: 0.9759
Epoch 181/250
48000/48000 [==============================] - 3s - loss: 0.0825 - acc: 0.9747 - val_loss: 0.0830 - val_acc: 0.9761
Epoch 182/250
48000/48000 [==============================] - 3s - loss: 0.0800 - acc: 0.9754 - val_loss: 0.0828 - val_acc: 0.9760
Epoch 183/250
48000/48000 [==============================] - 3s - loss: 0.0779 - acc: 0.9760 - val_loss: 0.0822 - val_acc: 0.9762
Epoch 184/250
48000/48000 [==============================] - 3s - loss: 0.0804 - acc: 0.9753 - val_loss: 0.0823 - val_acc: 0.9758
Epoch 185/250
48000/48000 [==============================] - 3s - loss: 0.0816 - acc: 0.9752 - val_loss: 0.0822 - val_acc: 0.9759
Epoch 186/250
48000/48000 [==============================] - 3s - loss: 0.0782 - acc: 0.9769 - val_loss: 0.0824 - val_acc: 0.9762
Epoch 187/250
48000/48000 [==============================] - 3s - loss: 0.0789 - acc: 0.9765 - val_loss: 0.0818 - val_acc: 0.9760
Epoch 188/250
48000/48000 [==============================] - 3s - loss: 0.0778 - acc: 0.9761 - val_loss: 0.0822 - val_acc: 0.9756
Epoch 189/250
48000/48000 [==============================] - 3s - loss: 0.0788 - acc: 0.9749 - val_loss: 0.0820 - val_acc: 0.9762
Epoch 190/250
48000/48000 [==============================] - 3s - loss: 0.0782 - acc: 0.9768 - val_loss: 0.0822 - val_acc: 0.9761
Epoch 191/250
48000/48000 [==============================] - 3s - loss: 0.0778 - acc: 0.9765 - val_loss: 0.0819 - val_acc: 0.9757
Epoch 192/250
48000/48000 [==============================] - 3s - loss: 0.0781 - acc: 0.9763 - val_loss: 0.0816 - val_acc: 0.9763
Epoch 193/250
48000/48000 [==============================] - 3s - loss: 0.0777 - acc: 0.9760 - val_loss: 0.0820 - val_acc: 0.9758
Epoch 194/250
48000/48000 [==============================] - 2s - loss: 0.0772 - acc: 0.9770 - val_loss: 0.0814 - val_acc: 0.9762
Epoch 195/250
48000/48000 [==============================] - 2s - loss: 0.0756 - acc: 0.9775 - val_loss: 0.0812 - val_acc: 0.9758
Epoch 196/250
48000/48000 [==============================] - 2s - loss: 0.0765 - acc: 0.9778 - val_loss: 0.0818 - val_acc: 0.9764
Epoch 197/250
48000/48000 [==============================] - 2s - loss: 0.0778 - acc: 0.9762 - val_loss: 0.0814 - val_acc: 0.9762
Epoch 198/250
48000/48000 [==============================] - 2s - loss: 0.0761 - acc: 0.9767 - val_loss: 0.0810 - val_acc: 0.9767
Epoch 199/250
48000/48000 [==============================] - 2s - loss: 0.0756 - acc: 0.9765 - val_loss: 0.0809 - val_acc: 0.9767
Epoch 200/250
48000/48000 [==============================] - 2s - loss: 0.0762 - acc: 0.9767 - val_loss: 0.0813 - val_acc: 0.9766
Epoch 201/250
48000/48000 [==============================] - 2s - loss: 0.0763 - acc: 0.9763 - val_loss: 0.0808 - val_acc: 0.9767
Epoch 202/250
48000/48000 [==============================] - 2s - loss: 0.0729 - acc: 0.9775 - val_loss: 0.0814 - val_acc: 0.9762
Epoch 203/250
48000/48000 [==============================] - 2s - loss: 0.0750 - acc: 0.9773 - val_loss: 0.0813 - val_acc: 0.9765
Epoch 204/250
48000/48000 [==============================] - 2s - loss: 0.0720 - acc: 0.9771 - val_loss: 0.0815 - val_acc: 0.9768
Epoch 205/250
48000/48000 [==============================] - 2s - loss: 0.0730 - acc: 0.9778 - val_loss: 0.0810 - val_acc: 0.9767
Epoch 206/250
48000/48000 [==============================] - 2s - loss: 0.0733 - acc: 0.9777 - val_loss: 0.0811 - val_acc: 0.9764
Epoch 207/250
48000/48000 [==============================] - 2s - loss: 0.0730 - acc: 0.9773 - val_loss: 0.0815 - val_acc: 0.9766
Epoch 208/250
48000/48000 [==============================] - 2s - loss: 0.0738 - acc: 0.9765 - val_loss: 0.0813 - val_acc: 0.9762
Epoch 209/250
48000/48000 [==============================] - 2s - loss: 0.0719 - acc: 0.9778 - val_loss: 0.0810 - val_acc: 0.9767
Epoch 210/250
48000/48000 [==============================] - 2s - loss: 0.0734 - acc: 0.9777 - val_loss: 0.0806 - val_acc: 0.9767
Epoch 211/250
48000/48000 [==============================] - 2s - loss: 0.0733 - acc: 0.9772 - val_loss: 0.0808 - val_acc: 0.9762
Epoch 212/250
48000/48000 [==============================] - 2s - loss: 0.0711 - acc: 0.9779 - val_loss: 0.0807 - val_acc: 0.9766
Epoch 213/250
48000/48000 [==============================] - 2s - loss: 0.0728 - acc: 0.9777 - val_loss: 0.0808 - val_acc: 0.9769
Epoch 214/250
48000/48000 [==============================] - 2s - loss: 0.0705 - acc: 0.9786 - val_loss: 0.0810 - val_acc: 0.9768
Epoch 215/250
48000/48000 [==============================] - 2s - loss: 0.0712 - acc: 0.9782 - val_loss: 0.0810 - val_acc: 0.9767
Epoch 216/250
48000/48000 [==============================] - 2s - loss: 0.0691 - acc: 0.9785 - val_loss: 0.0804 - val_acc: 0.9768
Epoch 217/250
48000/48000 [==============================] - 2s - loss: 0.0706 - acc: 0.9781 - val_loss: 0.0806 - val_acc: 0.9772
Epoch 218/250
48000/48000 [==============================] - 2s - loss: 0.0685 - acc: 0.9787 - val_loss: 0.0803 - val_acc: 0.9768
Epoch 219/250
48000/48000 [==============================] - 2s - loss: 0.0708 - acc: 0.9783 - val_loss: 0.0801 - val_acc: 0.9769
Epoch 220/250
48000/48000 [==============================] - 2s - loss: 0.0695 - acc: 0.9783 - val_loss: 0.0802 - val_acc: 0.9769
Epoch 221/250
48000/48000 [==============================] - 2s - loss: 0.0693 - acc: 0.9783 - val_loss: 0.0797 - val_acc: 0.9771
Epoch 222/250
48000/48000 [==============================] - 2s - loss: 0.0694 - acc: 0.9779 - val_loss: 0.0798 - val_acc: 0.9772
Epoch 223/250
48000/48000 [==============================] - 2s - loss: 0.0666 - acc: 0.9791 - val_loss: 0.0802 - val_acc: 0.9769
Epoch 224/250
48000/48000 [==============================] - 2s - loss: 0.0681 - acc: 0.9791 - val_loss: 0.0795 - val_acc: 0.9774
Epoch 225/250
48000/48000 [==============================] - 3s - loss: 0.0680 - acc: 0.9790 - val_loss: 0.0794 - val_acc: 0.9776
Epoch 226/250
48000/48000 [==============================] - 2s - loss: 0.0681 - acc: 0.9791 - val_loss: 0.0796 - val_acc: 0.9773
Epoch 227/250
48000/48000 [==============================] - 2s - loss: 0.0675 - acc: 0.9788 - val_loss: 0.0795 - val_acc: 0.9773
Epoch 228/250
48000/48000 [==============================] - 2s - loss: 0.0676 - acc: 0.9792 - val_loss: 0.0795 - val_acc: 0.9778
Epoch 229/250
48000/48000 [==============================] - 2s - loss: 0.0698 - acc: 0.9780 - val_loss: 0.0790 - val_acc: 0.9772
Epoch 230/250
48000/48000 [==============================] - 2s - loss: 0.0681 - acc: 0.9793 - val_loss: 0.0791 - val_acc: 0.9775
Epoch 231/250
48000/48000 [==============================] - 2s - loss: 0.0684 - acc: 0.9784 - val_loss: 0.0790 - val_acc: 0.9770
Epoch 232/250
48000/48000 [==============================] - 2s - loss: 0.0671 - acc: 0.9788 - val_loss: 0.0792 - val_acc: 0.9773
Epoch 233/250
48000/48000 [==============================] - 2s - loss: 0.0663 - acc: 0.9792 - val_loss: 0.0791 - val_acc: 0.9768
Epoch 234/250
48000/48000 [==============================] - 2s - loss: 0.0666 - acc: 0.9794 - val_loss: 0.0790 - val_acc: 0.9771
Epoch 235/250
48000/48000 [==============================] - 2s - loss: 0.0659 - acc: 0.9797 - val_loss: 0.0790 - val_acc: 0.9771
Epoch 236/250
48000/48000 [==============================] - 2s - loss: 0.0662 - acc: 0.9796 - val_loss: 0.0784 - val_acc: 0.9772
Epoch 237/250
48000/48000 [==============================] - 2s - loss: 0.0641 - acc: 0.9806 - val_loss: 0.0788 - val_acc: 0.9776
Epoch 238/250
48000/48000 [==============================] - 2s - loss: 0.0662 - acc: 0.9790 - val_loss: 0.0793 - val_acc: 0.9763
Epoch 239/250
48000/48000 [==============================] - 2s - loss: 0.0657 - acc: 0.9793 - val_loss: 0.0792 - val_acc: 0.9769
Epoch 240/250
48000/48000 [==============================] - 2s - loss: 0.0642 - acc: 0.9808 - val_loss: 0.0791 - val_acc: 0.9772
Epoch 241/250
48000/48000 [==============================] - 2s - loss: 0.0652 - acc: 0.9795 - val_loss: 0.0794 - val_acc: 0.9774
Epoch 242/250
48000/48000 [==============================] - 2s - loss: 0.0655 - acc: 0.9801 - val_loss: 0.0789 - val_acc: 0.9777
Epoch 243/250
48000/48000 [==============================] - 2s - loss: 0.0623 - acc: 0.9809 - val_loss: 0.0791 - val_acc: 0.9778
Epoch 244/250
48000/48000 [==============================] - 2s - loss: 0.0645 - acc: 0.9800 - val_loss: 0.0792 - val_acc: 0.9770
Epoch 245/250
48000/48000 [==============================] - 2s - loss: 0.0645 - acc: 0.9797 - val_loss: 0.0795 - val_acc: 0.9769
Epoch 246/250
48000/48000 [==============================] - 2s - loss: 0.0634 - acc: 0.9808 - val_loss: 0.0793 - val_acc: 0.9777
Epoch 247/250
48000/48000 [==============================] - 2s - loss: 0.0633 - acc: 0.9802 - val_loss: 0.0795 - val_acc: 0.9775
Epoch 248/250
48000/48000 [==============================] - 2s - loss: 0.0641 - acc: 0.9806 - val_loss: 0.0787 - val_acc: 0.9772
Epoch 249/250
48000/48000 [==============================] - 2s - loss: 0.0641 - acc: 0.9796 - val_loss: 0.0790 - val_acc: 0.9773
Epoch 250/250
48000/48000 [==============================] - 2s - loss: 0.0635 - acc: 0.9803 - val_loss: 0.0789 - val_acc: 0.9778

Step 9: Evaluate the model on the test dataset (10,000 images)

score = model.evaluate(X_test, Y_test, verbose=VERBOSE)
print("\nTest score:", score[0])
print('Test accuracy:', score[1])
 9504/10000 [===========================>..] - ETA: 0s
Test score: 0.0760220484917
Test accuracy: 0.9782
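
With the trained model, classifying a single image is straightforward: model.predict returns the 10 softmax probabilities and argmax picks the most likely digit. A small sketch (the test index 0 is an arbitrary choice):

probs = model.predict(X_test[0:1])  # shape (1, 10): class probabilities
print('predicted digit:', np.argmax(probs))
print('true digit:', y_test[0])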

Step 10: Plot the accuracy curves from the training history

print(history.history.keys())
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'validation'], loc='upper left')
plt.show()
dict_keys(['val_loss', 'val_acc', 'loss', 'acc'])

[Figure: model accuracy (train vs. validation) over 250 epochs]

Step 11: Plot the loss curves from the training history

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'validation'], loc='upper left')
plt.show()

[Figure: model loss (train vs. validation) over 250 epochs]

[Optional] Step 12: Save the model architecture (serialized) to JSON

model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
%ls
 Volume in drive C is Windows
 Volume Serial Number is 7252-C405

 Directory of C:\Users\cobalt\workspace

09/17/2017  06:43 PM    <DIR>          .
09/17/2017  06:43 PM    <DIR>          ..
09/17/2017  06:07 PM    <DIR>          .ipynb_checkpoints
01/07/2017  12:22 PM    <DIR>          .metadata
09/17/2017  01:16 PM            50,073 DeepLearningHelloWorld.ipynb
09/17/2017  04:32 PM            56,578 DeepLearningHelloWorldPart2.ipynb
09/17/2017  06:43 PM           110,179 DeepLearningHelloWorldPart3.ipynb
01/08/2017  10:52 AM    <DIR>          Hello
01/08/2017  10:52 AM    <DIR>          Hellocpp11
01/09/2017  04:45 PM    <DIR>          HelloOpenCV
09/17/2017  04:31 PM           490,640 model.h5
09/17/2017  06:44 PM             2,059 model.json
01/07/2017  12:22 PM    <DIR>          RemoteSystemsTempFiles
               5 File(s)        709,529 bytes
               8 Dir(s)  199,080,792,064 bytes free

[Optional] Step 13: Save the model weights to HDF5

model.save_weights("model.h5")
%ls
 Volume in drive C is Windows
 Volume Serial Number is 7252-C405

 Directory of C:\Users\cobalt\workspace

09/17/2017  06:43 PM    <DIR>          .
09/17/2017  06:43 PM    <DIR>          ..
09/17/2017  06:07 PM    <DIR>          .ipynb_checkpoints
01/07/2017  12:22 PM    <DIR>          .metadata
09/17/2017  01:16 PM            50,073 DeepLearningHelloWorld.ipynb
09/17/2017  04:32 PM            56,578 DeepLearningHelloWorldPart2.ipynb
09/17/2017  06:43 PM           110,179 DeepLearningHelloWorldPart3.ipynb
01/08/2017  10:52 AM    <DIR>          Hello
01/08/2017  10:52 AM    <DIR>          Hellocpp11
01/09/2017  04:45 PM    <DIR>          HelloOpenCV
09/17/2017  06:44 PM           492,240 model.h5
09/17/2017  06:44 PM             2,059 model.json
01/07/2017  12:22 PM    <DIR>          RemoteSystemsTempFiles
               5 File(s)        711,129 bytes
               8 Dir(s)  199,080,787,968 bytes free

[Optional] Step 14: Load the saved model (architecture from JSON, weights from HDF5)

with open('model.json', 'r') as json_file:
    loaded_model_json = json_file.read()
loaded_model = model_from_json(loaded_model_json)
loaded_model.load_weights("model.h5")

[Optional] Step 15: Compile and evaluate the loaded model

loaded_model.compile(loss='categorical_crossentropy',
              optimizer=OPTIMIZER,
              metrics=['accuracy'])
score = loaded_model.evaluate(X_test, Y_test, verbose=VERBOSE)
print("\nTest score:", score[0])
print('Test accuracy:', score[1])
 9376/10000 [===========================>..] - ETA: 0s
Test score: 0.0760220484917
Test accuracy: 0.9782
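
As an aside, Keras can also persist the architecture and the weights together in a single HDF5 file via model.save, and restore everything in one step with load_model, which avoids the separate JSON/HDF5 bookkeeping. A short sketch (the filename full_model.h5 is arbitrary; requires the h5py package):

from keras.models import load_model

model.save('full_model.h5')              # architecture + weights + optimizer state
restored = load_model('full_model.h5')   # ready to evaluate or predict immediately
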
  • mkc
Written on September 17, 2017