Deep Learning Hello World in Keras, Part 4
Assignment 1-d: Deep Learning Hello World! (3-layer MLP + Dropout + Optimizer)
Objective: Improve upon the 3-layer MLP + Dropout from Part 3 by switching to a different optimizer (Adam) for MNIST classification.
Step 1: Take care of the imports, which include numpy and the Keras datasets, models, layers, optimizers, and utils.
Running this cell also tells you whether your set-up is correct and complete.
from __future__ import print_function
import numpy as np
from keras.datasets import mnist
from keras.models import Sequential
from keras.models import model_from_json
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import Adam
from keras.utils import np_utils
from matplotlib import pyplot as plt
%matplotlib inline
Using TensorFlow backend.
Step 2: Set up some constants to be used in training/testing the model. <br> Note: the number of epochs (NB_EPOCH) is increased to 250.
NB_EPOCH = 250
BATCH_SIZE = 128
VERBOSE = 1
NB_CLASSES = 10 # number of outputs = number of digits, i.e. 0,1,2,3,4,5,6,7,8,9
OPTIMIZER = Adam()
N_HIDDEN = 128
VALIDATION_SPLIT = 0.2 # how much of the TRAIN dataset is reserved for VALIDATION
DROPOUT = 0.3
np.random.seed(1983) # for reproducibility
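Adam is used here with Keras's default hyperparameters. As a minimal sketch (assuming the Keras-1/2-era signature; the values below are the library defaults, taken from the Kingma & Ba paper), the optimizer could be configured explicitly:

# Hedged sketch: Adam with its default hyperparameters spelled out
OPTIMIZER = Adam(lr=0.001,      # learning rate (step size)
                 beta_1=0.9,    # decay rate for the 1st-moment (mean) estimate
                 beta_2=0.999,  # decay rate for the 2nd-moment (variance) estimate
                 epsilon=1e-08) # small constant for numerical stability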
Step 3: Load the MNIST dataset, which is shuffled and split between train and test sets <br>
- X_train is 60000 rows of 28x28 values
- X_test is 10000 rows of 28x28 values
(X_train, y_train), (X_test, y_test) = mnist.load_data()
print("First 100 train images:")
for k in range(100):
plt.subplot(10, 10, k+1)
plt.gca().axes.get_yaxis().set_visible(False)
plt.gca().axes.get_xaxis().set_visible(False)
plt.imshow(X_train[k])
First 100 train images:
Step 4: Preprocess the input data by reshaping it, converting it to float32, and normalizing it to the range [0, 1].
# reshape
X_train = X_train.reshape(60000, 784)
X_test = X_test.reshape(10000, 784)
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
# normalize
X_train /= 255
X_test /= 255
print(X_train.shape, 'train samples')
print(X_test.shape, 'test samples')
(60000, 784) train samples
(10000, 784) test samples
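A quick sanity check (a hypothetical addition, not part of the original run) confirms the preprocessing did what we expect:

# Hypothetical sanity check: pixels should now be float32 values in [0, 1]
print(X_train.dtype)                  # float32
print(X_train.min(), X_train.max())  # 0.0 1.0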
Step 5: Convert the class vectors to binary class matrices, i.e., one-hot encoding (OHE)
Y_train = np_utils.to_categorical(y_train, NB_CLASSES)
Y_test = np_utils.to_categorical(y_test, NB_CLASSES)
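For example, to_categorical turns the integer label 5 into a 10-element vector with a 1 in position 5. A quick check (hypothetical, not in the original run):

# Hypothetical check: the first MNIST training label happens to be 5
print(y_train[0])  # 5
print(Y_train[0])  # [ 0.  0.  0.  0.  0.  1.  0.  0.  0.  0.]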
Step 6: Create the model with 3 Dense layers: Input:784 ==> Hidden:128 w/ dropout ==> Hidden:128 w/ dropout ==> Output:10 (with softmax activation)
model = Sequential()
model.add(Dense(N_HIDDEN, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(DROPOUT))
model.add(Dense(N_HIDDEN))
model.add(Activation('relu'))
model.add(Dropout(DROPOUT))
model.add(Dense(NB_CLASSES))
model.add(Activation('softmax'))
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 128)               100480
_________________________________________________________________
activation_1 (Activation)    (None, 128)               0
_________________________________________________________________
dropout_1 (Dropout)          (None, 128)               0
_________________________________________________________________
dense_2 (Dense)              (None, 128)               16512
_________________________________________________________________
activation_2 (Activation)    (None, 128)               0
_________________________________________________________________
dropout_2 (Dropout)          (None, 128)               0
_________________________________________________________________
dense_3 (Dense)              (None, 10)                1290
_________________________________________________________________
activation_3 (Activation)    (None, 10)                0
=================================================================
Total params: 118,282
Trainable params: 118,282
Non-trainable params: 0
_________________________________________________________________
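The parameter counts follow directly from the layer sizes: each Dense layer has (inputs x units) weights plus one bias per unit. A quick check of the arithmetic:

# Parameter-count check: weights + biases per Dense layer
print(784 * 128 + 128)        # dense_1: 100480
print(128 * 128 + 128)        # dense_2: 16512
print(128 * 10 + 10)          # dense_3: 1290
print(100480 + 16512 + 1290)  # total:   118282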
Step 7: Compile the model with the categorical_crossentropy loss function, the Adam optimizer, and the accuracy metric
model.compile(loss='categorical_crossentropy',
              optimizer=OPTIMIZER,
              metrics=['accuracy'])
Step 8: Train the model with a batch size of 128, 250 epochs, and 20% of the training data held out for validation
history = model.fit(X_train, Y_train,
                    batch_size=BATCH_SIZE, epochs=NB_EPOCH,
                    verbose=VERBOSE, validation_split=VALIDATION_SPLIT)
Train on 48000 samples, validate on 12000 samples
Epoch 1/250
48000/48000 [==============================] - 3s - loss: 0.5104 - acc: 0.8449 - val_loss: 0.1830 - val_acc: 0.9469
Epoch 2/250
48000/48000 [==============================] - 3s - loss: 0.2360 - acc: 0.9299 - val_loss: 0.1370 - val_acc: 0.9599
Epoch 3/250
48000/48000 [==============================] - 3s - loss: 0.1813 - acc: 0.9462 - val_loss: 0.1222 - val_acc: 0.9644
Epoch 4/250
48000/48000 [==============================] - 3s - loss: 0.1536 - acc: 0.9539 - val_loss: 0.1060 - val_acc: 0.9685
Epoch 5/250
48000/48000 [==============================] - 3s - loss: 0.1289 - acc: 0.9605 - val_loss: 0.0978 - val_acc: 0.9706
[... epochs 6-247 omitted for brevity: training accuracy climbs steadily toward ~0.996, while validation accuracy plateaus around 0.978-0.982 and validation loss drifts upward from ~0.08 to ~0.12 ...]
Epoch 248/250
48000/48000 [==============================] - 4s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.1228 - val_acc: 0.9796
Epoch 249/250
48000/48000 [==============================] - 4s - loss: 0.0155 - acc: 0.9957 - val_loss: 0.1179 - val_acc: 0.9795
Epoch 250/250
48000/48000 [==============================] - 4s - loss: 0.0130 - acc: 0.9961 - val_loss: 0.1162 - val_acc: 0.9789
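Note the pattern in the log: training loss keeps falling while validation loss bottoms out early and then drifts upward, a classic sign of overfitting. If you would rather not pay for all 250 epochs, one option (a sketch only, not used in this run) is Keras's EarlyStopping callback:

# Sketch (not used in this run): stop training once val_loss stops improving
from keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=10, verbose=1)
history = model.fit(X_train, Y_train,
                    batch_size=BATCH_SIZE, epochs=NB_EPOCH,
                    verbose=VERBOSE, validation_split=VALIDATION_SPLIT,
                    callbacks=[early_stop])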
Step 9: Evaluate the model on the test dataset (10,000 images)
score = model.evaluate(X_test, Y_test, verbose=VERBOSE)
print("\nTest score:", score[0])
print('Test accuracy:', score[1])
9728/10000 [============================>.] - ETA: 0s
Test score: 0.114303596618
Test accuracy: 0.9811
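To see where the remaining errors come from, you can inspect individual predictions. A minimal sketch (hypothetical, assuming the Sequential model's predict_classes method in this Keras version):

# Hypothetical sketch: list the misclassified test digits
predicted = model.predict_classes(X_test, verbose=0)
errors = np.nonzero(predicted != y_test)[0]
print(len(errors), 'misclassified out of', len(y_test))
print('first few (true, predicted):', [(int(y_test[i]), int(predicted[i])) for i in errors[:5]])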
Step 10: Plot the accuracy from history
print(history.history.keys())
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'validation'], loc='upper left')
plt.show()
dict_keys(['val_loss', 'loss', 'val_acc', 'acc'])
Step 11: Plot the loss from history
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'validation'], loc='upper left')
plt.show()
[Optional] Step 12: Save the model architecture (serialized) to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
%ls
Volume in drive C is Windows
Volume Serial Number is 7252-C405
Directory of C:\Users\cobalt\workspace
09/17/2017 08:17 PM <DIR> .
09/17/2017 08:17 PM <DIR> ..
09/17/2017 07:50 PM <DIR> .ipynb_checkpoints
01/07/2017 12:22 PM <DIR> .metadata
09/17/2017 01:16 PM 50,073 DeepLearningHelloWorld.ipynb
09/17/2017 04:32 PM 56,578 DeepLearningHelloWorldPart2.ipynb
09/17/2017 06:47 PM 126,727 DeepLearningHelloWorldPart3.ipynb
09/17/2017 08:17 PM 126,846 DeepLearningHelloWorldPart4.ipynb
01/08/2017 10:52 AM <DIR> Hello
01/08/2017 10:52 AM <DIR> Hellocpp11
01/09/2017 04:45 PM <DIR> HelloOpenCV
09/17/2017 06:44 PM 492,240 model.h5
09/17/2017 08:18 PM 2,059 model.json
01/07/2017 12:22 PM <DIR> RemoteSystemsTempFiles
6 File(s) 854,523 bytes
8 Dir(s) 199,404,728,320 bytes free
[Optional] Step 13: Save the model weights
model.save_weights("model.h5")
%ls
Volume in drive C is Windows
Volume Serial Number is 7252-C405
Directory of C:\Users\cobalt\workspace
09/17/2017 08:17 PM <DIR> .
09/17/2017 08:17 PM <DIR> ..
09/17/2017 07:50 PM <DIR> .ipynb_checkpoints
01/07/2017 12:22 PM <DIR> .metadata
09/17/2017 01:16 PM 50,073 DeepLearningHelloWorld.ipynb
09/17/2017 04:32 PM 56,578 DeepLearningHelloWorldPart2.ipynb
09/17/2017 06:47 PM 126,727 DeepLearningHelloWorldPart3.ipynb
09/17/2017 08:17 PM 126,846 DeepLearningHelloWorldPart4.ipynb
01/08/2017 10:52 AM <DIR> Hello
01/08/2017 10:52 AM <DIR> Hellocpp11
01/09/2017 04:45 PM <DIR> HelloOpenCV
09/17/2017 08:18 PM 492,240 model.h5
09/17/2017 08:18 PM 2,059 model.json
01/07/2017 12:22 PM <DIR> RemoteSystemsTempFiles
6 File(s) 854,523 bytes
8 Dir(s) 199,410,405,376 bytes free
[Optional] Step 14: Load the saved model
with open('model.json', 'r') as json_file:
    loaded_model_json = json_file.read()
loaded_model = model_from_json(loaded_model_json)
loaded_model.load_weights("model.h5")
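As an aside, a single-file alternative (assuming keras.models.load_model is available in your Keras version) stores the architecture, weights, and optimizer state together, and the model comes back already compiled:

# Alternative sketch (assumes load_model exists in your Keras version)
from keras.models import load_model

model.save('model_full.h5')                 # architecture + weights + optimizer state
loaded_model = load_model('model_full.h5')  # returns a compiled model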
[Optional] Step 15: Compile and evaluate the loaded model
loaded_model.compile(loss='categorical_crossentropy',
                     optimizer=OPTIMIZER,
                     metrics=['accuracy'])
score = loaded_model.evaluate(X_test, Y_test, verbose=VERBOSE)
print("\nTest score:", score[0])
print('Test accuracy:', score[1])
9760/10000 [============================>.] - ETA: 0s
Test score: 0.114303596618
Test accuracy: 0.9811
- mkc