# Pattern Recognition¶

In [1]:
from ipypublish import nb_setup


Pattern Recognition is the task of classifying an image into one of several different categories. Since their inception, Neural Networks have most commonly been used for Pattern Recognition, and over the years the increase in classification accuracy has served as an indicator of the state of the art in NN design. The MNIST pattern recognition problem served as a benchmark for DLN systems until recently, and in this section we introduce DLNs by describing this problem and the way in which DLNs are used to solve it.

## MNIST¶

In [2]:
#Mnist
nb_setup.images_hconcat(["DL_images/mnist.png"], width=600)

Out[2]:

The MNIST database consists of 70,000 scanned images of handwritten digits from 0 to 9, a few of which are shown in Figure Mnist. Each image has been digitized into a $28 \times 28$ grid, with each of the 784 pixels in the grid assigned a quantized grayscale value between 0 and 1, with 0 representing white and 1 representing black. These numbers are then fed into a DLN as shown in Figure mnistNN. We will go into the details of the DLN architecture in Chapter NNDeepLearning, but at a high level the reader can observe that the DLN shown in Figure mnistNN has three layers of nodes (or neurons). The layer on the left is the Input Layer and consists of 784 neurons, with each neuron assigned the grayscale value of the corresponding pixel from the input image. Note that the 2-dimensional image has been stretched out into a 1-dimensional vector before being fed into the DLN. The middle layer is called the Hidden Layer and consists of 15 neurons, while the third layer is called the Output Layer and consists of 10 neurons. As the name implies, the Output Layer indicates the output of the DLN computation: ideally, if the input corresponds to the digit $k$, $0 \leq k \leq 9$, then the $k^{th}$ output neuron should be 1 and the other 9 should be 0.

In [3]:
#mnistNN
nb_setup.images_hconcat(["DL_images/mnistNN.png"], width=600)

Out[3]:

The neurons in the Input and Hidden Layers are fully connected, which means that each neuron in the Input Layer is connected to every neuron in the Hidden Layer (and the same holds for the neurons between the Hidden and Output Layers). Note that these connections are uni-directional, i.e., they exist in the forward direction only. Each of these connections is assigned a weight, and each node in the Hidden and Output Layers is assigned a bias, so that a total of $784 \times 15 + 15 \times 10 + 15 + 10 = 11,935$ weight and bias parameters are needed to describe the network.
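This parameter count can be verified with a few lines of arithmetic (a quick sanity check, separate from the Keras code introduced later in the section):

```python
# Parameter count for the 784-15-10 fully connected network:
# one weight per connection between adjacent layers, plus one
# bias per Hidden and Output neuron.
input_nodes, hidden_nodes, output_nodes = 784, 15, 10

weights = input_nodes * hidden_nodes + hidden_nodes * output_nodes
biases = hidden_nodes + output_nodes
total = weights + biases
print(total)  # 11935
```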

In an operational network, these 11,935 weight and bias parameters have to be set so that the DLN is able to do its job. The process of choosing these parameters is known as "training the network" and uses a learning algorithm that proceeds by iteration. After the training is complete, the DLN should be able to classify images of digits that were not part of the training dataset, an ability known as the network's "Generalization Ability".

The system operates as follows:

• The 70,000 images in the MNIST database are divided into 3 groups: 50,000 images are used for training the DLN (called the training dataset), 10,000 images are used for choosing the model's hyper-parameters (called the validation dataset) and the remaining 10,000 images are used for testing (called the test dataset).

• The training process operates as follows:

1. The grayscale values for an image in the training dataset are fed into the Input Layer. The signals generated by this propagate through the network, which is called forward propagation, and the values at the Output Layer are compared with the desired output (for example, if the image is of the digit 2, then the desired output should be 0010000000). A measure of the difference between the desired and actual values is then fed back into the network and propagates backwards, using an algorithm known as Backprop. The information gleaned from this process is then used to modify all the link weights and node bias values, so that the desired and actual outputs are closer in the next iteration.

2. The process described above is repeated for each of the 50,000 images in the training set; one full pass through the training set is known as a training Epoch. The network may be trained for multiple epochs until a stopping condition is satisfied, usually that the error rate on the validation dataset falls below some threshold.

3. Other than the weights and biases, there are some other important model parameters that are part of the training process, known as hyper-parameters. Some of these hyper-parameters are used to improve the optimization algorithm during training, while others are used to improve the model's generalization ability. The main function of the validation dataset is to choose appropriate values for these hyper-parameters.

• After the network is fully trained, the 10,000 images in the test dataset are used to test the DLN's classification accuracy.
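The target encoding used in step 1 above can be sketched in a few lines (a minimal NumPy illustration; the helper name `one_hot` is ours, not part of any library):

```python
import numpy as np

def one_hot(digit, num_classes=10):
    """Desired output vector for a digit label: a 1 in
    position `digit`, and 0 everywhere else."""
    target = np.zeros(num_classes)
    target[digit] = 1.0
    return target

print(one_hot(2))  # [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
```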

## Epoch Accuracy¶

In [4]:
#epochAccuracy
nb_setup.images_hconcat(["DL_images/epochAccuracy.png"], width=1000)

Out[4]:

Figure epochAccuracy plots the accuracy of the classification process as a function of the number of Epochs, measured on the test dataset. As can be seen, the classification accuracy increases almost linearly at first, but after about 260 Epochs it does not increase beyond 82.25% or so (in other words, the NN classifies about 8,225 images correctly out of a total of 10,000 test images). The reasons why the test accuracy plateaus, and what can be done to increase it further, form the subject of Section ImprovingModelGeneralization. In practice, the best accuracy that has been achieved by a state-of-the-art NN on the MNIST classification problem is about 99.67%, i.e., only 33 mis-classifications out of 10,000!

Figure mnistNN also provides some insight into how the DLN is able to carry out the classification task, for the example in which the input is a handwritten zero. In a trained NN, four of the nodes in the Hidden Layer are tuned to recognize the presence of dark pixels in certain parts of the image, as shown at the bottom of the figure. This is done by appropriately choosing the weights on the links between the Input and Hidden Layers, which is also called filtering. As shown in the figure, the output node that corresponds to the digit 0 filters these 4 Hidden Layer nodes (by setting the weights on the links between the Hidden and Output Layers) such that its own output tends towards 1, while the outputs of the other nodes in the Output Layer tend towards 0.
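To make the filtering idea concrete: a single neuron's output is just a weighted sum of its inputs plus a bias, passed through an activation function. The following sketch uses the sigmoid activation and entirely made-up weights and pixel values (they are illustrative assumptions, not taken from the trained network):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical hidden neuron "tuned" to dark pixels in one region:
# it responds strongly when those pixels are near 1 (black).
x = np.array([0.9, 0.8, 0.95, 0.1])   # pixel intensities (illustrative)
w = np.array([2.0, 2.0, 2.0, -1.0])   # weights favoring the dark region
b = -3.0                               # bias

output = sigmoid(np.dot(w, x) + b)
print(output)  # close to 1, since the favored pixels are dark
```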

## ILSVRC¶

nb_setup.images_hconcat(["DL_images/ILSVRC.png"], width=600)

While the MNIST dataset played an important role in the early years of Deep Learning, current systems have become powerful enough to handle much more complex image classification tasks. ImageNet is an on-line dataset consisting of 16 million full color images obtained by crawling the web. These images have been labeled using Amazon's Mechanical Turk service, and some examples are given in Figure ILSVRC. A popular Machine Learning competition called the ImageNet Large-Scale Visual Recognition Challenge (ILSVRC) uses a 1.2 million image subset of these images, drawn from 1,000 different categories, with 50,000 images used for validation and 150,000 for testing. The ILSVRC competition has served as a benchmark for the best DLN models, and in recent years the performance of DLN models has exceeded that of a human test subject on this problem.

## Keras Implementation of the MNIST Classifier¶

There are several frameworks that have emerged in the last few years for implementing DLN models. One of the most popular is Keras, which runs on top of a lower-level framework called TensorFlow (both are from Google). We now show an implementation of the MNIST classifier using Keras.

In [5]:
import keras
keras.__version__
from keras import models
from keras import layers


The MNIST dataset comes pre-loaded in Keras, and the next command imports it into the model. The raw MNIST images are stored in an image format such as .png or .jpg; however, Keras has pre-processed them into the grayscale format, so that each pixel is an integer between 0 and 255. Furthermore, all the images have been converted into a tensor format that can be processed using Keras. Also note that the images have already been split into training and test datasets.

In [6]:
from keras.datasets import mnist

(train_images, train_labels), (test_images, test_labels) = mnist.load_data()


Using the shape command, we obtain the dimensions of the tensor containing the training data. As shown below, the training dataset is a 3-dimensional tensor, with the first dimension representing the number of training images and the next two dimensions representing the size of each image. Each image is a 2-dimensional tensor of size 28 x 28.

In [7]:
train_images.shape

Out[7]:
(60000, 28, 28)

The labels for the training data form an array of size 60,000. An element of this array is an integer label for the corresponding image.

In [8]:
len(train_labels)

Out[8]:
60000
In [9]:
train_labels

Out[9]:
array([5, 0, 4, ..., 5, 6, 8], dtype=uint8)

Similarly for test images:

In [10]:
test_images.shape

Out[10]:
(10000, 28, 28)
In [11]:
len(test_labels)

Out[11]:
10000
In [12]:
test_labels

Out[12]:
array([7, 2, 1, ..., 4, 5, 6], dtype=uint8)

We can also plot one of the training images by using the matplotlib command:

In [14]:
digit = train_images[4]

import matplotlib.pyplot as plt
import numpy as np
plt.imshow(digit, cmap = plt.cm.binary)
plt.show()


The contents of this image can also be displayed in matrix form. Note that each pixel is represented in the grayscale format by an integer in the range 0 to 255.

In [15]:
digit

Out[15]:
array([[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,  55,
148, 210, 253, 253, 113,  87, 148,  55,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,  87, 232,
252, 253, 189, 210, 252, 252, 253, 168,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   4,  57, 242, 252,
190,  65,   5,  12, 182, 252, 253, 116,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,  96, 252, 252, 183,
14,   0,   0,  92, 252, 252, 225,  21,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0, 132, 253, 252, 146,  14,
0,   0,   0, 215, 252, 252,  79,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0, 126, 253, 247, 176,   9,   0,
0,   8,  78, 245, 253, 129,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,  16, 232, 252, 176,   0,   0,   0,
36, 201, 252, 252, 169,  11,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,  22, 252, 252,  30,  22, 119, 197,
241, 253, 252, 251,  77,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,  16, 231, 252, 253, 252, 252, 252,
226, 227, 252, 231,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,  55, 235, 253, 217, 138,  42,
24, 192, 252, 143,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
62, 255, 253, 109,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
71, 253, 252,  21,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0, 253, 252,  21,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
71, 253, 252,  21,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
106, 253, 252,  21,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
45, 255, 253,  21,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0, 218, 252,  56,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,  96, 252, 189,  42,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,  14, 184, 252, 170,  11,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,  14, 147, 252,  42,   0,   0,   0,   0,   0,   0,   0,
0,   0],
[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
0,   0]], dtype=uint8)

The Grayscale formatted image tensors cannot be fed directly into the model, but need to be pre-processed, as follows:

1. The DLN model in Figure mnistNN can only accept input that is in the form of a 1-D array, or vector. So we need to convert the 2-D tensors into vectors, which is done using the numpy reshape command.

2. The data being fed into the NN has to be normalized so that all numbers are small in magnitude, otherwise the training does not work very well. This is done by dividing each of the grayscale pixel values by 255, so that they lie in the range [0,1] after normalization.

In [16]:
train_images = train_images.reshape((60000, 28 * 28))
train_images = train_images.astype('float32') / 255

test_images = test_images.reshape((10000, 28 * 28))
test_images = test_images.astype('float32') / 255


After re-shaping, the training dataset now consists of 60,000 images, each of which is a normalized vector of size 784.

In [17]:
train_images.shape

Out[17]:
(60000, 784)

There are two ways in which the output label can be specified in Keras:

1. Sparse Categorical: The label can be specified as an integer, as is the case for the MNIST data stored in Keras.
2. Categorical: In this case labels are specified in the 1-Hot Encoding format. For example if the label is the integer 'k' in the sparse categorical format, then it becomes an array in the categorical format, such that the $k^{th}$ entry in the array is 1, and all the other entries are 0.

The following commands convert all the labels in the training and test datasets to the categorical format.

In [19]:
from tensorflow.keras.utils import to_categorical

train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)

train_labels

Out[19]:
array([[0., 0., 0., ..., 0., 0., 0.],
[1., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
...,
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 1., 0.]], dtype=float32)

The NN model itself is specified using only 3 lines of Keras code.

• The first line specifies that this is a sequential model. In later chapters we will come across models in which the data flow is non-sequential, with feed-forward or feed-back loops.
• The second line specifies the Hidden Layer, with 15 nodes and the "ReLU" Activation (Activation functions are introduced in a later chapter). We also have to specify the shape of the input data into this layer, which is an array of size 784. The "Dense" keyword tells Keras that this is a Fully Connected layer.
• The third line specifies the Output Layer, with 10 nodes and the "Softmax" Activation.

Note that more layers can be added very simply to this model by repeating the command in Line 2.

In [20]:
network = models.Sequential()
network.add(layers.Dense(15, activation='relu', input_shape=(28 * 28,)))
network.add(layers.Dense(10, activation='softmax'))


The compile command uses the following arguments:

• Specifying the Loss Function: The Loss Function is a measure of the difference between the output of the model and the contents of the corresponding label. The "Categorical Cross Entropy" Loss Function will be used. Note that if we had left the labels in the Sparse Categorical format, then the "Sparse Categorical Cross Entropy" Loss Function would be needed.

• Specifying the Optimization Algorithm: In this case we will use 'sgd' which stands for Stochastic Gradient Descent. These algorithms are all based on Backprop, which is an efficient way of computing gradients.

• Specifying the output metrics: This gives Keras the list of output metrics to collect. In this case we collect the "accuracy" metric, which is one of the Keras specified metrics. User defined metrics can also be specified here.

In [21]:
network.compile(optimizer='sgd',
loss='categorical_crossentropy',
metrics=['accuracy'])


We are finally ready to train the model, which is done using the fit command. The fit command is invoked using the following arguments:

• The list of Training Images and Training Labels
• The number of epochs the training should be done for. Recall that each epoch corresponds to passing the entire training dataset through the model
• Typically data is not fed into the model one at a time, but in batches of size batch_size. This is done in order to improve the convergence properties of the algorithm
• The validation_split value allows the system to set aside the specified fraction of training data for validation purposes

Once the training starts, Keras provides a periodic update at the end of each epoch. This update contains the time it took to finish the epoch, as well as the training and validation loss and accuracy values.

In [22]:
history = network.fit(train_images, train_labels, epochs=500, batch_size=128, validation_split=0.2)

Epoch 1/500
375/375 [==============================] - 1s 2ms/step - loss: 1.6855 - accuracy: 0.5025 - val_loss: 1.0881 - val_accuracy: 0.7651
Epoch 2/500
375/375 [==============================] - 1s 2ms/step - loss: 0.8615 - accuracy: 0.7945 - val_loss: 0.6550 - val_accuracy: 0.8452
Epoch 3/500
375/375 [==============================] - 1s 2ms/step - loss: 0.6096 - accuracy: 0.8435 - val_loss: 0.5131 - val_accuracy: 0.8677
Epoch 4/500
375/375 [==============================] - 1s 2ms/step - loss: 0.5084 - accuracy: 0.8640 - val_loss: 0.4461 - val_accuracy: 0.8799
Epoch 5/500
375/375 [==============================] - 1s 2ms/step - loss: 0.4545 - accuracy: 0.8767 - val_loss: 0.4076 - val_accuracy: 0.8869
...
Epoch 107/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1953 - accuracy: 0.9452 - val_loss: 0.2098 - val_accuracy: 0.9409
Epoch 108/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1945 - accuracy: 0.9451 - val_loss: 0.2093 - val_accuracy: 0.9412
Epoch 109/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1940 - accuracy: 0.9451 - val_loss: 0.2091 - val_accuracy: 0.9410
Epoch 110/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1934 - accuracy: 0.9452 - val_loss: 0.2090 - val_accuracy: 0.9413
Epoch 111/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1928 - accuracy: 0.9456 - val_loss: 0.2086 - val_accuracy: 0.9414
Epoch 112/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1922 - accuracy: 0.9458 - val_loss: 0.2088 - val_accuracy: 0.9417
Epoch 113/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1916 - accuracy: 0.9461 - val_loss: 0.2084 - val_accuracy: 0.9420
Epoch 114/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1909 - accuracy: 0.9462 - val_loss: 0.2080 - val_accuracy: 0.9412
Epoch 115/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1904 - accuracy: 0.9459 - val_loss: 0.2074 - val_accuracy: 0.9424
Epoch 116/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1898 - accuracy: 0.9472 - val_loss: 0.2072 - val_accuracy: 0.9418
Epoch 117/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1892 - accuracy: 0.9467 - val_loss: 0.2079 - val_accuracy: 0.9420
Epoch 118/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1888 - accuracy: 0.9465 - val_loss: 0.2068 - val_accuracy: 0.9418
Epoch 119/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1882 - accuracy: 0.9471 - val_loss: 0.2062 - val_accuracy: 0.9418
Epoch 120/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1877 - accuracy: 0.9471 - val_loss: 0.2064 - val_accuracy: 0.9417
Epoch 121/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1871 - accuracy: 0.9474 - val_loss: 0.2064 - val_accuracy: 0.9423
Epoch 122/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1867 - accuracy: 0.9474 - val_loss: 0.2057 - val_accuracy: 0.9436
Epoch 123/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1862 - accuracy: 0.9475 - val_loss: 0.2056 - val_accuracy: 0.9433
Epoch 124/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1857 - accuracy: 0.9474 - val_loss: 0.2048 - val_accuracy: 0.9427
Epoch 125/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1852 - accuracy: 0.9477 - val_loss: 0.2046 - val_accuracy: 0.9427
Epoch 126/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1845 - accuracy: 0.9480 - val_loss: 0.2048 - val_accuracy: 0.9427
Epoch 127/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1843 - accuracy: 0.9477 - val_loss: 0.2040 - val_accuracy: 0.9434
Epoch 128/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1836 - accuracy: 0.9479 - val_loss: 0.2043 - val_accuracy: 0.9436
Epoch 129/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1832 - accuracy: 0.9482 - val_loss: 0.2036 - val_accuracy: 0.9428
Epoch 130/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1827 - accuracy: 0.9490 - val_loss: 0.2049 - val_accuracy: 0.9432
Epoch 131/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1824 - accuracy: 0.9484 - val_loss: 0.2031 - val_accuracy: 0.9430
Epoch 132/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1818 - accuracy: 0.9485 - val_loss: 0.2031 - val_accuracy: 0.9427
Epoch 133/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1814 - accuracy: 0.9490 - val_loss: 0.2033 - val_accuracy: 0.9436
Epoch 134/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1809 - accuracy: 0.9489 - val_loss: 0.2028 - val_accuracy: 0.9437
Epoch 135/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1805 - accuracy: 0.9494 - val_loss: 0.2024 - val_accuracy: 0.9440
Epoch 136/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1799 - accuracy: 0.9495 - val_loss: 0.2030 - val_accuracy: 0.9439
Epoch 137/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1796 - accuracy: 0.9494 - val_loss: 0.2025 - val_accuracy: 0.9438
Epoch 138/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1789 - accuracy: 0.9495 - val_loss: 0.2023 - val_accuracy: 0.9437
Epoch 139/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1787 - accuracy: 0.9498 - val_loss: 0.2019 - val_accuracy: 0.9436
Epoch 140/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1783 - accuracy: 0.9499 - val_loss: 0.2016 - val_accuracy: 0.9442
Epoch 141/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1777 - accuracy: 0.9502 - val_loss: 0.2023 - val_accuracy: 0.9438
Epoch 142/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1775 - accuracy: 0.9500 - val_loss: 0.2009 - val_accuracy: 0.9447
Epoch 143/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1770 - accuracy: 0.9500 - val_loss: 0.2014 - val_accuracy: 0.9445
Epoch 144/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1766 - accuracy: 0.9503 - val_loss: 0.2010 - val_accuracy: 0.9449
Epoch 145/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1762 - accuracy: 0.9506 - val_loss: 0.2012 - val_accuracy: 0.9439
Epoch 146/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1757 - accuracy: 0.9506 - val_loss: 0.2013 - val_accuracy: 0.9448
Epoch 147/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1755 - accuracy: 0.9508 - val_loss: 0.2000 - val_accuracy: 0.9451
Epoch 148/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1751 - accuracy: 0.9507 - val_loss: 0.2007 - val_accuracy: 0.9452
Epoch 149/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1745 - accuracy: 0.9509 - val_loss: 0.2002 - val_accuracy: 0.9451
Epoch 150/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1743 - accuracy: 0.9507 - val_loss: 0.2000 - val_accuracy: 0.9445
Epoch 151/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1738 - accuracy: 0.9511 - val_loss: 0.2006 - val_accuracy: 0.9443
Epoch 152/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1735 - accuracy: 0.9515 - val_loss: 0.1996 - val_accuracy: 0.9452
Epoch 153/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1730 - accuracy: 0.9513 - val_loss: 0.1994 - val_accuracy: 0.9460
Epoch 154/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1728 - accuracy: 0.9517 - val_loss: 0.1990 - val_accuracy: 0.9453
Epoch 155/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1724 - accuracy: 0.9517 - val_loss: 0.1987 - val_accuracy: 0.9463
Epoch 156/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1719 - accuracy: 0.9519 - val_loss: 0.2001 - val_accuracy: 0.9451
Epoch 157/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1718 - accuracy: 0.9516 - val_loss: 0.1994 - val_accuracy: 0.9454
Epoch 158/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1713 - accuracy: 0.9521 - val_loss: 0.1991 - val_accuracy: 0.9451
Epoch 159/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1710 - accuracy: 0.9523 - val_loss: 0.1990 - val_accuracy: 0.9451
Epoch 160/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1707 - accuracy: 0.9519 - val_loss: 0.1986 - val_accuracy: 0.9462
Epoch 161/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1703 - accuracy: 0.9527 - val_loss: 0.1986 - val_accuracy: 0.9462
Epoch 162/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1699 - accuracy: 0.9520 - val_loss: 0.1983 - val_accuracy: 0.9466
Epoch 163/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1696 - accuracy: 0.9522 - val_loss: 0.1979 - val_accuracy: 0.9465
Epoch 164/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1692 - accuracy: 0.9528 - val_loss: 0.1981 - val_accuracy: 0.9461
Epoch 165/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1690 - accuracy: 0.9523 - val_loss: 0.1977 - val_accuracy: 0.9464
Epoch 166/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1685 - accuracy: 0.9526 - val_loss: 0.1979 - val_accuracy: 0.9473
Epoch 167/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1682 - accuracy: 0.9528 - val_loss: 0.1990 - val_accuracy: 0.9463
Epoch 168/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1680 - accuracy: 0.9529 - val_loss: 0.1972 - val_accuracy: 0.9463
Epoch 169/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1676 - accuracy: 0.9529 - val_loss: 0.1976 - val_accuracy: 0.9460
Epoch 170/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1672 - accuracy: 0.9532 - val_loss: 0.1978 - val_accuracy: 0.9457
Epoch 171/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1668 - accuracy: 0.9532 - val_loss: 0.1970 - val_accuracy: 0.9470
Epoch 172/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1667 - accuracy: 0.9529 - val_loss: 0.1972 - val_accuracy: 0.9470
Epoch 173/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1664 - accuracy: 0.9532 - val_loss: 0.1974 - val_accuracy: 0.9463
Epoch 174/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1659 - accuracy: 0.9532 - val_loss: 0.1969 - val_accuracy: 0.9466
Epoch 175/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1657 - accuracy: 0.9535 - val_loss: 0.1977 - val_accuracy: 0.9459
Epoch 176/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1654 - accuracy: 0.9537 - val_loss: 0.1963 - val_accuracy: 0.9466
Epoch 177/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1651 - accuracy: 0.9534 - val_loss: 0.1963 - val_accuracy: 0.9469
Epoch 178/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1647 - accuracy: 0.9541 - val_loss: 0.1963 - val_accuracy: 0.9464
Epoch 179/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1644 - accuracy: 0.9539 - val_loss: 0.1963 - val_accuracy: 0.9468
Epoch 180/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1640 - accuracy: 0.9544 - val_loss: 0.1957 - val_accuracy: 0.9471
Epoch 181/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1639 - accuracy: 0.9535 - val_loss: 0.1961 - val_accuracy: 0.9467
Epoch 182/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1635 - accuracy: 0.9541 - val_loss: 0.1961 - val_accuracy: 0.9467
Epoch 183/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1632 - accuracy: 0.9542 - val_loss: 0.1961 - val_accuracy: 0.9457
Epoch 184/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1629 - accuracy: 0.9542 - val_loss: 0.1957 - val_accuracy: 0.9465
Epoch 185/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1627 - accuracy: 0.9539 - val_loss: 0.1964 - val_accuracy: 0.9460
Epoch 186/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1625 - accuracy: 0.9542 - val_loss: 0.1960 - val_accuracy: 0.9463
Epoch 187/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1620 - accuracy: 0.9541 - val_loss: 0.1963 - val_accuracy: 0.9463
Epoch 188/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1618 - accuracy: 0.9543 - val_loss: 0.1953 - val_accuracy: 0.9469
Epoch 189/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1615 - accuracy: 0.9544 - val_loss: 0.1955 - val_accuracy: 0.9470
Epoch 190/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1612 - accuracy: 0.9542 - val_loss: 0.1968 - val_accuracy: 0.9461
Epoch 191/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1610 - accuracy: 0.9547 - val_loss: 0.1958 - val_accuracy: 0.9466
Epoch 192/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1608 - accuracy: 0.9548 - val_loss: 0.1952 - val_accuracy: 0.9473
Epoch 193/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1605 - accuracy: 0.9547 - val_loss: 0.1958 - val_accuracy: 0.9467
Epoch 194/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1601 - accuracy: 0.9544 - val_loss: 0.1951 - val_accuracy: 0.9463
Epoch 195/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1601 - accuracy: 0.9548 - val_loss: 0.1955 - val_accuracy: 0.9469
Epoch 196/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1597 - accuracy: 0.9549 - val_loss: 0.1949 - val_accuracy: 0.9465
Epoch 197/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1593 - accuracy: 0.9548 - val_loss: 0.1948 - val_accuracy: 0.9467
Epoch 198/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1593 - accuracy: 0.9549 - val_loss: 0.1949 - val_accuracy: 0.9470
Epoch 199/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1589 - accuracy: 0.9550 - val_loss: 0.1953 - val_accuracy: 0.9465
Epoch 200/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1587 - accuracy: 0.9554 - val_loss: 0.1949 - val_accuracy: 0.9465
Epoch 201/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1584 - accuracy: 0.9552 - val_loss: 0.1946 - val_accuracy: 0.9472
Epoch 202/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1581 - accuracy: 0.9558 - val_loss: 0.1952 - val_accuracy: 0.9467
Epoch 203/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1579 - accuracy: 0.9553 - val_loss: 0.1949 - val_accuracy: 0.9463
Epoch 204/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1576 - accuracy: 0.9556 - val_loss: 0.1943 - val_accuracy: 0.9472
Epoch 205/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1574 - accuracy: 0.9555 - val_loss: 0.1947 - val_accuracy: 0.9470
Epoch 206/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1572 - accuracy: 0.9555 - val_loss: 0.1946 - val_accuracy: 0.9467
Epoch 207/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1569 - accuracy: 0.9557 - val_loss: 0.1953 - val_accuracy: 0.9466
Epoch 208/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1568 - accuracy: 0.9555 - val_loss: 0.1945 - val_accuracy: 0.9470
Epoch 209/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1565 - accuracy: 0.9560 - val_loss: 0.1944 - val_accuracy: 0.9469
Epoch 210/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1561 - accuracy: 0.9561 - val_loss: 0.1945 - val_accuracy: 0.9468
Epoch 211/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1560 - accuracy: 0.9560 - val_loss: 0.1946 - val_accuracy: 0.9470
Epoch 212/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1558 - accuracy: 0.9560 - val_loss: 0.1948 - val_accuracy: 0.9467
Epoch 213/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1556 - accuracy: 0.9564 - val_loss: 0.1938 - val_accuracy: 0.9467
Epoch 214/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1554 - accuracy: 0.9562 - val_loss: 0.1950 - val_accuracy: 0.9467
Epoch 215/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1553 - accuracy: 0.9563 - val_loss: 0.1938 - val_accuracy: 0.9467
Epoch 216/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1549 - accuracy: 0.9560 - val_loss: 0.1939 - val_accuracy: 0.9463
Epoch 217/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1547 - accuracy: 0.9566 - val_loss: 0.1938 - val_accuracy: 0.9463
Epoch 218/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1544 - accuracy: 0.9563 - val_loss: 0.1943 - val_accuracy: 0.9464
Epoch 219/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1543 - accuracy: 0.9566 - val_loss: 0.1940 - val_accuracy: 0.9476
Epoch 220/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1539 - accuracy: 0.9565 - val_loss: 0.1942 - val_accuracy: 0.9468
Epoch 221/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1538 - accuracy: 0.9566 - val_loss: 0.1938 - val_accuracy: 0.9464
Epoch 222/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1537 - accuracy: 0.9565 - val_loss: 0.1936 - val_accuracy: 0.9471
Epoch 223/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1533 - accuracy: 0.9566 - val_loss: 0.1942 - val_accuracy: 0.9467
Epoch 224/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1532 - accuracy: 0.9566 - val_loss: 0.1939 - val_accuracy: 0.9473
Epoch 225/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1530 - accuracy: 0.9567 - val_loss: 0.1947 - val_accuracy: 0.9472
Epoch 226/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1528 - accuracy: 0.9568 - val_loss: 0.1937 - val_accuracy: 0.9468
Epoch 227/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1526 - accuracy: 0.9570 - val_loss: 0.1934 - val_accuracy: 0.9473
Epoch 228/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1523 - accuracy: 0.9570 - val_loss: 0.1940 - val_accuracy: 0.9468
Epoch 229/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1522 - accuracy: 0.9569 - val_loss: 0.1935 - val_accuracy: 0.9468
Epoch 230/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1519 - accuracy: 0.9572 - val_loss: 0.1947 - val_accuracy: 0.9463
Epoch 231/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1518 - accuracy: 0.9575 - val_loss: 0.1938 - val_accuracy: 0.9466
Epoch 232/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1517 - accuracy: 0.9571 - val_loss: 0.1940 - val_accuracy: 0.9468
Epoch 233/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1514 - accuracy: 0.9571 - val_loss: 0.1940 - val_accuracy: 0.9461
Epoch 234/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1511 - accuracy: 0.9575 - val_loss: 0.1938 - val_accuracy: 0.9468
Epoch 235/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1510 - accuracy: 0.9573 - val_loss: 0.1935 - val_accuracy: 0.9463
Epoch 236/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1506 - accuracy: 0.9573 - val_loss: 0.1944 - val_accuracy: 0.9462
Epoch 237/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1506 - accuracy: 0.9575 - val_loss: 0.1932 - val_accuracy: 0.9470
Epoch 238/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1504 - accuracy: 0.9571 - val_loss: 0.1935 - val_accuracy: 0.9469
Epoch 239/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1503 - accuracy: 0.9573 - val_loss: 0.1934 - val_accuracy: 0.9472
Epoch 240/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1500 - accuracy: 0.9579 - val_loss: 0.1930 - val_accuracy: 0.9477
Epoch 241/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1499 - accuracy: 0.9579 - val_loss: 0.1933 - val_accuracy: 0.9473
Epoch 242/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1496 - accuracy: 0.9578 - val_loss: 0.1930 - val_accuracy: 0.9474
Epoch 243/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1495 - accuracy: 0.9580 - val_loss: 0.1937 - val_accuracy: 0.9477
Epoch 244/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1493 - accuracy: 0.9582 - val_loss: 0.1930 - val_accuracy: 0.9471
Epoch 245/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1491 - accuracy: 0.9581 - val_loss: 0.1934 - val_accuracy: 0.9466
Epoch 246/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1490 - accuracy: 0.9582 - val_loss: 0.1932 - val_accuracy: 0.9464
Epoch 247/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1488 - accuracy: 0.9579 - val_loss: 0.1937 - val_accuracy: 0.9475
Epoch 248/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1485 - accuracy: 0.9586 - val_loss: 0.1933 - val_accuracy: 0.9473
Epoch 249/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1484 - accuracy: 0.9582 - val_loss: 0.1933 - val_accuracy: 0.9467
Epoch 250/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1482 - accuracy: 0.9585 - val_loss: 0.1939 - val_accuracy: 0.9477
Epoch 251/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1480 - accuracy: 0.9584 - val_loss: 0.1937 - val_accuracy: 0.9464
Epoch 252/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1479 - accuracy: 0.9584 - val_loss: 0.1939 - val_accuracy: 0.9471
Epoch 253/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1476 - accuracy: 0.9584 - val_loss: 0.1931 - val_accuracy: 0.9475
Epoch 254/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1475 - accuracy: 0.9585 - val_loss: 0.1934 - val_accuracy: 0.9462
Epoch 255/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1472 - accuracy: 0.9582 - val_loss: 0.1934 - val_accuracy: 0.9473
Epoch 256/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1471 - accuracy: 0.9585 - val_loss: 0.1933 - val_accuracy: 0.9479
Epoch 257/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1470 - accuracy: 0.9585 - val_loss: 0.1938 - val_accuracy: 0.9475
Epoch 258/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1468 - accuracy: 0.9585 - val_loss: 0.1937 - val_accuracy: 0.9473
Epoch 259/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1465 - accuracy: 0.9586 - val_loss: 0.1935 - val_accuracy: 0.9467
Epoch 260/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1464 - accuracy: 0.9589 - val_loss: 0.1932 - val_accuracy: 0.9473
Epoch 261/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1462 - accuracy: 0.9590 - val_loss: 0.1932 - val_accuracy: 0.9471
Epoch 262/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1460 - accuracy: 0.9589 - val_loss: 0.1934 - val_accuracy: 0.9474
Epoch 263/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1460 - accuracy: 0.9588 - val_loss: 0.1932 - val_accuracy: 0.9467
Epoch 264/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1458 - accuracy: 0.9591 - val_loss: 0.1932 - val_accuracy: 0.9470
Epoch 265/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1455 - accuracy: 0.9594 - val_loss: 0.1936 - val_accuracy: 0.9471
Epoch 266/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1455 - accuracy: 0.9596 - val_loss: 0.1928 - val_accuracy: 0.9474
Epoch 267/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1454 - accuracy: 0.9592 - val_loss: 0.1926 - val_accuracy: 0.9470
Epoch 268/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1451 - accuracy: 0.9595 - val_loss: 0.1933 - val_accuracy: 0.9472
Epoch 269/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1450 - accuracy: 0.9592 - val_loss: 0.1938 - val_accuracy: 0.9472
Epoch 270/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1447 - accuracy: 0.9595 - val_loss: 0.1943 - val_accuracy: 0.9475
Epoch 271/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1447 - accuracy: 0.9592 - val_loss: 0.1932 - val_accuracy: 0.9477
Epoch 272/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1445 - accuracy: 0.9596 - val_loss: 0.1929 - val_accuracy: 0.9465
Epoch 273/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1443 - accuracy: 0.9598 - val_loss: 0.1935 - val_accuracy: 0.9463
Epoch 274/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1441 - accuracy: 0.9596 - val_loss: 0.1930 - val_accuracy: 0.9475
Epoch 275/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1441 - accuracy: 0.9594 - val_loss: 0.1931 - val_accuracy: 0.9470
Epoch 276/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1438 - accuracy: 0.9597 - val_loss: 0.1933 - val_accuracy: 0.9479
Epoch 277/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1437 - accuracy: 0.9594 - val_loss: 0.1934 - val_accuracy: 0.9474
Epoch 278/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1434 - accuracy: 0.9601 - val_loss: 0.1932 - val_accuracy: 0.9471
Epoch 279/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1434 - accuracy: 0.9598 - val_loss: 0.1934 - val_accuracy: 0.9473
Epoch 280/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1432 - accuracy: 0.9597 - val_loss: 0.1932 - val_accuracy: 0.9470
Epoch 281/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1430 - accuracy: 0.9599 - val_loss: 0.1932 - val_accuracy: 0.9470
Epoch 282/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1429 - accuracy: 0.9603 - val_loss: 0.1939 - val_accuracy: 0.9467
Epoch 283/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1428 - accuracy: 0.9598 - val_loss: 0.1936 - val_accuracy: 0.9476
Epoch 284/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1425 - accuracy: 0.9595 - val_loss: 0.1933 - val_accuracy: 0.9470
Epoch 285/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1426 - accuracy: 0.9598 - val_loss: 0.1932 - val_accuracy: 0.9471
Epoch 286/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1423 - accuracy: 0.9601 - val_loss: 0.1933 - val_accuracy: 0.9467
Epoch 287/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1420 - accuracy: 0.9601 - val_loss: 0.1929 - val_accuracy: 0.9469
Epoch 288/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1420 - accuracy: 0.9600 - val_loss: 0.1937 - val_accuracy: 0.9464
Epoch 289/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1418 - accuracy: 0.9601 - val_loss: 0.1933 - val_accuracy: 0.9465
Epoch 290/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1417 - accuracy: 0.9606 - val_loss: 0.1934 - val_accuracy: 0.9471
Epoch 291/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1415 - accuracy: 0.9603 - val_loss: 0.1932 - val_accuracy: 0.9474
Epoch 292/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1415 - accuracy: 0.9605 - val_loss: 0.1927 - val_accuracy: 0.9469
Epoch 293/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1413 - accuracy: 0.9603 - val_loss: 0.1931 - val_accuracy: 0.9467
Epoch 294/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1410 - accuracy: 0.9600 - val_loss: 0.1929 - val_accuracy: 0.9470
Epoch 295/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1408 - accuracy: 0.9606 - val_loss: 0.1928 - val_accuracy: 0.9473
Epoch 296/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1408 - accuracy: 0.9607 - val_loss: 0.1936 - val_accuracy: 0.9471
Epoch 297/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1407 - accuracy: 0.9608 - val_loss: 0.1929 - val_accuracy: 0.9474
Epoch 298/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1405 - accuracy: 0.9612 - val_loss: 0.1938 - val_accuracy: 0.9469
Epoch 299/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1403 - accuracy: 0.9607 - val_loss: 0.1932 - val_accuracy: 0.9467
Epoch 300/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1403 - accuracy: 0.9610 - val_loss: 0.1931 - val_accuracy: 0.9473
Epoch 301/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1402 - accuracy: 0.9612 - val_loss: 0.1931 - val_accuracy: 0.9469
Epoch 302/500
... [output truncated: epochs 302-499 show the training loss drifting slowly down from 0.140 to 0.119, while the validation loss plateaus near 0.193-0.195]
Epoch 500/500
375/375 [==============================] - 1s 2ms/step - loss: 0.1186 - accuracy: 0.9668 - val_loss: 0.1951 - val_accuracy: 0.9474


The following command provides a list of the performance data that Keras collected during training: in this case, the training and validation loss and accuracy values, recorded at the end of each training epoch.

In [23]:
history_dict = history.history
history_dict.keys()

Out[23]:
dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])
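Beyond listing its keys, the history dictionary can be queried directly, for example to find the epoch at which the validation loss was lowest. A minimal sketch of the idea, using a small made-up history dictionary in place of the real `history.history`:

```python
# Hypothetical history dictionary, standing in for history.history
history_dict = {
    'loss':         [0.50, 0.30, 0.20, 0.15, 0.12],
    'val_loss':     [0.52, 0.35, 0.28, 0.27, 0.29],
    'accuracy':     [0.85, 0.91, 0.94, 0.95, 0.96],
    'val_accuracy': [0.84, 0.90, 0.92, 0.93, 0.92],
}

# 1-based index of the epoch with the lowest validation loss
best_epoch = min(range(len(history_dict['val_loss'])),
                 key=lambda i: history_dict['val_loss'][i]) + 1
print(best_epoch)                                   # epoch 4 in this example
print(history_dict['val_accuracy'][best_epoch - 1]) # 0.93
```

The same lookup works on the real `history.history`, since it has exactly these four keys.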

We can plot the performance data collected during training using matplotlib. The plots shown below are typical of the loss and accuracy curves as a function of the number of epochs. These plots are extremely important for interpreting and debugging the model, and in later chapters we will explain how this is done. By comparing the training and validation curves, we can gauge how well the model generalizes beyond the training dataset.
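One practical use of these curves is to stop training once the validation loss stops improving, rather than running for a fixed number of epochs. Keras offers this through its EarlyStopping callback; the sketch below implements the same patience-based rule in plain Python to show the idea (the `val_losses` list is made up for illustration):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the 1-based epoch at which training would stop: the first
    epoch after val_loss has failed to improve for `patience` consecutive
    epochs, or the last epoch if that never happens."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:        # validation loss improved: reset the counter
            best = loss
            wait = 0
        else:                  # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)

# Made-up validation losses that plateau after epoch 4
val_losses = [0.52, 0.35, 0.28, 0.27, 0.29, 0.30, 0.28]
print(early_stop_epoch(val_losses, patience=3))  # stops at epoch 7
```

With a rule like this, the long flat stretch of validation loss seen in the training log above would have ended the run hundreds of epochs earlier.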

In [24]:
import matplotlib.pyplot as plt

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']

epochs = range(1, len(acc) + 1)

# "bo" is for "blue dot"
plt.plot(epochs, loss, 'bo', label='Training loss')
# b is for "solid blue line"
plt.plot(epochs, val_loss, 'b', label='Validation loss')
plt.title('Training and validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()

plt.show()

In [25]:
plt.clf()   # clear figure
acc_values = history_dict['accuracy']
val_acc_values = history_dict['val_accuracy']

plt.plot(epochs, acc_values, 'bo', label='Training acc')
plt.plot(epochs, val_acc_values, 'b', label='Validation acc')
plt.title('Training and validation accuracy')
plt.xlabel('Epochs')
plt.ylabel('Accuracy')
plt.legend()

plt.show()


Once we are satisfied with the training results, we can run the test data through the model using the evaluate command.

In [26]:
test_loss, test_acc = network.evaluate(test_images, test_labels)

313/313 [==============================] - 0s 940us/step - loss: 0.1777 - accuracy: 0.9493

In [27]:
print('test_acc:', test_acc)

test_acc: 0.9492999911308289


The summary command provides a useful overview of the model: it lists all the layers, the shape of each layer's output tensor, and the number of parameters per layer.

In [28]:
network.summary()

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense (Dense)                (None, 15)                11775
_________________________________________________________________
dense_1 (Dense)              (None, 10)                160
=================================================================
Total params: 11,935
Trainable params: 11,935
Non-trainable params: 0
_________________________________________________________________
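The parameter counts in the summary can be checked by hand: a Dense layer with $n_{in}$ inputs and $n_{out}$ units has $n_{in} \times n_{out}$ weights plus $n_{out}$ biases. For the network above:

```python
def dense_params(n_in, n_out):
    # weights + biases of a fully connected (Dense) layer
    return n_in * n_out + n_out

hidden = dense_params(784, 15)  # input pixels -> hidden layer
output = dense_params(15, 10)   # hidden layer -> output layer
print(hidden, output, hidden + output)  # 11775 160 11935
```

These match the Param # column and the total of 11,935 reported by summary.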
