Keras and the Last Number Problem#
Let’s see if we can do better than our simple hidden-layer network on the last number problem.
import numpy as np
import keras
from keras.utils import to_categorical
We’ll use the same data class as before:
class ModelDataCategorical:
    """this is the model data for our "last number" training set.  We
    produce input of length N, consisting of numbers 0-9 and store
    the result in a 10-element array as categorical data.
    """

    def __init__(self, N=10):
        self.N = N

        # our model input data
        self.x = np.random.randint(0, high=10, size=N)
        self.x_scaled = self.x / 10 + 0.05

        # our scaled model output data
        self.y = np.array([self.x[-1]])
        self.y_scaled = np.zeros(10) + 0.01
        self.y_scaled[self.x[-1]] = 0.99

    def interpret_result(self, out):
        """take the network output and return the number we predict"""
        return np.argmax(out)
For Keras, we need to pack the input and output data into arrays. We’ll use the
Keras to_categorical() function to convert the integer labels into one-hot (categorical) form.
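to_categorical() simply converts integer class labels into one-hot vectors. A minimal NumPy sketch of the same transformation (illustrative only — the real Keras function also handles dtype and shape options):

```python
import numpy as np

def one_hot(labels, num_classes):
    """One-hot encode integer labels, mimicking keras.utils.to_categorical."""
    labels = np.asarray(labels, dtype=int).ravel()
    out = np.zeros((labels.size, num_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

# the label 7 becomes a 10-element vector with a 1 in slot 7
encoded = one_hot([7, 2], 10)
print(encoded)
```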
Let’s make both a training set and a test set:
x_train = []
y_train = []
for _ in range(10000):
    m = ModelDataCategorical()
    x_train.append(m.x_scaled)
    y_train.append(m.y)

x_train = np.asarray(x_train)
y_train = to_categorical(y_train, 10)
x_test = []
y_test = []
for _ in range(1000):
    m = ModelDataCategorical()
    x_test.append(m.x_scaled)
    y_test.append(m.y)

x_test = np.asarray(x_test)
y_test = to_categorical(y_test, 10)
Check to make sure the data looks like we expect:
x_train[0]
array([0.25, 0.85, 0.95, 0.45, 0.05, 0.35, 0.85, 0.25, 0.55, 0.25])
y_train[0]
array([0., 0., 1., 0., 0., 0., 0., 0., 0., 0.])
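We can also verify the encode/decode round trip without touching the network: the argmax of the scaled target should always recover the last digit of the input. A standalone sketch that copies the target-building logic from ModelDataCategorical (the names here are illustrative, not part of the original class):

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(100):
    x = rng.integers(0, 10, size=10)   # raw digits, like ModelDataCategorical.x
    y_scaled = np.zeros(10) + 0.01     # scaled target, like .y_scaled
    y_scaled[x[-1]] = 0.99

    # interpret_result() is just argmax, which should recover the last digit
    assert np.argmax(y_scaled) == x[-1]

print("round trip OK")
```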
Creating the network#
Now let’s build our network. We’ll use just a single hidden layer, but instead of the sigmoid used before, we’ll use the ReLU activation for the hidden layer and softmax for the output.
from keras.models import Sequential
from keras.layers import Input, Dense, Dropout, Activation
from keras.optimizers import RMSprop
model = Sequential()
model.add(Input((10,)))
model.add(Dense(100, activation="relu"))
model.add(Dropout(0.1))
model.add(Dense(10, activation="softmax"))
rms = RMSprop()
model.compile(loss='categorical_crossentropy',
              optimizer=rms, metrics=['accuracy'])
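The categorical cross-entropy loss is just the negative log of the probability the network assigns to the true class. A quick NumPy illustration (not the Keras implementation, which also handles batching and numerical safety):

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred):
    """Cross-entropy between a one-hot target and a probability vector."""
    return -np.sum(y_true * np.log(y_pred))

y_true = np.zeros(10)
y_true[2] = 1.0                 # the true class is 2

y_pred = np.full(10, 0.05)
y_pred[2] = 0.55                # network puts 55% probability on class 2

loss = categorical_crossentropy(y_true, y_pred)
print(loss)                     # -log(0.55), about 0.6
```

As the predicted probability on the true class approaches 1, the loss goes to 0; this is the quantity the optimizer drives down during training.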
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ dense (Dense)                   │ (None, 100)            │         1,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 100)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 2,110 (8.24 KB)
Trainable params: 2,110 (8.24 KB)
Non-trainable params: 0 (0.00 B)
Now we have ~2k parameters to fit: the hidden layer contributes 10 × 100 weights + 100 biases = 1,100, and the output layer 100 × 10 weights + 10 biases = 1,010.
Training#
Now we can train, validating against the test set each epoch to see how we do:
epochs = 100
batch_size = 256
model.fit(x_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=(x_test, y_test), verbose=2)
Epoch 1/100
40/40 - 0s - 6ms/step - accuracy: 0.1633 - loss: 2.2587 - val_accuracy: 0.1810 - val_loss: 2.2096
Epoch 2/100
40/40 - 0s - 7ms/step - accuracy: 0.2092 - loss: 2.1616 - val_accuracy: 0.2290 - val_loss: 2.1183
Epoch 3/100
40/40 - 0s - 8ms/step - accuracy: 0.2506 - loss: 2.0603 - val_accuracy: 0.2560 - val_loss: 2.0232
Epoch 4/100
40/40 - 0s - 5ms/step - accuracy: 0.2739 - loss: 1.9646 - val_accuracy: 0.2290 - val_loss: 1.9344
Epoch 5/100
40/40 - 0s - 5ms/step - accuracy: 0.2987 - loss: 1.8733 - val_accuracy: 0.3020 - val_loss: 1.8490
Epoch 6/100
40/40 - 0s - 7ms/step - accuracy: 0.3259 - loss: 1.7904 - val_accuracy: 0.3680 - val_loss: 1.7661
Epoch 7/100
40/40 - 0s - 6ms/step - accuracy: 0.3549 - loss: 1.7124 - val_accuracy: 0.3630 - val_loss: 1.6952
Epoch 8/100
40/40 - 0s - 7ms/step - accuracy: 0.3887 - loss: 1.6415 - val_accuracy: 0.4710 - val_loss: 1.6215
Epoch 9/100
40/40 - 0s - 5ms/step - accuracy: 0.4160 - loss: 1.5783 - val_accuracy: 0.3780 - val_loss: 1.5660
Epoch 10/100
40/40 - 0s - 5ms/step - accuracy: 0.4341 - loss: 1.5179 - val_accuracy: 0.4560 - val_loss: 1.5075
Epoch 11/100
40/40 - 0s - 7ms/step - accuracy: 0.4750 - loss: 1.4636 - val_accuracy: 0.4850 - val_loss: 1.4570
Epoch 12/100
40/40 - 0s - 8ms/step - accuracy: 0.4868 - loss: 1.4141 - val_accuracy: 0.5180 - val_loss: 1.4066
Epoch 13/100
40/40 - 0s - 5ms/step - accuracy: 0.5161 - loss: 1.3653 - val_accuracy: 0.5420 - val_loss: 1.3590
Epoch 14/100
40/40 - 0s - 5ms/step - accuracy: 0.5310 - loss: 1.3226 - val_accuracy: 0.5860 - val_loss: 1.3150
Epoch 15/100
40/40 - 0s - 10ms/step - accuracy: 0.5494 - loss: 1.2811 - val_accuracy: 0.6130 - val_loss: 1.2725
Epoch 16/100
40/40 - 0s - 6ms/step - accuracy: 0.5762 - loss: 1.2440 - val_accuracy: 0.6240 - val_loss: 1.2374
Epoch 17/100
40/40 - 0s - 7ms/step - accuracy: 0.5907 - loss: 1.2044 - val_accuracy: 0.6460 - val_loss: 1.2020
Epoch 18/100
40/40 - 0s - 5ms/step - accuracy: 0.6071 - loss: 1.1710 - val_accuracy: 0.6360 - val_loss: 1.1659
Epoch 19/100
40/40 - 0s - 5ms/step - accuracy: 0.6210 - loss: 1.1404 - val_accuracy: 0.7240 - val_loss: 1.1314
Epoch 20/100
40/40 - 0s - 8ms/step - accuracy: 0.6440 - loss: 1.1062 - val_accuracy: 0.6100 - val_loss: 1.1091
Epoch 21/100
40/40 - 0s - 8ms/step - accuracy: 0.6573 - loss: 1.0767 - val_accuracy: 0.7320 - val_loss: 1.0762
Epoch 22/100
40/40 - 0s - 5ms/step - accuracy: 0.6726 - loss: 1.0524 - val_accuracy: 0.6970 - val_loss: 1.0506
Epoch 23/100
40/40 - 0s - 5ms/step - accuracy: 0.6934 - loss: 1.0212 - val_accuracy: 0.7860 - val_loss: 1.0125
Epoch 24/100
40/40 - 0s - 8ms/step - accuracy: 0.7042 - loss: 0.9943 - val_accuracy: 0.7670 - val_loss: 0.9928
Epoch 25/100
40/40 - 0s - 6ms/step - accuracy: 0.7214 - loss: 0.9710 - val_accuracy: 0.7990 - val_loss: 0.9680
Epoch 26/100
40/40 - 0s - 7ms/step - accuracy: 0.7410 - loss: 0.9449 - val_accuracy: 0.7550 - val_loss: 0.9462
Epoch 27/100
40/40 - 0s - 5ms/step - accuracy: 0.7498 - loss: 0.9207 - val_accuracy: 0.8300 - val_loss: 0.9192
Epoch 28/100
40/40 - 0s - 5ms/step - accuracy: 0.7685 - loss: 0.8971 - val_accuracy: 0.8600 - val_loss: 0.8924
Epoch 29/100
40/40 - 0s - 8ms/step - accuracy: 0.7798 - loss: 0.8770 - val_accuracy: 0.8840 - val_loss: 0.8696
Epoch 30/100
40/40 - 0s - 6ms/step - accuracy: 0.8049 - loss: 0.8500 - val_accuracy: 0.8690 - val_loss: 0.8510
Epoch 31/100
40/40 - 0s - 7ms/step - accuracy: 0.8152 - loss: 0.8304 - val_accuracy: 0.8930 - val_loss: 0.8247
Epoch 32/100
40/40 - 0s - 5ms/step - accuracy: 0.8382 - loss: 0.8061 - val_accuracy: 0.9210 - val_loss: 0.7995
Epoch 33/100
40/40 - 0s - 8ms/step - accuracy: 0.8474 - loss: 0.7865 - val_accuracy: 0.9270 - val_loss: 0.7767
Epoch 34/100
40/40 - 0s - 6ms/step - accuracy: 0.8565 - loss: 0.7663 - val_accuracy: 0.8790 - val_loss: 0.7679
Epoch 35/100
40/40 - 0s - 7ms/step - accuracy: 0.8674 - loss: 0.7458 - val_accuracy: 0.9140 - val_loss: 0.7458
Epoch 36/100
40/40 - 0s - 5ms/step - accuracy: 0.8809 - loss: 0.7258 - val_accuracy: 0.8810 - val_loss: 0.7316
Epoch 37/100
40/40 - 0s - 5ms/step - accuracy: 0.8863 - loss: 0.7063 - val_accuracy: 0.9280 - val_loss: 0.7092
Epoch 38/100
40/40 - 0s - 8ms/step - accuracy: 0.9017 - loss: 0.6868 - val_accuracy: 0.9600 - val_loss: 0.6834
Epoch 39/100
40/40 - 0s - 6ms/step - accuracy: 0.9073 - loss: 0.6715 - val_accuracy: 0.9520 - val_loss: 0.6658
Epoch 40/100
40/40 - 0s - 7ms/step - accuracy: 0.9172 - loss: 0.6525 - val_accuracy: 0.9670 - val_loss: 0.6563
Epoch 41/100
40/40 - 0s - 5ms/step - accuracy: 0.9267 - loss: 0.6339 - val_accuracy: 0.9670 - val_loss: 0.6307
Epoch 42/100
40/40 - 0s - 8ms/step - accuracy: 0.9331 - loss: 0.6168 - val_accuracy: 0.9820 - val_loss: 0.6102
Epoch 43/100
40/40 - 0s - 6ms/step - accuracy: 0.9396 - loss: 0.6010 - val_accuracy: 0.9880 - val_loss: 0.5923
Epoch 44/100
40/40 - 0s - 7ms/step - accuracy: 0.9466 - loss: 0.5833 - val_accuracy: 0.9900 - val_loss: 0.5756
Epoch 45/100
40/40 - 0s - 5ms/step - accuracy: 0.9572 - loss: 0.5664 - val_accuracy: 0.9940 - val_loss: 0.5658
Epoch 46/100
40/40 - 0s - 5ms/step - accuracy: 0.9587 - loss: 0.5504 - val_accuracy: 0.9710 - val_loss: 0.5522
Epoch 47/100
40/40 - 0s - 8ms/step - accuracy: 0.9658 - loss: 0.5331 - val_accuracy: 0.9840 - val_loss: 0.5312
Epoch 48/100
40/40 - 0s - 6ms/step - accuracy: 0.9696 - loss: 0.5184 - val_accuracy: 0.9940 - val_loss: 0.5154
Epoch 49/100
40/40 - 0s - 7ms/step - accuracy: 0.9702 - loss: 0.5042 - val_accuracy: 0.9970 - val_loss: 0.4945
Epoch 50/100
40/40 - 0s - 5ms/step - accuracy: 0.9727 - loss: 0.4897 - val_accuracy: 0.9910 - val_loss: 0.4852
Epoch 51/100
40/40 - 0s - 5ms/step - accuracy: 0.9782 - loss: 0.4739 - val_accuracy: 0.9970 - val_loss: 0.4696
Epoch 52/100
40/40 - 0s - 8ms/step - accuracy: 0.9766 - loss: 0.4620 - val_accuracy: 1.0000 - val_loss: 0.4516
Epoch 53/100
40/40 - 0s - 8ms/step - accuracy: 0.9783 - loss: 0.4469 - val_accuracy: 1.0000 - val_loss: 0.4379
Epoch 54/100
40/40 - 0s - 5ms/step - accuracy: 0.9827 - loss: 0.4342 - val_accuracy: 1.0000 - val_loss: 0.4283
Epoch 55/100
40/40 - 0s - 5ms/step - accuracy: 0.9836 - loss: 0.4211 - val_accuracy: 0.9930 - val_loss: 0.4238
Epoch 56/100
40/40 - 0s - 9ms/step - accuracy: 0.9858 - loss: 0.4082 - val_accuracy: 1.0000 - val_loss: 0.4113
Epoch 57/100
40/40 - 0s - 6ms/step - accuracy: 0.9869 - loss: 0.3961 - val_accuracy: 1.0000 - val_loss: 0.3864
Epoch 58/100
40/40 - 0s - 7ms/step - accuracy: 0.9873 - loss: 0.3837 - val_accuracy: 1.0000 - val_loss: 0.3781
Epoch 59/100
40/40 - 0s - 5ms/step - accuracy: 0.9883 - loss: 0.3717 - val_accuracy: 1.0000 - val_loss: 0.3652
Epoch 60/100
40/40 - 0s - 5ms/step - accuracy: 0.9906 - loss: 0.3613 - val_accuracy: 0.9990 - val_loss: 0.3571
Epoch 61/100
40/40 - 0s - 8ms/step - accuracy: 0.9903 - loss: 0.3479 - val_accuracy: 1.0000 - val_loss: 0.3378
Epoch 62/100
40/40 - 0s - 8ms/step - accuracy: 0.9926 - loss: 0.3370 - val_accuracy: 1.0000 - val_loss: 0.3254
Epoch 63/100
40/40 - 0s - 5ms/step - accuracy: 0.9929 - loss: 0.3252 - val_accuracy: 1.0000 - val_loss: 0.3113
Epoch 64/100
40/40 - 0s - 5ms/step - accuracy: 0.9922 - loss: 0.3159 - val_accuracy: 1.0000 - val_loss: 0.3109
Epoch 65/100
40/40 - 0s - 8ms/step - accuracy: 0.9942 - loss: 0.3051 - val_accuracy: 1.0000 - val_loss: 0.2918
Epoch 66/100
40/40 - 0s - 6ms/step - accuracy: 0.9950 - loss: 0.2947 - val_accuracy: 1.0000 - val_loss: 0.2841
Epoch 67/100
40/40 - 0s - 7ms/step - accuracy: 0.9934 - loss: 0.2848 - val_accuracy: 1.0000 - val_loss: 0.2682
Epoch 68/100
40/40 - 0s - 5ms/step - accuracy: 0.9954 - loss: 0.2754 - val_accuracy: 1.0000 - val_loss: 0.2674
Epoch 69/100
40/40 - 0s - 5ms/step - accuracy: 0.9948 - loss: 0.2670 - val_accuracy: 1.0000 - val_loss: 0.2596
Epoch 70/100
40/40 - 0s - 8ms/step - accuracy: 0.9959 - loss: 0.2572 - val_accuracy: 1.0000 - val_loss: 0.2466
Epoch 71/100
40/40 - 0s - 8ms/step - accuracy: 0.9954 - loss: 0.2482 - val_accuracy: 1.0000 - val_loss: 0.2340
Epoch 72/100
40/40 - 0s - 5ms/step - accuracy: 0.9977 - loss: 0.2404 - val_accuracy: 1.0000 - val_loss: 0.2240
Epoch 73/100
40/40 - 0s - 5ms/step - accuracy: 0.9961 - loss: 0.2324 - val_accuracy: 1.0000 - val_loss: 0.2170
Epoch 74/100
40/40 - 0s - 8ms/step - accuracy: 0.9969 - loss: 0.2256 - val_accuracy: 1.0000 - val_loss: 0.2169
Epoch 75/100
40/40 - 0s - 6ms/step - accuracy: 0.9965 - loss: 0.2182 - val_accuracy: 1.0000 - val_loss: 0.2023
Epoch 76/100
40/40 - 0s - 7ms/step - accuracy: 0.9974 - loss: 0.2094 - val_accuracy: 1.0000 - val_loss: 0.1932
Epoch 77/100
40/40 - 0s - 5ms/step - accuracy: 0.9979 - loss: 0.2018 - val_accuracy: 1.0000 - val_loss: 0.1869
Epoch 78/100
40/40 - 0s - 5ms/step - accuracy: 0.9971 - loss: 0.1942 - val_accuracy: 1.0000 - val_loss: 0.1765
Epoch 79/100
40/40 - 0s - 7ms/step - accuracy: 0.9977 - loss: 0.1874 - val_accuracy: 1.0000 - val_loss: 0.1688
Epoch 80/100
40/40 - 0s - 8ms/step - accuracy: 0.9981 - loss: 0.1799 - val_accuracy: 1.0000 - val_loss: 0.1635
Epoch 81/100
40/40 - 0s - 5ms/step - accuracy: 0.9973 - loss: 0.1743 - val_accuracy: 1.0000 - val_loss: 0.1565
Epoch 82/100
40/40 - 0s - 5ms/step - accuracy: 0.9980 - loss: 0.1684 - val_accuracy: 1.0000 - val_loss: 0.1513
Epoch 83/100
40/40 - 0s - 8ms/step - accuracy: 0.9985 - loss: 0.1623 - val_accuracy: 1.0000 - val_loss: 0.1447
Epoch 84/100
40/40 - 0s - 6ms/step - accuracy: 0.9986 - loss: 0.1557 - val_accuracy: 1.0000 - val_loss: 0.1443
Epoch 85/100
40/40 - 0s - 7ms/step - accuracy: 0.9989 - loss: 0.1512 - val_accuracy: 1.0000 - val_loss: 0.1343
Epoch 86/100
40/40 - 0s - 5ms/step - accuracy: 0.9989 - loss: 0.1455 - val_accuracy: 1.0000 - val_loss: 0.1299
Epoch 87/100
40/40 - 0s - 5ms/step - accuracy: 0.9986 - loss: 0.1410 - val_accuracy: 1.0000 - val_loss: 0.1257
Epoch 88/100
40/40 - 0s - 9ms/step - accuracy: 0.9983 - loss: 0.1354 - val_accuracy: 1.0000 - val_loss: 0.1181
Epoch 89/100
40/40 - 0s - 6ms/step - accuracy: 0.9984 - loss: 0.1308 - val_accuracy: 1.0000 - val_loss: 0.1160
Epoch 90/100
40/40 - 0s - 7ms/step - accuracy: 0.9986 - loss: 0.1255 - val_accuracy: 1.0000 - val_loss: 0.1101
Epoch 91/100
40/40 - 0s - 5ms/step - accuracy: 0.9991 - loss: 0.1213 - val_accuracy: 1.0000 - val_loss: 0.1033
Epoch 92/100
40/40 - 0s - 8ms/step - accuracy: 0.9992 - loss: 0.1159 - val_accuracy: 1.0000 - val_loss: 0.0988
Epoch 93/100
40/40 - 0s - 6ms/step - accuracy: 0.9992 - loss: 0.1117 - val_accuracy: 1.0000 - val_loss: 0.0983
Epoch 94/100
40/40 - 0s - 7ms/step - accuracy: 0.9988 - loss: 0.1082 - val_accuracy: 1.0000 - val_loss: 0.0917
Epoch 95/100
40/40 - 0s - 5ms/step - accuracy: 0.9986 - loss: 0.1043 - val_accuracy: 1.0000 - val_loss: 0.0895
Epoch 96/100
40/40 - 0s - 5ms/step - accuracy: 0.9990 - loss: 0.1007 - val_accuracy: 1.0000 - val_loss: 0.0835
Epoch 97/100
40/40 - 0s - 8ms/step - accuracy: 0.9987 - loss: 0.0967 - val_accuracy: 1.0000 - val_loss: 0.0785
Epoch 98/100
40/40 - 0s - 6ms/step - accuracy: 0.9993 - loss: 0.0930 - val_accuracy: 1.0000 - val_loss: 0.0828
Epoch 99/100
40/40 - 0s - 7ms/step - accuracy: 0.9989 - loss: 0.0895 - val_accuracy: 1.0000 - val_loss: 0.0758
Epoch 100/100
40/40 - 0s - 5ms/step - accuracy: 0.9991 - loss: 0.0856 - val_accuracy: 1.0000 - val_loss: 0.0699
<keras.src.callbacks.history.History at 0x7f75d7cf4d70>
As we see, the network is now essentially perfect, reaching 100% accuracy on the validation set.