Keras and the Last Number Problem#
Let’s see if we can do better than our simple hidden-layer network on the last number problem.
import numpy as np
import keras
from keras.utils import to_categorical
We’ll use the same data class as before:
class ModelDataCategorical:
    """this is the model data for our "last number" training set.  We
    produce input of length N, consisting of numbers 0-9 and store
    the result in a 10-element array as categorical data.
    """
    def __init__(self, N=10):
        self.N = N

        # our model input data
        self.x = np.random.randint(0, high=10, size=N)
        self.x_scaled = self.x / 10 + 0.05

        # our model output data: y is the raw digit, y_scaled the
        # near-one-hot encoding of it
        self.y = np.array([self.x[-1]])
        self.y_scaled = np.zeros(10) + 0.01
        self.y_scaled[self.x[-1]] = 0.99

    def interpret_result(self, out):
        """take the network output and return the number we predict"""
        return np.argmax(out)
For Keras, we need to pack the scaled data (both input and output) into arrays. We’ll use
the Keras to_categorical() to make the data categorical.
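If you want to see what `to_categorical()` does without pulling in Keras, a minimal numpy equivalent (the name `to_categorical_np` is ours) looks like:

```python
import numpy as np

def to_categorical_np(y, num_classes):
    """minimal numpy stand-in for keras.utils.to_categorical"""
    y = np.asarray(y, dtype=int).ravel()
    out = np.zeros((y.size, num_classes))
    out[np.arange(y.size), y] = 1.0   # set a single 1 in each row
    return out

print(to_categorical_np([3, 7, 0], 10))
```

Each label becomes a row with a 1 in the column matching the label and 0 everywhere else.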
Let’s make both a training set and a test set
x_train = []
y_train = []
for _ in range(10000):
    m = ModelDataCategorical()
    x_train.append(m.x_scaled)
    y_train.append(m.y)

x_train = np.asarray(x_train)
y_train = to_categorical(y_train, 10)

x_test = []
y_test = []
for _ in range(1000):
    m = ModelDataCategorical()
    x_test.append(m.x_scaled)
    y_test.append(m.y)

x_test = np.asarray(x_test)
y_test = to_categorical(y_test, 10)
Check to make sure the data looks like we expect:
x_train[0]
array([0.65, 0.95, 0.05, 0.35, 0.55, 0.55, 0.65, 0.25, 0.15, 0.65])
y_train[0]
array([0., 0., 0., 0., 0., 0., 1., 0., 0., 0.])
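We can also verify that the scaling is invertible — decoding a scaled input should give back the original digits (the sample array here is just an illustration):

```python
import numpy as np

x = np.array([6, 9, 0, 3, 5, 5, 6, 2, 1, 6])    # raw digits
x_scaled = x / 10 + 0.05                        # as stored in x_train
decoded = np.round((x_scaled - 0.05) * 10).astype(int)
print(np.array_equal(decoded, x))  # True
```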
Creating the network#
Now let’s build our network. We’ll again use a single hidden layer, but instead of the sigmoid used before, we’ll use the ReLU activation for the hidden layer and softmax for the output.
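For reference, both activations can be written in a few lines of numpy (a sketch of the math, not Keras’s actual implementation):

```python
import numpy as np

def relu(x):
    """rectified linear unit: clamp negative values to zero"""
    return np.maximum(0.0, x)

def softmax(z):
    """normalize scores into probabilities (shifted for numerical stability)"""
    e = np.exp(z - z.max())
    return e / e.sum()

z = np.array([-1.0, 0.0, 2.0])
print(relu(z))           # negative entries clamped to zero
print(softmax(z))        # entries sum to 1
```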
from keras.models import Sequential
from keras.layers import Input, Dense, Dropout, Activation
from keras.optimizers import RMSprop
model = Sequential()
model.add(Input((10,)))
model.add(Dense(100, activation="relu"))
model.add(Dropout(0.1))
model.add(Dense(10, activation="softmax"))
rms = RMSprop()
model.compile(loss='categorical_crossentropy',
              optimizer=rms, metrics=['accuracy'])
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ dense (Dense)                   │ (None, 100)            │         1,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 100)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 2,110 (8.24 KB)
Trainable params: 2,110 (8.24 KB)
Non-trainable params: 0 (0.00 B)
Now we have ~2,100 parameters to fit.
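The count in the summary is easy to verify by hand: each `Dense` layer has inputs × outputs weights plus one bias per output:

```python
n_in, n_hidden, n_out = 10, 100, 10

dense1 = n_in * n_hidden + n_hidden   # 1,100 parameters
dense2 = n_hidden * n_out + n_out     # 1,010 parameters
print(dense1 + dense2)                # 2110, matching model.summary()
```

The `Dropout` layer contributes no parameters — it only randomly zeroes activations during training.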
Training#
Now we can train, validating against the test set each epoch to see how we do:
epochs = 100
batch_size = 256
model.fit(x_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=(x_test, y_test), verbose=2)
Epoch 1/100
40/40 - 0s - 5ms/step - accuracy: 0.1497 - loss: 2.2489 - val_accuracy: 0.1760 - val_loss: 2.2039
Epoch 2/100
40/40 - 0s - 6ms/step - accuracy: 0.2098 - loss: 2.1455 - val_accuracy: 0.2190 - val_loss: 2.1062
Epoch 3/100
40/40 - 0s - 7ms/step - accuracy: 0.2304 - loss: 2.0400 - val_accuracy: 0.2190 - val_loss: 2.0093
Epoch 4/100
40/40 - 0s - 5ms/step - accuracy: 0.2544 - loss: 1.9399 - val_accuracy: 0.2340 - val_loss: 1.9112
Epoch 5/100
40/40 - 0s - 5ms/step - accuracy: 0.2878 - loss: 1.8433 - val_accuracy: 0.3300 - val_loss: 1.8161
Epoch 6/100
40/40 - 0s - 7ms/step - accuracy: 0.3200 - loss: 1.7581 - val_accuracy: 0.3310 - val_loss: 1.7332
Epoch 7/100
40/40 - 0s - 6ms/step - accuracy: 0.3607 - loss: 1.6774 - val_accuracy: 0.4190 - val_loss: 1.6609
Epoch 8/100
40/40 - 0s - 6ms/step - accuracy: 0.3926 - loss: 1.6118 - val_accuracy: 0.3950 - val_loss: 1.5988
Epoch 9/100
40/40 - 0s - 5ms/step - accuracy: 0.4220 - loss: 1.5487 - val_accuracy: 0.4700 - val_loss: 1.5321
Epoch 10/100
40/40 - 0s - 5ms/step - accuracy: 0.4496 - loss: 1.4912 - val_accuracy: 0.4930 - val_loss: 1.4747
Epoch 11/100
40/40 - 0s - 7ms/step - accuracy: 0.4652 - loss: 1.4401 - val_accuracy: 0.5290 - val_loss: 1.4241
Epoch 12/100
40/40 - 0s - 7ms/step - accuracy: 0.5003 - loss: 1.3879 - val_accuracy: 0.5860 - val_loss: 1.3768
Epoch 13/100
40/40 - 0s - 5ms/step - accuracy: 0.5220 - loss: 1.3414 - val_accuracy: 0.5780 - val_loss: 1.3318
Epoch 14/100
40/40 - 0s - 5ms/step - accuracy: 0.5434 - loss: 1.2990 - val_accuracy: 0.6490 - val_loss: 1.2876
Epoch 15/100
40/40 - 0s - 7ms/step - accuracy: 0.5631 - loss: 1.2585 - val_accuracy: 0.6410 - val_loss: 1.2393
Epoch 16/100
40/40 - 0s - 5ms/step - accuracy: 0.5811 - loss: 1.2202 - val_accuracy: 0.6600 - val_loss: 1.2045
Epoch 17/100
40/40 - 0s - 7ms/step - accuracy: 0.6027 - loss: 1.1816 - val_accuracy: 0.6440 - val_loss: 1.1713
Epoch 18/100
40/40 - 0s - 5ms/step - accuracy: 0.6142 - loss: 1.1519 - val_accuracy: 0.6780 - val_loss: 1.1431
Epoch 19/100
40/40 - 0s - 5ms/step - accuracy: 0.6356 - loss: 1.1190 - val_accuracy: 0.7090 - val_loss: 1.1017
Epoch 20/100
40/40 - 0s - 7ms/step - accuracy: 0.6633 - loss: 1.0846 - val_accuracy: 0.6650 - val_loss: 1.0766
Epoch 21/100
40/40 - 0s - 7ms/step - accuracy: 0.6689 - loss: 1.0592 - val_accuracy: 0.7510 - val_loss: 1.0459
Epoch 22/100
40/40 - 0s - 5ms/step - accuracy: 0.6836 - loss: 1.0300 - val_accuracy: 0.7650 - val_loss: 1.0152
Epoch 23/100
40/40 - 0s - 5ms/step - accuracy: 0.6962 - loss: 1.0052 - val_accuracy: 0.7310 - val_loss: 0.9898
Epoch 24/100
40/40 - 0s - 7ms/step - accuracy: 0.7133 - loss: 0.9785 - val_accuracy: 0.7680 - val_loss: 0.9619
Epoch 25/100
40/40 - 0s - 6ms/step - accuracy: 0.7318 - loss: 0.9518 - val_accuracy: 0.7770 - val_loss: 0.9408
Epoch 26/100
40/40 - 0s - 7ms/step - accuracy: 0.7455 - loss: 0.9283 - val_accuracy: 0.8510 - val_loss: 0.9079
Epoch 27/100
40/40 - 0s - 5ms/step - accuracy: 0.7621 - loss: 0.9049 - val_accuracy: 0.8350 - val_loss: 0.8858
Epoch 28/100
40/40 - 0s - 5ms/step - accuracy: 0.7653 - loss: 0.8832 - val_accuracy: 0.8130 - val_loss: 0.8640
Epoch 29/100
40/40 - 0s - 7ms/step - accuracy: 0.7853 - loss: 0.8585 - val_accuracy: 0.8080 - val_loss: 0.8467
Epoch 30/100
40/40 - 0s - 6ms/step - accuracy: 0.7999 - loss: 0.8349 - val_accuracy: 0.8830 - val_loss: 0.8173
Epoch 31/100
40/40 - 0s - 6ms/step - accuracy: 0.8195 - loss: 0.8102 - val_accuracy: 0.8770 - val_loss: 0.7967
Epoch 32/100
40/40 - 0s - 5ms/step - accuracy: 0.8277 - loss: 0.7931 - val_accuracy: 0.8760 - val_loss: 0.7785
Epoch 33/100
40/40 - 0s - 7ms/step - accuracy: 0.8414 - loss: 0.7708 - val_accuracy: 0.8890 - val_loss: 0.7548
Epoch 34/100
40/40 - 0s - 5ms/step - accuracy: 0.8530 - loss: 0.7486 - val_accuracy: 0.9180 - val_loss: 0.7340
Epoch 35/100
40/40 - 0s - 7ms/step - accuracy: 0.8638 - loss: 0.7275 - val_accuracy: 0.9130 - val_loss: 0.7102
Epoch 36/100
40/40 - 0s - 5ms/step - accuracy: 0.8785 - loss: 0.7090 - val_accuracy: 0.9370 - val_loss: 0.6924
Epoch 37/100
40/40 - 0s - 5ms/step - accuracy: 0.8894 - loss: 0.6892 - val_accuracy: 0.9560 - val_loss: 0.6736
Epoch 38/100
40/40 - 0s - 8ms/step - accuracy: 0.8918 - loss: 0.6713 - val_accuracy: 0.8610 - val_loss: 0.6692
Epoch 39/100
40/40 - 0s - 6ms/step - accuracy: 0.9025 - loss: 0.6514 - val_accuracy: 0.9590 - val_loss: 0.6340
Epoch 40/100
40/40 - 0s - 6ms/step - accuracy: 0.9155 - loss: 0.6350 - val_accuracy: 0.9130 - val_loss: 0.6257
Epoch 41/100
40/40 - 0s - 5ms/step - accuracy: 0.9165 - loss: 0.6187 - val_accuracy: 0.9610 - val_loss: 0.6035
Epoch 42/100
40/40 - 0s - 5ms/step - accuracy: 0.9273 - loss: 0.6008 - val_accuracy: 0.9660 - val_loss: 0.5880
Epoch 43/100
40/40 - 0s - 8ms/step - accuracy: 0.9352 - loss: 0.5852 - val_accuracy: 0.9610 - val_loss: 0.5703
Epoch 44/100
40/40 - 0s - 7ms/step - accuracy: 0.9419 - loss: 0.5671 - val_accuracy: 0.9830 - val_loss: 0.5568
Epoch 45/100
40/40 - 0s - 5ms/step - accuracy: 0.9499 - loss: 0.5527 - val_accuracy: 0.9920 - val_loss: 0.5327
Epoch 46/100
40/40 - 0s - 5ms/step - accuracy: 0.9511 - loss: 0.5387 - val_accuracy: 0.9810 - val_loss: 0.5218
Epoch 47/100
40/40 - 0s - 7ms/step - accuracy: 0.9564 - loss: 0.5218 - val_accuracy: 0.9870 - val_loss: 0.5081
Epoch 48/100
40/40 - 0s - 6ms/step - accuracy: 0.9620 - loss: 0.5057 - val_accuracy: 0.9830 - val_loss: 0.4926
Epoch 49/100
40/40 - 0s - 6ms/step - accuracy: 0.9697 - loss: 0.4898 - val_accuracy: 0.9910 - val_loss: 0.4772
Epoch 50/100
40/40 - 0s - 5ms/step - accuracy: 0.9704 - loss: 0.4781 - val_accuracy: 0.9860 - val_loss: 0.4669
Epoch 51/100
40/40 - 0s - 5ms/step - accuracy: 0.9719 - loss: 0.4650 - val_accuracy: 0.9920 - val_loss: 0.4494
Epoch 52/100
40/40 - 0s - 7ms/step - accuracy: 0.9777 - loss: 0.4492 - val_accuracy: 0.9990 - val_loss: 0.4323
Epoch 53/100
40/40 - 0s - 7ms/step - accuracy: 0.9775 - loss: 0.4361 - val_accuracy: 0.9970 - val_loss: 0.4242
Epoch 54/100
40/40 - 0s - 5ms/step - accuracy: 0.9825 - loss: 0.4215 - val_accuracy: 0.9960 - val_loss: 0.4170
Epoch 55/100
40/40 - 0s - 5ms/step - accuracy: 0.9804 - loss: 0.4146 - val_accuracy: 1.0000 - val_loss: 0.3965
Epoch 56/100
40/40 - 0s - 7ms/step - accuracy: 0.9857 - loss: 0.3993 - val_accuracy: 0.9980 - val_loss: 0.3891
Epoch 57/100
40/40 - 0s - 6ms/step - accuracy: 0.9870 - loss: 0.3874 - val_accuracy: 0.9990 - val_loss: 0.3769
Epoch 58/100
40/40 - 0s - 7ms/step - accuracy: 0.9867 - loss: 0.3770 - val_accuracy: 1.0000 - val_loss: 0.3634
Epoch 59/100
40/40 - 0s - 5ms/step - accuracy: 0.9882 - loss: 0.3648 - val_accuracy: 1.0000 - val_loss: 0.3556
Epoch 60/100
40/40 - 0s - 5ms/step - accuracy: 0.9895 - loss: 0.3545 - val_accuracy: 1.0000 - val_loss: 0.3442
Epoch 61/100
40/40 - 0s - 8ms/step - accuracy: 0.9913 - loss: 0.3425 - val_accuracy: 1.0000 - val_loss: 0.3291
Epoch 62/100
40/40 - 0s - 8ms/step - accuracy: 0.9919 - loss: 0.3318 - val_accuracy: 1.0000 - val_loss: 0.3191
Epoch 63/100
40/40 - 0s - 5ms/step - accuracy: 0.9932 - loss: 0.3204 - val_accuracy: 1.0000 - val_loss: 0.3077
Epoch 64/100
40/40 - 0s - 5ms/step - accuracy: 0.9928 - loss: 0.3109 - val_accuracy: 1.0000 - val_loss: 0.2990
Epoch 65/100
40/40 - 0s - 8ms/step - accuracy: 0.9929 - loss: 0.3007 - val_accuracy: 1.0000 - val_loss: 0.2885
Epoch 66/100
40/40 - 0s - 7ms/step - accuracy: 0.9949 - loss: 0.2909 - val_accuracy: 1.0000 - val_loss: 0.2793
Epoch 67/100
40/40 - 0s - 8ms/step - accuracy: 0.9951 - loss: 0.2808 - val_accuracy: 1.0000 - val_loss: 0.2652
Epoch 68/100
40/40 - 0s - 5ms/step - accuracy: 0.9946 - loss: 0.2711 - val_accuracy: 1.0000 - val_loss: 0.2633
Epoch 69/100
40/40 - 0s - 5ms/step - accuracy: 0.9954 - loss: 0.2623 - val_accuracy: 1.0000 - val_loss: 0.2565
Epoch 70/100
40/40 - 0s - 11ms/step - accuracy: 0.9963 - loss: 0.2544 - val_accuracy: 1.0000 - val_loss: 0.2411
Epoch 71/100
40/40 - 0s - 9ms/step - accuracy: 0.9963 - loss: 0.2447 - val_accuracy: 1.0000 - val_loss: 0.2301
Epoch 72/100
40/40 - 0s - 6ms/step - accuracy: 0.9978 - loss: 0.2359 - val_accuracy: 1.0000 - val_loss: 0.2283
Epoch 73/100
40/40 - 0s - 6ms/step - accuracy: 0.9972 - loss: 0.2287 - val_accuracy: 1.0000 - val_loss: 0.2163
Epoch 74/100
40/40 - 0s - 10ms/step - accuracy: 0.9976 - loss: 0.2205 - val_accuracy: 1.0000 - val_loss: 0.2041
Epoch 75/100
40/40 - 0s - 7ms/step - accuracy: 0.9981 - loss: 0.2121 - val_accuracy: 1.0000 - val_loss: 0.2008
Epoch 76/100
40/40 - 0s - 8ms/step - accuracy: 0.9980 - loss: 0.2053 - val_accuracy: 1.0000 - val_loss: 0.1947
Epoch 77/100
40/40 - 0s - 5ms/step - accuracy: 0.9978 - loss: 0.1981 - val_accuracy: 1.0000 - val_loss: 0.1820
Epoch 78/100
40/40 - 0s - 6ms/step - accuracy: 0.9984 - loss: 0.1898 - val_accuracy: 1.0000 - val_loss: 0.1739
Epoch 79/100
40/40 - 0s - 10ms/step - accuracy: 0.9984 - loss: 0.1837 - val_accuracy: 1.0000 - val_loss: 0.1657
Epoch 80/100
40/40 - 0s - 7ms/step - accuracy: 0.9976 - loss: 0.1773 - val_accuracy: 1.0000 - val_loss: 0.1632
Epoch 81/100
40/40 - 0s - 7ms/step - accuracy: 0.9985 - loss: 0.1714 - val_accuracy: 1.0000 - val_loss: 0.1573
Epoch 82/100
40/40 - 0s - 5ms/step - accuracy: 0.9992 - loss: 0.1646 - val_accuracy: 1.0000 - val_loss: 0.1513
Epoch 83/100
40/40 - 0s - 10ms/step - accuracy: 0.9992 - loss: 0.1577 - val_accuracy: 1.0000 - val_loss: 0.1462
Epoch 84/100
40/40 - 0s - 7ms/step - accuracy: 0.9987 - loss: 0.1533 - val_accuracy: 1.0000 - val_loss: 0.1366
Epoch 85/100
40/40 - 0s - 8ms/step - accuracy: 0.9992 - loss: 0.1466 - val_accuracy: 1.0000 - val_loss: 0.1350
Epoch 86/100
40/40 - 0s - 6ms/step - accuracy: 0.9988 - loss: 0.1426 - val_accuracy: 1.0000 - val_loss: 0.1336
Epoch 87/100
40/40 - 0s - 6ms/step - accuracy: 0.9983 - loss: 0.1381 - val_accuracy: 1.0000 - val_loss: 0.1244
Epoch 88/100
40/40 - 0s - 10ms/step - accuracy: 0.9989 - loss: 0.1316 - val_accuracy: 1.0000 - val_loss: 0.1206
Epoch 89/100
40/40 - 0s - 7ms/step - accuracy: 0.9994 - loss: 0.1267 - val_accuracy: 1.0000 - val_loss: 0.1145
Epoch 90/100
40/40 - 0s - 7ms/step - accuracy: 0.9993 - loss: 0.1220 - val_accuracy: 1.0000 - val_loss: 0.1080
Epoch 91/100
40/40 - 0s - 5ms/step - accuracy: 0.9993 - loss: 0.1180 - val_accuracy: 1.0000 - val_loss: 0.1012
Epoch 92/100
40/40 - 0s - 9ms/step - accuracy: 0.9984 - loss: 0.1133 - val_accuracy: 1.0000 - val_loss: 0.0993
Epoch 93/100
40/40 - 0s - 7ms/step - accuracy: 0.9996 - loss: 0.1093 - val_accuracy: 1.0000 - val_loss: 0.0935
Epoch 94/100
40/40 - 0s - 8ms/step - accuracy: 0.9993 - loss: 0.1045 - val_accuracy: 1.0000 - val_loss: 0.0909
Epoch 95/100
40/40 - 0s - 5ms/step - accuracy: 0.9996 - loss: 0.0996 - val_accuracy: 1.0000 - val_loss: 0.0883
Epoch 96/100
40/40 - 0s - 5ms/step - accuracy: 0.9994 - loss: 0.0960 - val_accuracy: 1.0000 - val_loss: 0.0838
Epoch 97/100
40/40 - 0s - 9ms/step - accuracy: 0.9996 - loss: 0.0924 - val_accuracy: 1.0000 - val_loss: 0.0790
Epoch 98/100
40/40 - 0s - 7ms/step - accuracy: 0.9999 - loss: 0.0890 - val_accuracy: 1.0000 - val_loss: 0.0760
Epoch 99/100
40/40 - 0s - 7ms/step - accuracy: 0.9992 - loss: 0.0860 - val_accuracy: 1.0000 - val_loss: 0.0707
Epoch 100/100
40/40 - 0s - 5ms/step - accuracy: 0.9996 - loss: 0.0816 - val_accuracy: 1.0000 - val_loss: 0.0676
<keras.src.callbacks.history.History at 0x7f5d493e9160>
As the validation accuracy shows, the network now predicts the last number essentially perfectly — it reaches 100% accuracy on the test set.