Keras and the Last Number Problem#
Let’s see if we can do better than our simple hidden layer NN with the last number problem.
import numpy as np
import keras
from keras.utils import to_categorical
We’ll use the same data class
class ModelDataCategorical:
    """this is the model data for our "last number" training set.  We
    produce input of length N, consisting of numbers 0-9 and store
    the result in a 10-element array as categorical data.
    """
    def __init__(self, N=10):
        self.N = N

        # our model input data
        self.x = np.random.randint(0, high=10, size=N)
        self.x_scaled = self.x / 10 + 0.05

        # our scaled model output data
        self.y = np.array([self.x[-1]])
        self.y_scaled = np.zeros(10) + 0.01
        self.y_scaled[self.x[-1]] = 0.99

    def interpret_result(self, out):
        """take the network output and return the number we predict"""
        return np.argmax(out)
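As a quick sanity check (a minimal sketch; the random digits will differ on every run), we can construct a single instance and confirm that the categorical target encodes the last digit:
m = ModelDataCategorical()
print(m.x)                              # e.g. [4 9 2 ...]; the last digit is the answer
print(m.y)                              # that same digit, stored as a 1-element array
print(m.interpret_result(m.y_scaled))   # argmax of the target recovers the digit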
For Keras, we need to pack the data into arrays: the scaled digits as the input and the (unscaled) last digit as the label. We’ll use the Keras to_categorical() to convert the labels to a one-hot (categorical) encoding.
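For example, to_categorical() turns an integer label into a one-hot vector (a small illustration):
to_categorical(np.array([7]), 10)   # a length-10 one-hot vector with a 1 in slot 7 (shape (1, 10))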
Let’s make both a training set and a test set
x_train = []
y_train = []
for _ in range(10000):
    m = ModelDataCategorical()
    x_train.append(m.x_scaled)
    y_train.append(m.y)

x_train = np.asarray(x_train)
y_train = to_categorical(y_train, 10)
x_test = []
y_test = []
for _ in range(1000):
    m = ModelDataCategorical()
    x_test.append(m.x_scaled)
    y_test.append(m.y)

x_test = np.asarray(x_test)
y_test = to_categorical(y_test, 10)
Check to make sure the data looks like we expect:
x_train[0]
array([0.45, 0.85, 0.65, 0.35, 0.35, 0.95, 0.05, 0.55, 0.45, 0.75])
y_train[0]
array([0., 0., 0., 0., 0., 0., 0., 1., 0., 0.])
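We can also check the shapes, and that each one-hot label points at the last digit of its input. Undoing the x/10 + 0.05 scaling recovers the original digits (a small sketch):
print(x_train.shape, y_train.shape)                       # (10000, 10) (10000, 10)
digits = np.rint((x_train[0] - 0.05) * 10).astype(int)    # undo the scaling
print(digits[-1], np.argmax(y_train[0]))                  # both give the last digit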
Creating the network#
Now let’s build our network. We’ll use just a single hidden layer, but instead of the sigmoid we used before, we’ll use ReLU for the hidden layer and softmax for the output.
from keras.models import Sequential
from keras.layers import Input, Dense, Dropout, Activation
from keras.optimizers import RMSprop
model = Sequential()
model.add(Input((10,)))
model.add(Dense(100, activation="relu"))
model.add(Dropout(0.1))
model.add(Dense(10, activation="softmax"))
rms = RMSprop()
model.compile(loss='categorical_crossentropy',
              optimizer=rms, metrics=['accuracy'])
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ dense (Dense)                   │ (None, 100)            │         1,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 100)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 2,110 (8.24 KB)
Trainable params: 2,110 (8.24 KB)
Non-trainable params: 0 (0.00 B)
Now we have ~ 2k parameters to fit.
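That count follows directly from the layer sizes (a quick check of the arithmetic):
n_hidden = 10 * 100 + 100    # weights + biases into the 100-node hidden layer
n_output = 100 * 10 + 10     # weights + biases into the 10-node output layer
print(n_hidden + n_output)   # 2110, matching the summary above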
Training#
Now we can train, evaluating against the test set after each epoch to see how we do.
epochs = 100
batch_size = 256
model.fit(x_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=(x_test, y_test), verbose=2)
Epoch 1/100
40/40 - 0s - 6ms/step - accuracy: 0.1397 - loss: 2.2792 - val_accuracy: 0.1930 - val_loss: 2.2348
Epoch 2/100
40/40 - 0s - 7ms/step - accuracy: 0.2177 - loss: 2.1919 - val_accuracy: 0.2470 - val_loss: 2.1493
Epoch 3/100
40/40 - 0s - 6ms/step - accuracy: 0.2550 - loss: 2.0961 - val_accuracy: 0.2540 - val_loss: 2.0557
Epoch 4/100
40/40 - 0s - 7ms/step - accuracy: 0.2779 - loss: 1.9945 - val_accuracy: 0.2660 - val_loss: 1.9528
Epoch 5/100
40/40 - 0s - 6ms/step - accuracy: 0.3042 - loss: 1.8995 - val_accuracy: 0.3200 - val_loss: 1.8591
Epoch 6/100
40/40 - 0s - 8ms/step - accuracy: 0.3314 - loss: 1.8087 - val_accuracy: 0.3750 - val_loss: 1.7674
Epoch 7/100
40/40 - 0s - 6ms/step - accuracy: 0.3631 - loss: 1.7247 - val_accuracy: 0.3330 - val_loss: 1.6963
Epoch 8/100
40/40 - 0s - 8ms/step - accuracy: 0.3804 - loss: 1.6538 - val_accuracy: 0.3960 - val_loss: 1.6231
Epoch 9/100
40/40 - 0s - 5ms/step - accuracy: 0.4125 - loss: 1.5875 - val_accuracy: 0.4380 - val_loss: 1.5564
Epoch 10/100
40/40 - 0s - 6ms/step - accuracy: 0.4338 - loss: 1.5275 - val_accuracy: 0.4580 - val_loss: 1.5010
Epoch 11/100
40/40 - 0s - 8ms/step - accuracy: 0.4492 - loss: 1.4752 - val_accuracy: 0.5030 - val_loss: 1.4516
Epoch 12/100
40/40 - 0s - 6ms/step - accuracy: 0.4867 - loss: 1.4210 - val_accuracy: 0.5850 - val_loss: 1.3956
Epoch 13/100
40/40 - 0s - 7ms/step - accuracy: 0.5062 - loss: 1.3751 - val_accuracy: 0.5420 - val_loss: 1.3545
Epoch 14/100
40/40 - 0s - 5ms/step - accuracy: 0.5210 - loss: 1.3316 - val_accuracy: 0.5480 - val_loss: 1.3063
Epoch 15/100
40/40 - 0s - 8ms/step - accuracy: 0.5426 - loss: 1.2904 - val_accuracy: 0.5470 - val_loss: 1.2727
Epoch 16/100
40/40 - 0s - 6ms/step - accuracy: 0.5598 - loss: 1.2509 - val_accuracy: 0.6410 - val_loss: 1.2238
Epoch 17/100
40/40 - 0s - 7ms/step - accuracy: 0.5777 - loss: 1.2140 - val_accuracy: 0.7070 - val_loss: 1.1913
Epoch 18/100
40/40 - 0s - 5ms/step - accuracy: 0.5984 - loss: 1.1800 - val_accuracy: 0.7210 - val_loss: 1.1569
Epoch 19/100
40/40 - 0s - 5ms/step - accuracy: 0.6107 - loss: 1.1490 - val_accuracy: 0.7280 - val_loss: 1.1242
Epoch 20/100
40/40 - 0s - 8ms/step - accuracy: 0.6303 - loss: 1.1194 - val_accuracy: 0.6380 - val_loss: 1.0979
Epoch 21/100
40/40 - 0s - 8ms/step - accuracy: 0.6535 - loss: 1.0863 - val_accuracy: 0.6080 - val_loss: 1.0678
Epoch 22/100
40/40 - 0s - 5ms/step - accuracy: 0.6551 - loss: 1.0607 - val_accuracy: 0.7130 - val_loss: 1.0397
Epoch 23/100
40/40 - 0s - 5ms/step - accuracy: 0.6781 - loss: 1.0319 - val_accuracy: 0.7590 - val_loss: 1.0161
Epoch 24/100
40/40 - 0s - 8ms/step - accuracy: 0.6823 - loss: 1.0077 - val_accuracy: 0.7800 - val_loss: 0.9851
Epoch 25/100
40/40 - 0s - 6ms/step - accuracy: 0.7044 - loss: 0.9859 - val_accuracy: 0.8380 - val_loss: 0.9587
Epoch 26/100
40/40 - 0s - 7ms/step - accuracy: 0.7210 - loss: 0.9600 - val_accuracy: 0.8290 - val_loss: 0.9344
Epoch 27/100
40/40 - 0s - 6ms/step - accuracy: 0.7418 - loss: 0.9336 - val_accuracy: 0.8700 - val_loss: 0.9069
Epoch 28/100
40/40 - 0s - 5ms/step - accuracy: 0.7574 - loss: 0.9117 - val_accuracy: 0.8890 - val_loss: 0.8852
Epoch 29/100
40/40 - 0s - 8ms/step - accuracy: 0.7716 - loss: 0.8880 - val_accuracy: 0.8730 - val_loss: 0.8667
Epoch 30/100
40/40 - 0s - 8ms/step - accuracy: 0.7776 - loss: 0.8677 - val_accuracy: 0.8730 - val_loss: 0.8432
Epoch 31/100
40/40 - 0s - 5ms/step - accuracy: 0.7915 - loss: 0.8449 - val_accuracy: 0.8640 - val_loss: 0.8198
Epoch 32/100
40/40 - 0s - 5ms/step - accuracy: 0.8125 - loss: 0.8224 - val_accuracy: 0.8990 - val_loss: 0.8025
Epoch 33/100
40/40 - 0s - 9ms/step - accuracy: 0.8253 - loss: 0.8013 - val_accuracy: 0.9000 - val_loss: 0.7787
Epoch 34/100
40/40 - 0s - 7ms/step - accuracy: 0.8413 - loss: 0.7830 - val_accuracy: 0.9070 - val_loss: 0.7632
Epoch 35/100
40/40 - 0s - 7ms/step - accuracy: 0.8547 - loss: 0.7614 - val_accuracy: 0.9390 - val_loss: 0.7366
Epoch 36/100
40/40 - 0s - 6ms/step - accuracy: 0.8639 - loss: 0.7413 - val_accuracy: 0.9590 - val_loss: 0.7152
Epoch 37/100
40/40 - 0s - 5ms/step - accuracy: 0.8816 - loss: 0.7240 - val_accuracy: 0.9300 - val_loss: 0.6993
Epoch 38/100
40/40 - 0s - 9ms/step - accuracy: 0.8933 - loss: 0.7036 - val_accuracy: 0.9530 - val_loss: 0.6804
Epoch 39/100
40/40 - 0s - 8ms/step - accuracy: 0.9048 - loss: 0.6842 - val_accuracy: 0.9610 - val_loss: 0.6690
Epoch 40/100
40/40 - 0s - 6ms/step - accuracy: 0.9090 - loss: 0.6686 - val_accuracy: 0.9370 - val_loss: 0.6493
Epoch 41/100
40/40 - 0s - 6ms/step - accuracy: 0.9205 - loss: 0.6502 - val_accuracy: 0.9580 - val_loss: 0.6328
Epoch 42/100
40/40 - 0s - 8ms/step - accuracy: 0.9292 - loss: 0.6329 - val_accuracy: 0.9720 - val_loss: 0.6078
Epoch 43/100
40/40 - 0s - 7ms/step - accuracy: 0.9398 - loss: 0.6148 - val_accuracy: 0.9790 - val_loss: 0.6021
Epoch 44/100
40/40 - 0s - 7ms/step - accuracy: 0.9440 - loss: 0.5985 - val_accuracy: 0.9930 - val_loss: 0.5739
Epoch 45/100
40/40 - 0s - 6ms/step - accuracy: 0.9509 - loss: 0.5801 - val_accuracy: 0.9950 - val_loss: 0.5575
Epoch 46/100
40/40 - 0s - 8ms/step - accuracy: 0.9567 - loss: 0.5645 - val_accuracy: 0.9900 - val_loss: 0.5462
Epoch 47/100
40/40 - 0s - 6ms/step - accuracy: 0.9644 - loss: 0.5472 - val_accuracy: 0.9980 - val_loss: 0.5279
Epoch 48/100
40/40 - 0s - 8ms/step - accuracy: 0.9695 - loss: 0.5330 - val_accuracy: 0.9920 - val_loss: 0.5133
Epoch 49/100
40/40 - 0s - 6ms/step - accuracy: 0.9697 - loss: 0.5163 - val_accuracy: 0.9980 - val_loss: 0.5037
Epoch 50/100
40/40 - 0s - 6ms/step - accuracy: 0.9757 - loss: 0.5010 - val_accuracy: 0.9960 - val_loss: 0.4868
Epoch 51/100
40/40 - 0s - 8ms/step - accuracy: 0.9760 - loss: 0.4869 - val_accuracy: 1.0000 - val_loss: 0.4676
Epoch 52/100
40/40 - 0s - 7ms/step - accuracy: 0.9787 - loss: 0.4725 - val_accuracy: 0.9990 - val_loss: 0.4542
Epoch 53/100
40/40 - 0s - 7ms/step - accuracy: 0.9819 - loss: 0.4593 - val_accuracy: 0.9950 - val_loss: 0.4420
Epoch 54/100
40/40 - 0s - 6ms/step - accuracy: 0.9842 - loss: 0.4454 - val_accuracy: 0.9990 - val_loss: 0.4252
Epoch 55/100
40/40 - 0s - 8ms/step - accuracy: 0.9831 - loss: 0.4322 - val_accuracy: 0.9980 - val_loss: 0.4135
Epoch 56/100
40/40 - 0s - 6ms/step - accuracy: 0.9848 - loss: 0.4200 - val_accuracy: 0.9980 - val_loss: 0.4073
Epoch 57/100
40/40 - 0s - 8ms/step - accuracy: 0.9876 - loss: 0.4043 - val_accuracy: 1.0000 - val_loss: 0.3931
Epoch 58/100
40/40 - 0s - 6ms/step - accuracy: 0.9886 - loss: 0.3926 - val_accuracy: 1.0000 - val_loss: 0.3754
Epoch 59/100
40/40 - 0s - 6ms/step - accuracy: 0.9916 - loss: 0.3807 - val_accuracy: 1.0000 - val_loss: 0.3637
Epoch 60/100
40/40 - 0s - 8ms/step - accuracy: 0.9895 - loss: 0.3702 - val_accuracy: 1.0000 - val_loss: 0.3480
Epoch 61/100
40/40 - 0s - 8ms/step - accuracy: 0.9916 - loss: 0.3569 - val_accuracy: 1.0000 - val_loss: 0.3459
Epoch 62/100
40/40 - 0s - 6ms/step - accuracy: 0.9932 - loss: 0.3456 - val_accuracy: 1.0000 - val_loss: 0.3342
Epoch 63/100
40/40 - 0s - 6ms/step - accuracy: 0.9923 - loss: 0.3353 - val_accuracy: 1.0000 - val_loss: 0.3199
Epoch 64/100
40/40 - 0s - 8ms/step - accuracy: 0.9934 - loss: 0.3241 - val_accuracy: 1.0000 - val_loss: 0.3118
Epoch 65/100
40/40 - 0s - 7ms/step - accuracy: 0.9945 - loss: 0.3134 - val_accuracy: 1.0000 - val_loss: 0.2937
Epoch 66/100
40/40 - 0s - 8ms/step - accuracy: 0.9935 - loss: 0.3026 - val_accuracy: 1.0000 - val_loss: 0.2893
Epoch 67/100
40/40 - 0s - 6ms/step - accuracy: 0.9945 - loss: 0.2934 - val_accuracy: 1.0000 - val_loss: 0.2797
Epoch 68/100
40/40 - 0s - 6ms/step - accuracy: 0.9937 - loss: 0.2834 - val_accuracy: 1.0000 - val_loss: 0.2638
Epoch 69/100
40/40 - 0s - 9ms/step - accuracy: 0.9959 - loss: 0.2730 - val_accuracy: 1.0000 - val_loss: 0.2591
Epoch 70/100
40/40 - 0s - 8ms/step - accuracy: 0.9957 - loss: 0.2646 - val_accuracy: 1.0000 - val_loss: 0.2473
Epoch 71/100
40/40 - 0s - 6ms/step - accuracy: 0.9965 - loss: 0.2553 - val_accuracy: 1.0000 - val_loss: 0.2344
Epoch 72/100
40/40 - 0s - 6ms/step - accuracy: 0.9967 - loss: 0.2465 - val_accuracy: 1.0000 - val_loss: 0.2293
Epoch 73/100
40/40 - 0s - 9ms/step - accuracy: 0.9965 - loss: 0.2377 - val_accuracy: 1.0000 - val_loss: 0.2190
Epoch 74/100
40/40 - 0s - 6ms/step - accuracy: 0.9978 - loss: 0.2294 - val_accuracy: 1.0000 - val_loss: 0.2125
Epoch 75/100
40/40 - 0s - 8ms/step - accuracy: 0.9979 - loss: 0.2206 - val_accuracy: 1.0000 - val_loss: 0.2047
Epoch 76/100
40/40 - 0s - 6ms/step - accuracy: 0.9983 - loss: 0.2123 - val_accuracy: 1.0000 - val_loss: 0.1968
Epoch 77/100
40/40 - 0s - 6ms/step - accuracy: 0.9985 - loss: 0.2051 - val_accuracy: 1.0000 - val_loss: 0.1867
Epoch 78/100
40/40 - 0s - 8ms/step - accuracy: 0.9982 - loss: 0.1973 - val_accuracy: 1.0000 - val_loss: 0.1805
Epoch 79/100
40/40 - 0s - 8ms/step - accuracy: 0.9983 - loss: 0.1901 - val_accuracy: 1.0000 - val_loss: 0.1749
Epoch 80/100
40/40 - 0s - 6ms/step - accuracy: 0.9984 - loss: 0.1829 - val_accuracy: 1.0000 - val_loss: 0.1676
Epoch 81/100
40/40 - 0s - 6ms/step - accuracy: 0.9983 - loss: 0.1770 - val_accuracy: 1.0000 - val_loss: 0.1580
Epoch 82/100
40/40 - 0s - 10ms/step - accuracy: 0.9987 - loss: 0.1701 - val_accuracy: 1.0000 - val_loss: 0.1527
Epoch 83/100
40/40 - 0s - 7ms/step - accuracy: 0.9985 - loss: 0.1651 - val_accuracy: 1.0000 - val_loss: 0.1521
Epoch 84/100
40/40 - 0s - 8ms/step - accuracy: 0.9989 - loss: 0.1589 - val_accuracy: 1.0000 - val_loss: 0.1397
Epoch 85/100
40/40 - 0s - 6ms/step - accuracy: 0.9991 - loss: 0.1518 - val_accuracy: 1.0000 - val_loss: 0.1377
Epoch 86/100
40/40 - 0s - 6ms/step - accuracy: 0.9989 - loss: 0.1475 - val_accuracy: 1.0000 - val_loss: 0.1301
Epoch 87/100
40/40 - 0s - 11ms/step - accuracy: 0.9985 - loss: 0.1423 - val_accuracy: 1.0000 - val_loss: 0.1262
Epoch 88/100
40/40 - 0s - 8ms/step - accuracy: 0.9992 - loss: 0.1378 - val_accuracy: 1.0000 - val_loss: 0.1186
Epoch 89/100
40/40 - 0s - 6ms/step - accuracy: 0.9990 - loss: 0.1313 - val_accuracy: 1.0000 - val_loss: 0.1147
Epoch 90/100
40/40 - 0s - 6ms/step - accuracy: 0.9994 - loss: 0.1272 - val_accuracy: 1.0000 - val_loss: 0.1090
Epoch 91/100
40/40 - 0s - 9ms/step - accuracy: 0.9993 - loss: 0.1218 - val_accuracy: 1.0000 - val_loss: 0.1066
Epoch 92/100
40/40 - 0s - 7ms/step - accuracy: 0.9991 - loss: 0.1186 - val_accuracy: 1.0000 - val_loss: 0.0998
Epoch 93/100
40/40 - 0s - 7ms/step - accuracy: 0.9992 - loss: 0.1125 - val_accuracy: 1.0000 - val_loss: 0.0977
Epoch 94/100
40/40 - 0s - 6ms/step - accuracy: 0.9995 - loss: 0.1093 - val_accuracy: 1.0000 - val_loss: 0.0941
Epoch 95/100
40/40 - 0s - 8ms/step - accuracy: 0.9989 - loss: 0.1049 - val_accuracy: 1.0000 - val_loss: 0.0877
Epoch 96/100
40/40 - 0s - 6ms/step - accuracy: 0.9993 - loss: 0.1009 - val_accuracy: 1.0000 - val_loss: 0.0847
Epoch 97/100
40/40 - 0s - 8ms/step - accuracy: 0.9993 - loss: 0.0971 - val_accuracy: 1.0000 - val_loss: 0.0803
Epoch 98/100
40/40 - 0s - 6ms/step - accuracy: 0.9997 - loss: 0.0935 - val_accuracy: 1.0000 - val_loss: 0.0762
Epoch 99/100
40/40 - 0s - 6ms/step - accuracy: 0.9995 - loss: 0.0891 - val_accuracy: 1.0000 - val_loss: 0.0755
Epoch 100/100
40/40 - 0s - 9ms/step - accuracy: 0.9991 - loss: 0.0871 - val_accuracy: 1.0000 - val_loss: 0.0727
<keras.src.callbacks.history.History at 0x7f9ee486e510>
As we can see, the network is now essentially perfect, reaching 100% accuracy on the validation set.
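To close the loop, we can feed the trained model a fresh example and interpret its prediction (a small sketch; a single sample needs a leading batch dimension for predict()):
m = ModelDataCategorical()
probs = model.predict(m.x_scaled.reshape(1, -1), verbose=0)
print("predicted:", m.interpret_result(probs[0]), "actual:", m.y[0])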