Keras and the Last Number Problem#
Let’s see if we can do better than our simple hidden-layer network on the last number problem.
import numpy as np
import keras
from keras.utils import to_categorical
We’ll use the same data class:
class ModelDataCategorical:
    """this is the model data for our "last number" training set.  We
    produce input of length N, consisting of numbers 0-9 and store
    the result in a 10-element array as categorical data.
    """

    def __init__(self, N=10):
        self.N = N

        # our model input data
        self.x = np.random.randint(0, high=10, size=N)
        self.x_scaled = self.x / 10 + 0.05

        # our scaled model output data
        self.y = np.array([self.x[-1]])
        self.y_scaled = np.zeros(10) + 0.01
        self.y_scaled[self.x[-1]] = 0.99

    def interpret_result(self, out):
        """take the network output and return the number we predict"""
        return np.argmax(out)
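The input scaling maps each digit d to d/10 + 0.05, so every input lies strictly inside (0, 1). A quick NumPy-only check (the seed here is arbitrary, just for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(0)      # arbitrary seed, for reproducibility
x = rng.integers(0, 10, size=10)    # digits 0-9, as in ModelDataCategorical
x_scaled = x / 10 + 0.05            # digit d -> d/10 + 0.05

# every scaled value lies in [0.05, 0.95]
print(x_scaled.min(), x_scaled.max())
```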
For Keras, we need to pack the data into arrays: the scaled inputs are stacked into a 2-d array, and the integer labels are one-hot encoded with the Keras
to_categorical() function.
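to_categorical() turns an integer label into a one-hot vector. A plain-NumPy sketch of what it does for a single label (one_hot is a hypothetical helper name):

```python
import numpy as np

def one_hot(label, num_classes=10):
    """NumPy equivalent of keras.utils.to_categorical for one label."""
    v = np.zeros(num_classes)
    v[label] = 1.0
    return v

one_hot(8)  # a 10-element vector with a 1 in slot 8
```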
Let’s make both a training set and a test set
x_train = []
y_train = []
for _ in range(10000):
    m = ModelDataCategorical()
    x_train.append(m.x_scaled)
    y_train.append(m.y)

x_train = np.asarray(x_train)
y_train = to_categorical(y_train, 10)
x_test = []
y_test = []
for _ in range(1000):
    m = ModelDataCategorical()
    x_test.append(m.x_scaled)
    y_test.append(m.y)

x_test = np.asarray(x_test)
y_test = to_categorical(y_test, 10)
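The loops above can also be vectorized with plain NumPy; here is a sketch (make_dataset is a hypothetical helper, not part of the notebook) that builds the same kind of arrays and lets us sanity-check the shapes:

```python
import numpy as np

def make_dataset(n_samples, N=10, rng=None):
    """Vectorized sketch of the dataset construction above."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x = rng.integers(0, 10, size=(n_samples, N))  # random digits 0-9
    x_scaled = x / 10 + 0.05                      # same scaling as ModelDataCategorical
    y_onehot = np.eye(10)[x[:, -1]]               # one-hot encode the last digit
    return x_scaled, y_onehot

x_tr, y_tr = make_dataset(10000)
# x_tr.shape == (10000, 10); y_tr.shape == (10000, 10)
```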
Check to make sure the data looks like we expect:
x_train[0]
array([0.85, 0.95, 0.55, 0.15, 0.55, 0.15, 0.65, 0.35, 0.25, 0.85])
y_train[0]
array([0., 0., 0., 0., 0., 0., 0., 0., 1., 0.])
Creating the network#
Now let’s build our network. We’ll use just a single hidden layer, but instead of the sigmoid used before, we’ll use the ReLU activation for the hidden layer and softmax for the output layer.
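For reference, these two activations can be sketched in plain NumPy:

```python
import numpy as np

def relu(x):
    """ReLU: pass positive values through, zero out negatives."""
    return np.maximum(0.0, x)

def softmax(z):
    """Softmax: exponentiate (shifted for numerical stability) and normalize to sum to 1."""
    e = np.exp(z - np.max(z))
    return e / e.sum()
```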
from keras.models import Sequential
from keras.layers import Input, Dense, Dropout, Activation
from keras.optimizers import RMSprop
model = Sequential()
model.add(Input((10,)))
model.add(Dense(100, activation="relu"))
model.add(Dropout(0.1))
model.add(Dense(10, activation="softmax"))
rms = RMSprop()
model.compile(loss='categorical_crossentropy',
              optimizer=rms, metrics=['accuracy'])
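Categorical cross-entropy compares the one-hot target with the softmax output; a minimal NumPy sketch of the loss being minimized:

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean over samples of -sum_i y_true[i] * log(y_pred[i])."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))

y_true = np.array([[0.0, 0.0, 1.0]])
good = categorical_crossentropy(y_true, np.array([[0.05, 0.05, 0.90]]))
bad = categorical_crossentropy(y_true, np.array([[0.45, 0.45, 0.10]]))
# a confident correct prediction gives a much smaller loss than a wrong one
```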
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ dense (Dense)                   │ (None, 100)            │         1,100 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 100)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 10)             │         1,010 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 2,110 (8.24 KB)
Trainable params: 2,110 (8.24 KB)
Non-trainable params: 0 (0.00 B)
Now we have ~2,100 parameters to fit.
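The count in the summary is easy to verify by hand: each Dense layer contributes (inputs × outputs) weights plus one bias per output.

```python
# hidden layer: 10 inputs -> 100 units, weights plus biases
hidden_params = 10 * 100 + 100    # 1100
# output layer: 100 hidden units -> 10 outputs
output_params = 100 * 10 + 10     # 1010
total = hidden_params + output_params  # 2110, matching model.summary()
```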
Training#
Now we can train, checking against the test set after each epoch to see how we do:
epochs = 100
batch_size = 256
model.fit(x_train, y_train, epochs=epochs, batch_size=batch_size,
          validation_data=(x_test, y_test), verbose=2)
Epoch 1/100
40/40 - 1s - 16ms/step - accuracy: 0.1406 - loss: 2.2660 - val_accuracy: 0.1650 - val_loss: 2.2089
Epoch 2/100
40/40 - 0s - 2ms/step - accuracy: 0.1998 - loss: 2.1739 - val_accuracy: 0.2420 - val_loss: 2.1228
Epoch 3/100
40/40 - 0s - 2ms/step - accuracy: 0.2450 - loss: 2.0777 - val_accuracy: 0.2700 - val_loss: 2.0216
Epoch 4/100
40/40 - 0s - 2ms/step - accuracy: 0.2762 - loss: 1.9798 - val_accuracy: 0.3120 - val_loss: 1.9292
Epoch 5/100
40/40 - 0s - 2ms/step - accuracy: 0.3099 - loss: 1.8882 - val_accuracy: 0.2990 - val_loss: 1.8395
Epoch 6/100
40/40 - 0s - 2ms/step - accuracy: 0.3403 - loss: 1.7985 - val_accuracy: 0.3730 - val_loss: 1.7547
Epoch 7/100
40/40 - 0s - 2ms/step - accuracy: 0.3678 - loss: 1.7194 - val_accuracy: 0.3610 - val_loss: 1.6939
Epoch 8/100
40/40 - 0s - 2ms/step - accuracy: 0.3901 - loss: 1.6502 - val_accuracy: 0.4020 - val_loss: 1.6214
Epoch 9/100
40/40 - 0s - 2ms/step - accuracy: 0.4182 - loss: 1.5846 - val_accuracy: 0.4520 - val_loss: 1.5565
Epoch 10/100
40/40 - 0s - 2ms/step - accuracy: 0.4511 - loss: 1.5232 - val_accuracy: 0.5140 - val_loss: 1.4989
Epoch 11/100
40/40 - 0s - 2ms/step - accuracy: 0.4688 - loss: 1.4661 - val_accuracy: 0.5120 - val_loss: 1.4455
Epoch 12/100
40/40 - 0s - 2ms/step - accuracy: 0.4959 - loss: 1.4132 - val_accuracy: 0.4850 - val_loss: 1.3983
Epoch 13/100
40/40 - 0s - 2ms/step - accuracy: 0.5152 - loss: 1.3650 - val_accuracy: 0.5050 - val_loss: 1.3605
Epoch 14/100
40/40 - 0s - 2ms/step - accuracy: 0.5373 - loss: 1.3211 - val_accuracy: 0.5380 - val_loss: 1.3084
Epoch 15/100
40/40 - 0s - 2ms/step - accuracy: 0.5660 - loss: 1.2799 - val_accuracy: 0.5750 - val_loss: 1.2738
Epoch 16/100
40/40 - 0s - 2ms/step - accuracy: 0.5655 - loss: 1.2446 - val_accuracy: 0.6950 - val_loss: 1.2319
Epoch 17/100
40/40 - 0s - 2ms/step - accuracy: 0.5969 - loss: 1.2025 - val_accuracy: 0.6350 - val_loss: 1.1968
Epoch 18/100
40/40 - 0s - 2ms/step - accuracy: 0.6049 - loss: 1.1709 - val_accuracy: 0.7080 - val_loss: 1.1579
Epoch 19/100
40/40 - 0s - 2ms/step - accuracy: 0.6365 - loss: 1.1370 - val_accuracy: 0.7070 - val_loss: 1.1303
Epoch 20/100
40/40 - 0s - 2ms/step - accuracy: 0.6543 - loss: 1.1046 - val_accuracy: 0.6450 - val_loss: 1.0923
Epoch 21/100
40/40 - 0s - 2ms/step - accuracy: 0.6618 - loss: 1.0744 - val_accuracy: 0.7530 - val_loss: 1.0733
Epoch 22/100
40/40 - 0s - 2ms/step - accuracy: 0.6838 - loss: 1.0469 - val_accuracy: 0.6340 - val_loss: 1.0465
Epoch 23/100
40/40 - 0s - 2ms/step - accuracy: 0.7018 - loss: 1.0180 - val_accuracy: 0.7160 - val_loss: 1.0194
Epoch 24/100
40/40 - 0s - 2ms/step - accuracy: 0.7120 - loss: 0.9946 - val_accuracy: 0.7660 - val_loss: 0.9858
Epoch 25/100
40/40 - 0s - 2ms/step - accuracy: 0.7371 - loss: 0.9696 - val_accuracy: 0.7730 - val_loss: 0.9544
Epoch 26/100
40/40 - 0s - 2ms/step - accuracy: 0.7457 - loss: 0.9458 - val_accuracy: 0.8040 - val_loss: 0.9357
Epoch 27/100
40/40 - 0s - 2ms/step - accuracy: 0.7662 - loss: 0.9152 - val_accuracy: 0.8220 - val_loss: 0.9077
Epoch 28/100
40/40 - 0s - 2ms/step - accuracy: 0.7743 - loss: 0.8981 - val_accuracy: 0.7950 - val_loss: 0.8939
Epoch 29/100
40/40 - 0s - 2ms/step - accuracy: 0.7926 - loss: 0.8694 - val_accuracy: 0.8200 - val_loss: 0.8641
Epoch 30/100
40/40 - 0s - 2ms/step - accuracy: 0.8012 - loss: 0.8501 - val_accuracy: 0.8670 - val_loss: 0.8422
Epoch 31/100
40/40 - 0s - 2ms/step - accuracy: 0.8209 - loss: 0.8277 - val_accuracy: 0.8800 - val_loss: 0.8195
Epoch 32/100
40/40 - 0s - 2ms/step - accuracy: 0.8285 - loss: 0.8083 - val_accuracy: 0.8390 - val_loss: 0.7996
Epoch 33/100
40/40 - 0s - 2ms/step - accuracy: 0.8347 - loss: 0.7885 - val_accuracy: 0.8610 - val_loss: 0.7827
Epoch 34/100
40/40 - 0s - 2ms/step - accuracy: 0.8475 - loss: 0.7665 - val_accuracy: 0.8350 - val_loss: 0.7583
Epoch 35/100
40/40 - 0s - 2ms/step - accuracy: 0.8549 - loss: 0.7507 - val_accuracy: 0.9150 - val_loss: 0.7363
Epoch 36/100
40/40 - 0s - 2ms/step - accuracy: 0.8680 - loss: 0.7282 - val_accuracy: 0.8690 - val_loss: 0.7286
Epoch 37/100
40/40 - 0s - 2ms/step - accuracy: 0.8738 - loss: 0.7095 - val_accuracy: 0.9120 - val_loss: 0.7023
Epoch 38/100
40/40 - 0s - 2ms/step - accuracy: 0.8872 - loss: 0.6907 - val_accuracy: 0.9170 - val_loss: 0.6870
Epoch 39/100
40/40 - 0s - 2ms/step - accuracy: 0.8962 - loss: 0.6723 - val_accuracy: 0.9180 - val_loss: 0.6647
Epoch 40/100
40/40 - 0s - 2ms/step - accuracy: 0.8998 - loss: 0.6532 - val_accuracy: 0.9320 - val_loss: 0.6455
Epoch 41/100
40/40 - 0s - 2ms/step - accuracy: 0.9091 - loss: 0.6351 - val_accuracy: 0.9220 - val_loss: 0.6294
Epoch 42/100
40/40 - 0s - 2ms/step - accuracy: 0.9208 - loss: 0.6172 - val_accuracy: 0.9530 - val_loss: 0.6105
Epoch 43/100
40/40 - 0s - 2ms/step - accuracy: 0.9254 - loss: 0.6001 - val_accuracy: 0.9750 - val_loss: 0.5856
Epoch 44/100
40/40 - 0s - 2ms/step - accuracy: 0.9360 - loss: 0.5827 - val_accuracy: 0.9660 - val_loss: 0.5731
Epoch 45/100
40/40 - 0s - 2ms/step - accuracy: 0.9390 - loss: 0.5674 - val_accuracy: 0.9790 - val_loss: 0.5536
Epoch 46/100
40/40 - 0s - 2ms/step - accuracy: 0.9455 - loss: 0.5516 - val_accuracy: 0.9760 - val_loss: 0.5403
Epoch 47/100
40/40 - 0s - 2ms/step - accuracy: 0.9495 - loss: 0.5348 - val_accuracy: 0.9680 - val_loss: 0.5298
Epoch 48/100
40/40 - 0s - 2ms/step - accuracy: 0.9537 - loss: 0.5196 - val_accuracy: 0.9890 - val_loss: 0.5133
Epoch 49/100
40/40 - 0s - 2ms/step - accuracy: 0.9602 - loss: 0.5043 - val_accuracy: 0.9960 - val_loss: 0.4947
Epoch 50/100
40/40 - 0s - 2ms/step - accuracy: 0.9658 - loss: 0.4900 - val_accuracy: 0.9980 - val_loss: 0.4828
Epoch 51/100
40/40 - 0s - 2ms/step - accuracy: 0.9640 - loss: 0.4771 - val_accuracy: 0.9870 - val_loss: 0.4780
Epoch 52/100
40/40 - 0s - 2ms/step - accuracy: 0.9703 - loss: 0.4652 - val_accuracy: 0.9870 - val_loss: 0.4575
Epoch 53/100
40/40 - 0s - 2ms/step - accuracy: 0.9730 - loss: 0.4508 - val_accuracy: 0.9980 - val_loss: 0.4397
Epoch 54/100
40/40 - 0s - 2ms/step - accuracy: 0.9739 - loss: 0.4384 - val_accuracy: 0.9980 - val_loss: 0.4344
Epoch 55/100
40/40 - 0s - 2ms/step - accuracy: 0.9766 - loss: 0.4253 - val_accuracy: 1.0000 - val_loss: 0.4180
Epoch 56/100
40/40 - 0s - 2ms/step - accuracy: 0.9796 - loss: 0.4126 - val_accuracy: 1.0000 - val_loss: 0.4020
Epoch 57/100
40/40 - 0s - 2ms/step - accuracy: 0.9827 - loss: 0.3997 - val_accuracy: 1.0000 - val_loss: 0.4031
Epoch 58/100
40/40 - 0s - 2ms/step - accuracy: 0.9855 - loss: 0.3893 - val_accuracy: 0.9990 - val_loss: 0.3756
Epoch 59/100
40/40 - 0s - 2ms/step - accuracy: 0.9869 - loss: 0.3762 - val_accuracy: 1.0000 - val_loss: 0.3657
Epoch 60/100
40/40 - 0s - 2ms/step - accuracy: 0.9854 - loss: 0.3672 - val_accuracy: 1.0000 - val_loss: 0.3509
Epoch 61/100
40/40 - 0s - 2ms/step - accuracy: 0.9891 - loss: 0.3539 - val_accuracy: 1.0000 - val_loss: 0.3390
Epoch 62/100
40/40 - 0s - 2ms/step - accuracy: 0.9894 - loss: 0.3444 - val_accuracy: 1.0000 - val_loss: 0.3298
Epoch 63/100
40/40 - 0s - 2ms/step - accuracy: 0.9880 - loss: 0.3336 - val_accuracy: 1.0000 - val_loss: 0.3214
Epoch 64/100
40/40 - 0s - 2ms/step - accuracy: 0.9908 - loss: 0.3223 - val_accuracy: 1.0000 - val_loss: 0.3067
Epoch 65/100
40/40 - 0s - 2ms/step - accuracy: 0.9908 - loss: 0.3139 - val_accuracy: 1.0000 - val_loss: 0.3064
Epoch 66/100
40/40 - 0s - 2ms/step - accuracy: 0.9925 - loss: 0.3034 - val_accuracy: 1.0000 - val_loss: 0.2973
Epoch 67/100
40/40 - 0s - 2ms/step - accuracy: 0.9929 - loss: 0.2933 - val_accuracy: 1.0000 - val_loss: 0.2840
Epoch 68/100
40/40 - 0s - 2ms/step - accuracy: 0.9942 - loss: 0.2844 - val_accuracy: 1.0000 - val_loss: 0.2812
Epoch 69/100
40/40 - 0s - 2ms/step - accuracy: 0.9936 - loss: 0.2750 - val_accuracy: 1.0000 - val_loss: 0.2610
Epoch 70/100
40/40 - 0s - 2ms/step - accuracy: 0.9960 - loss: 0.2659 - val_accuracy: 1.0000 - val_loss: 0.2603
Epoch 71/100
40/40 - 0s - 2ms/step - accuracy: 0.9952 - loss: 0.2583 - val_accuracy: 1.0000 - val_loss: 0.2563
Epoch 72/100
40/40 - 0s - 2ms/step - accuracy: 0.9954 - loss: 0.2510 - val_accuracy: 1.0000 - val_loss: 0.2367
Epoch 73/100
40/40 - 0s - 2ms/step - accuracy: 0.9955 - loss: 0.2411 - val_accuracy: 1.0000 - val_loss: 0.2320
Epoch 74/100
40/40 - 0s - 2ms/step - accuracy: 0.9966 - loss: 0.2334 - val_accuracy: 1.0000 - val_loss: 0.2217
Epoch 75/100
40/40 - 0s - 3ms/step - accuracy: 0.9963 - loss: 0.2254 - val_accuracy: 1.0000 - val_loss: 0.2127
Epoch 76/100
40/40 - 0s - 3ms/step - accuracy: 0.9960 - loss: 0.2170 - val_accuracy: 1.0000 - val_loss: 0.2093
Epoch 77/100
40/40 - 0s - 3ms/step - accuracy: 0.9962 - loss: 0.2105 - val_accuracy: 1.0000 - val_loss: 0.2024
Epoch 78/100
40/40 - 0s - 3ms/step - accuracy: 0.9964 - loss: 0.2040 - val_accuracy: 1.0000 - val_loss: 0.1953
Epoch 79/100
40/40 - 0s - 3ms/step - accuracy: 0.9974 - loss: 0.1962 - val_accuracy: 1.0000 - val_loss: 0.1895
Epoch 80/100
40/40 - 0s - 3ms/step - accuracy: 0.9979 - loss: 0.1897 - val_accuracy: 1.0000 - val_loss: 0.1782
Epoch 81/100
40/40 - 0s - 2ms/step - accuracy: 0.9978 - loss: 0.1834 - val_accuracy: 1.0000 - val_loss: 0.1683
Epoch 82/100
40/40 - 0s - 2ms/step - accuracy: 0.9974 - loss: 0.1765 - val_accuracy: 1.0000 - val_loss: 0.1590
Epoch 83/100
40/40 - 0s - 2ms/step - accuracy: 0.9986 - loss: 0.1697 - val_accuracy: 1.0000 - val_loss: 0.1578
Epoch 84/100
40/40 - 0s - 2ms/step - accuracy: 0.9985 - loss: 0.1636 - val_accuracy: 1.0000 - val_loss: 0.1506
Epoch 85/100
40/40 - 0s - 2ms/step - accuracy: 0.9981 - loss: 0.1578 - val_accuracy: 1.0000 - val_loss: 0.1423
Epoch 86/100
40/40 - 0s - 2ms/step - accuracy: 0.9988 - loss: 0.1523 - val_accuracy: 1.0000 - val_loss: 0.1419
Epoch 87/100
40/40 - 0s - 2ms/step - accuracy: 0.9980 - loss: 0.1469 - val_accuracy: 1.0000 - val_loss: 0.1317
Epoch 88/100
40/40 - 0s - 2ms/step - accuracy: 0.9986 - loss: 0.1407 - val_accuracy: 1.0000 - val_loss: 0.1384
Epoch 89/100
40/40 - 0s - 2ms/step - accuracy: 0.9990 - loss: 0.1359 - val_accuracy: 1.0000 - val_loss: 0.1205
Epoch 90/100
40/40 - 0s - 3ms/step - accuracy: 0.9985 - loss: 0.1315 - val_accuracy: 1.0000 - val_loss: 0.1162
Epoch 91/100
40/40 - 0s - 3ms/step - accuracy: 0.9989 - loss: 0.1259 - val_accuracy: 1.0000 - val_loss: 0.1094
Epoch 92/100
40/40 - 0s - 3ms/step - accuracy: 0.9992 - loss: 0.1203 - val_accuracy: 1.0000 - val_loss: 0.1104
Epoch 93/100
40/40 - 0s - 3ms/step - accuracy: 0.9993 - loss: 0.1159 - val_accuracy: 1.0000 - val_loss: 0.1045
Epoch 94/100
40/40 - 0s - 2ms/step - accuracy: 0.9991 - loss: 0.1126 - val_accuracy: 1.0000 - val_loss: 0.1016
Epoch 95/100
40/40 - 0s - 2ms/step - accuracy: 0.9989 - loss: 0.1090 - val_accuracy: 1.0000 - val_loss: 0.0919
Epoch 96/100
40/40 - 0s - 3ms/step - accuracy: 0.9993 - loss: 0.1041 - val_accuracy: 1.0000 - val_loss: 0.0869
Epoch 97/100
40/40 - 0s - 3ms/step - accuracy: 0.9994 - loss: 0.0997 - val_accuracy: 1.0000 - val_loss: 0.0881
Epoch 98/100
40/40 - 0s - 3ms/step - accuracy: 0.9987 - loss: 0.0967 - val_accuracy: 1.0000 - val_loss: 0.0836
Epoch 99/100
40/40 - 0s - 3ms/step - accuracy: 0.9994 - loss: 0.0923 - val_accuracy: 1.0000 - val_loss: 0.0862
Epoch 100/100
40/40 - 0s - 2ms/step - accuracy: 0.9995 - loss: 0.0898 - val_accuracy: 1.0000 - val_loss: 0.0773
<keras.src.callbacks.history.History at 0x7f7bfda7de90>
As we see, the network now predicts every test sample correctly: it is essentially perfect.
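Recall that interpret_result() simply takes the argmax of the 10-element softmax output; for example, with a hypothetical output vector from the trained network:

```python
import numpy as np

# a hypothetical softmax output from the trained network
out = np.array([0.01, 0.02, 0.01, 0.80, 0.05, 0.03, 0.02, 0.03, 0.02, 0.01])
prediction = np.argmax(out)  # the network predicts the digit 3
```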