Clustering#
Clustering seeks to group data into clusters based on their properties, and then allows us to predict which cluster a new point belongs to.
import numpy as np
import matplotlib.pyplot as plt
Preparing the data#
We’ll use a dataset generator that is part of scikit-learn called make_moons. This generates data that falls into 2 different sets with a shape that looks like half-moons.
from sklearn import datasets
def generate_data():
    xvec, val = datasets.make_moons(200, noise=0.15)

    # encode the output to be 2 elements
    x = []
    v = []
    for xv, vv in zip(xvec, val):
        x.append(np.array(xv))
        v.append(vv)

    return np.array(x), np.array(v)
Tip
By adjusting the noise parameter, we can blur the boundary between the two datasets, making the classification harder.
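To see what the noise parameter does, here is a small sketch (not part of the original notebook) that generates the moons at two noise levels; the values 0.05 and 0.30 are arbitrary picks for comparison. With the larger noise the two half-moons overlap and the boundary is no longer clean.

```python
import numpy as np
from sklearn import datasets

# low noise: the two half-moons are well separated
x_clean, v_clean = datasets.make_moons(200, noise=0.05)

# high noise: the classes start to overlap
x_noisy, v_noisy = datasets.make_moons(200, noise=0.30)

# both calls return 200 2-d points and 200 labels (0 or 1)
print(x_clean.shape, v_clean.shape)   # (200, 2) (200,)
print(x_noisy.shape, v_noisy.shape)   # (200, 2) (200,)
```

Plotting both sets side by side (e.g. with the plot_data function below) makes the blurring of the boundary obvious.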
x, v = generate_data()
Let’s look at a point and its value
print(f"x = {x[0]}, value = {v[0]}")
x = [2.20225073 0.11862178], value = 1
Now let’s plot the data
def plot_data(x, v):
    xpt = [q[0] for q in x]
    ypt = [q[1] for q in x]

    fig, ax = plt.subplots()
    ax.scatter(xpt, ypt, s=40, c=v, cmap="viridis")
    ax.set_aspect("equal")
    return fig
fig = plot_data(x, v)

We want to partition this domain into 2 regions, such that when we come in with a new point, we know which group it belongs to.
Constructing the network#
First we set up and train our network
from keras.models import Sequential
from keras.layers import Input, Dense, Dropout, Activation
from keras.optimizers import RMSprop
model = Sequential()
model.add(Input((2,)))
model.add(Dense(50, activation="relu"))
model.add(Dense(20, activation="relu"))
model.add(Dense(1, activation="sigmoid"))
rms = RMSprop()
model.compile(loss='binary_crossentropy',
              optimizer=rms, metrics=['accuracy'])
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 50) │ 150 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ dense_1 (Dense) │ (None, 20) │ 1,020 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ dense_2 (Dense) │ (None, 1) │ 21 │ └─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 1,191 (4.65 KB)
Trainable params: 1,191 (4.65 KB)
Non-trainable params: 0 (0.00 B)
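The parameter counts in the summary can be checked by hand: each Dense layer has one weight per input-output connection plus one bias per output. This little sketch (added for illustration) reproduces the totals from the layer sizes:

```python
# network layer widths: 2 inputs -> 50 -> 20 -> 1 output
layer_sizes = [2, 50, 20, 1]

# for each Dense layer: weights (n_in * n_out) + biases (n_out)
params = [n_in * n_out + n_out
          for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

print(params)       # [150, 1020, 21]
print(sum(params))  # 1191
```

These match the 150, 1,020, and 21 parameters reported by model.summary(), for 1,191 total.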
Training#
Important
We seem to need a lot of epochs here to get a good result.
results = model.fit(x, v, batch_size=50, epochs=200, verbose=2)
Epoch 1/200
4/4 - 0s - 122ms/step - accuracy: 0.4700 - loss: 0.6921
Epoch 2/200
4/4 - 0s - 7ms/step - accuracy: 0.7800 - loss: 0.6535
Epoch 3/200
4/4 - 0s - 7ms/step - accuracy: 0.8050 - loss: 0.6254
...
Epoch 199/200
4/4 - 0s - 7ms/step - accuracy: 0.9850 - loss: 0.0494
Epoch 200/200
4/4 - 0s - 7ms/step - accuracy: 0.9850 - loss: 0.0499
score = model.evaluate(x, v, verbose=0)
print(f"score = {score[0]}")
print(f"accuracy = {score[1]}")
score = 0.047401219606399536
accuracy = 0.9850000143051147
Predicting#
Let’s look at a prediction. We need to feed in a single point as an array of shape (N, 2), where N is the number of points.
res = model.predict(np.array([[-2, 2]]))
res
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step
array([[1.335973e-18]], dtype=float32)
We see that we get a floating point number. We will need to convert this to 0 or 1 by rounding.
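The conversion from a sigmoid output to a class label is just a threshold at 0.5, which NumPy can apply to a whole array at once. A small sketch with made-up probabilities (not real model outputs):

```python
import numpy as np

# illustrative sigmoid outputs, not actual predictions from the model
probs = np.array([1.3e-18, 0.42, 0.51, 0.97])

# anything above 0.5 becomes class 1, everything else class 0
labels = np.where(probs > 0.5, 1, 0)
print(labels)  # [0 0 1 1]
```

This is the same np.where pattern used below to partition the whole domain.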
Let’s plot the partitioning
M = 256
N = 256
xmin = -1.75
xmax = 2.5
ymin = -1.25
ymax = 1.75
xpt = np.linspace(xmin, xmax, M)
ypt = np.linspace(ymin, ymax, N)
To make the prediction go faster, we want to feed in a single array containing every (x, y) pair in the grid, of the form:
[[xpt[0], ypt[0]],
 [xpt[0], ypt[1]],
 ...
 [xpt[M-1], ypt[N-1]]]
We can see that this packs the points into the array
pairs = np.array(np.meshgrid(xpt, ypt)).T.reshape(-1, 2)
pairs[0]
array([-1.75, -1.25])
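To make the ordering of the packed points easier to see, here is the same meshgrid/transpose/reshape combination on a tiny 3×2 grid (a sketch added for illustration, not from the original notebook):

```python
import numpy as np

# a 3-point x axis and a 2-point y axis instead of 256 x 256
xs = np.array([0.0, 1.0, 2.0])
ys = np.array([10.0, 20.0])

# same construction as in the text
pairs = np.array(np.meshgrid(xs, ys)).T.reshape(-1, 2)

print(pairs.shape)  # (6, 2): one row per (x, y) combination
print(pairs[0])     # [ 0. 10.]  -- x varies slowest
print(pairs[1])     # [ 0. 20.]
```

The transpose is what makes x the slowly varying coordinate, which is why the result can later be reshaped back to (M, N) and transposed for plotting.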
Now we do the prediction. We will get a vector out, which we reshape to match the original domain.
res = model.predict(pairs, verbose=0)
res.shape = (M, N)
Finally, round to 0 or 1
domain = np.where(res > 0.5, 1, 0)
and we can plot the data
fig, ax = plt.subplots()
ax.imshow(domain.T, origin="lower",
          extent=[xmin, xmax, ymin, ymax], alpha=0.25)
xpt = [q[0] for q in x]
ypt = [q[1] for q in x]
ax.scatter(xpt, ypt, s=40, c=v, cmap="viridis")
<matplotlib.collections.PathCollection at 0x7f2c88349450>
