Peer Herholz (he/him)
Research Assistant Professor - Northwestern University
Research affiliate - NeuroDataScience lab at MNI/McGill, UNIQUE, Max Planck Institute for Human Cognitive and Brain Sciences
Member - BIDS, ReproNim, Brainhack, Neuromod, OHBM SEA-SIG
@peerherholz
Let's imagine the following scenario: your PI tells you to run some machine learning analyses on your data (because buzzwords, and we all need top-tier publications and that sweet grant money). Specifically, you should use resting-state connectivity data to predict the age of participants (sounds familiar, eh?). So you go ahead, gather some data, apply a random forest (an ensemble of decision trees) and ...
Whoa, let's go back a few steps and briefly define what we're actually talking about here, starting with machine learning itself, and then its components and steps:
Now, let's actually check what this looks like. First, we get the data:
import urllib.request
url = 'https://www.dropbox.com/s/v48f8pjfw4u2bxi/MAIN_BASC064_subsamp_features.npz?dl=1'
urllib.request.urlretrieve(url, 'MAIN2019_BASC064_subsamp_features.npz')
('MAIN2019_BASC064_subsamp_features.npz', <http.client.HTTPMessage at 0x10823ff10>)
and then inspect it:
import numpy as np
data = np.load('MAIN2019_BASC064_subsamp_features.npz')['a']
data.shape
(155, 2016)
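So we have 155 participants and 2016 features each. That second number is no accident: the filename points to the BASC064 parcellation, and 2016 is exactly the number of unique region-to-region connections among 64 brain regions, i.e. the upper triangle of a 64 x 64 connectivity matrix excluding the diagonal:

```python
# 64 regions -> n * (n - 1) / 2 unique undirected connections
n_regions = 64
n_connections = n_regions * (n_regions - 1) // 2
print(n_connections)  # 2016, matching data.shape[1]
```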
We will also visualize it to get a better grasp of what's going on:
import plotly.express as px
from IPython.display import display, HTML
from plotly.offline import init_notebook_mode, plot
fig = px.imshow(data, labels=dict(x="features (whole brain connectome connections)", y="participants"),
                height=800, aspect=None)
fig.update(layout_coloraxis_showscale=False)
init_notebook_mode(connected=True)
fig.show()
Besides the input data, we also need our labels:
url = 'https://www.dropbox.com/s/ofsqdcukyde4lke/participants.csv?dl=1'
urllib.request.urlretrieve(url, 'participants.csv')
('participants.csv', <http.client.HTTPMessage at 0x117c13fd0>)
which we then load and check as well:
import pandas as pd
labels = pd.read_csv('participants.csv')['AgeGroup']
labels.describe()
count     155
unique      6
top       5yo
freq       34
Name: AgeGroup, dtype: object
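Note that the labels are strings (age groups like `5yo`), while scikit-learn expects numbers; that is why the analyses below feed in `pd.Categorical(labels).codes`. A quick illustration with made-up groups mirroring the `AgeGroup` column (hypothetical values, for demonstration only):

```python
import pandas as pd

# hypothetical labels in the same style as the AgeGroup column
toy = pd.Series(['5yo', '7yo', 'adult', '5yo'])
cat = pd.Categorical(toy)
print(list(cat.categories))  # ['5yo', '7yo', 'adult']
print(list(cat.codes))       # [0, 1, 2, 0]
```

Each unique string becomes an integer code, which is what the classifiers will actually see.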
For a better intuition, we're also going to visualize the labels and their distribution:
fig = px.histogram(labels, marginal='box', template='plotly_white')
fig.update_layout(showlegend=False, width=800, height=600)
init_notebook_mode(connected=True)
fig.show()
And we're ready to create our machine learning analysis pipeline using scikit-learn, within which we will scale our input data, train a random forest, and test its predictive performance. We import the required functions and classes:
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_validate, cross_val_score
and set up a scikit-learn pipeline. Bundling the scaler into the pipeline matters: during cross-validation, the scaling parameters are then estimated on the training folds only, so no information from the test fold leaks into preprocessing:
pipe = make_pipeline(
StandardScaler(),
RandomForestClassifier()
)
That's all we need to run the analysis, computing accuracy and (treating the ordered age groups as integer codes) mean absolute error:
acc_val_rf = cross_validate(pipe, data, pd.Categorical(labels).codes, cv=10, return_estimator=True)
acc_rf = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=10)
mae_rf = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=10,
                         scoring='neg_mean_absolute_error')
which we can then inspect for each CV fold:
for i in range(10):
print(
'Fold {} -- Acc = {}, MAE = {}'.format(i, np.round(acc_rf[i], 3), np.round(-mae_rf[i], 3))
)
Fold 0 -- Acc = 0.438, MAE = 1.062
Fold 1 -- Acc = 0.562, MAE = 0.875
Fold 2 -- Acc = 0.5, MAE = 1.5
Fold 3 -- Acc = 0.562, MAE = 1.062
Fold 4 -- Acc = 0.562, MAE = 1.188
Fold 5 -- Acc = 0.533, MAE = 1.267
Fold 6 -- Acc = 0.6, MAE = 0.8
Fold 7 -- Acc = 0.667, MAE = 0.933
Fold 8 -- Acc = 0.467, MAE = 0.533
Fold 9 -- Acc = 0.4, MAE = 1.333
and overall:
print('Accuracy = {}, MAE = {}, Chance = {}'.format(np.round(np.mean(acc_rf), 3),
np.round(np.mean(-mae_rf), 3),
np.round(1/len(labels.unique()), 3)))
Accuracy = 0.529, MAE = 1.055, Chance = 0.167
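One caveat before celebrating: the chance level of 1/6 assumes balanced classes, but the label summary above showed the most frequent group holding 34 of 155 participants, so a classifier that always guesses the majority class already reaches about 0.22. A more honest baseline can be checked with a dummy classifier; here is a sketch on hypothetical labels with that same imbalance (the exact per-class counts besides 34/155 are made up for illustration):

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# hypothetical imbalanced labels: 6 groups, the largest with 34 of 155 samples
y = np.repeat([0, 1, 2, 3, 4, 5], [34, 28, 26, 24, 22, 21])
X = rng.normal(size=(len(y), 5))  # features are irrelevant to the dummy

# always predicts the most frequent class seen during training
baseline = cross_val_score(DummyClassifier(strategy='most_frequent'), X, y, cv=10)
print(np.round(baseline.mean(), 3))  # ~0.219, noticeably above 1/6
```

Our random forest still clears this bar comfortably, but the majority-class baseline is the fairer comparison.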
That's a pretty good performance, eh? The amazing power of machine learning! But there's more: you can also try out an ANN (artificial neural network) to see if it provides even better predictions.
That's as easy as the "basic machine learning pipeline". Just import the respective functions and classes:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
Define a simple ANN with 4 layers:
model = keras.Sequential()
model.add(layers.Input(shape=data[1].shape))
model.add(layers.Dense(100, activation="relu", kernel_initializer='he_normal', bias_initializer='zeros'))
model.add(layers.BatchNormalization())
model.add(layers.Dropout(0.5))
model.add(layers.Dense(50, activation="relu"))
model.add(layers.BatchNormalization())
model.add(layers.Dropout(0.5))
model.add(layers.Dense(25, activation="relu"))
model.add(layers.BatchNormalization())
model.add(layers.Dropout(0.5))
model.add(layers.Dense(len(labels.unique()), activation='softmax'))
model.compile(loss='sparse_categorical_crossentropy',
optimizer='adam',
metrics=['accuracy'])
Split the data into train and test sets:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(data, pd.Categorical(labels).codes, test_size=0.2, shuffle=True)
and train your model:
%time fit = model.fit(X_train, y_train, epochs=300, batch_size=20, validation_split=0.2)
Epoch 1/300
5/5 ━━━━━━━━━━━━━━━━━━━━ 5s 123ms/step - accuracy: 0.1772 - loss: 2.7513 - val_accuracy: 0.0800 - val_loss: 1.7977
Epoch 2/300
5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.1637 - loss: 2.6742 - val_accuracy: 0.1600 - val_loss: 1.8318
Epoch 3/300
5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.1447 - loss: 2.4660 - val_accuracy: 0.2000 - val_loss: 1.8271
...
(output truncated: over the remaining epochs, training accuracy climbs towards 1.0 while validation accuracy plateaus around 0.55-0.70 and the validation loss, after bottoming out, starts rising again)
- val_accuracy: 0.6800 - val_loss: 1.0265 Epoch 236/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9601 - loss: 0.1325 - val_accuracy: 0.6400 - val_loss: 0.9242 Epoch 237/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 70ms/step - accuracy: 0.9775 - loss: 0.1257 - val_accuracy: 0.6400 - val_loss: 0.8814 Epoch 238/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9822 - loss: 0.1151 - val_accuracy: 0.6400 - val_loss: 0.8611 Epoch 239/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9601 - loss: 0.1137 - val_accuracy: 0.6400 - val_loss: 0.8620 Epoch 240/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9366 - loss: 0.1873 - val_accuracy: 0.6800 - val_loss: 0.8792 Epoch 241/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9223 - loss: 0.2301 - val_accuracy: 0.6400 - val_loss: 0.9009 Epoch 242/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9635 - loss: 0.1440 - val_accuracy: 0.6400 - val_loss: 0.9072 Epoch 243/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9876 - loss: 0.1451 - val_accuracy: 0.6400 - val_loss: 0.9101 Epoch 244/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9733 - loss: 0.1613 - val_accuracy: 0.6000 - val_loss: 0.8990 Epoch 245/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - accuracy: 0.9504 - loss: 0.1472 - val_accuracy: 0.6000 - val_loss: 0.8861 Epoch 246/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - accuracy: 0.9752 - loss: 0.1231 - val_accuracy: 0.6400 - val_loss: 0.8603 Epoch 247/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - accuracy: 0.9254 - loss: 0.2222 - val_accuracy: 0.6800 - val_loss: 0.8459 Epoch 248/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9256 - loss: 0.2066 - val_accuracy: 0.6800 - val_loss: 0.8472 Epoch 249/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9492 - loss: 0.2036 - val_accuracy: 0.7200 - val_loss: 0.8513 Epoch 250/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9891 - loss: 0.0615 - val_accuracy: 0.7200 - val_loss: 0.8615 Epoch 251/300 5/5 
━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 1.0000 - loss: 0.1354 - val_accuracy: 0.6400 - val_loss: 0.8754 Epoch 252/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9408 - loss: 0.1365 - val_accuracy: 0.6800 - val_loss: 0.8683 Epoch 253/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9754 - loss: 0.1071 - val_accuracy: 0.6800 - val_loss: 0.8571 Epoch 254/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9863 - loss: 0.1441 - val_accuracy: 0.6800 - val_loss: 0.8248 Epoch 255/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9793 - loss: 0.0837 - val_accuracy: 0.6800 - val_loss: 0.8185 Epoch 256/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9601 - loss: 0.1215 - val_accuracy: 0.6800 - val_loss: 0.8277 Epoch 257/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9739 - loss: 0.1031 - val_accuracy: 0.6400 - val_loss: 0.8363 Epoch 258/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9760 - loss: 0.1191 - val_accuracy: 0.6800 - val_loss: 0.8115 Epoch 259/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9966 - loss: 0.0838 - val_accuracy: 0.7200 - val_loss: 0.7921 Epoch 260/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9850 - loss: 0.0806 - val_accuracy: 0.7600 - val_loss: 0.7687 Epoch 261/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9669 - loss: 0.1452 - val_accuracy: 0.8000 - val_loss: 0.7469 Epoch 262/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9945 - loss: 0.0806 - val_accuracy: 0.7600 - val_loss: 0.7396 Epoch 263/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9180 - loss: 0.2361 - val_accuracy: 0.7200 - val_loss: 0.7498 Epoch 264/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9809 - loss: 0.1068 - val_accuracy: 0.7200 - val_loss: 0.7608 Epoch 265/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9884 - loss: 0.1189 - val_accuracy: 0.6800 - val_loss: 0.7716 Epoch 266/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.9891 - loss: 0.0939 - 
val_accuracy: 0.6800 - val_loss: 0.8052 Epoch 267/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9586 - loss: 0.1093 - val_accuracy: 0.6800 - val_loss: 0.8128 Epoch 268/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9581 - loss: 0.1308 - val_accuracy: 0.6800 - val_loss: 0.8458 Epoch 269/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9490 - loss: 0.1269 - val_accuracy: 0.6800 - val_loss: 0.8428 Epoch 270/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9629 - loss: 0.0915 - val_accuracy: 0.6800 - val_loss: 0.8552 Epoch 271/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9574 - loss: 0.1456 - val_accuracy: 0.6800 - val_loss: 0.9069 Epoch 272/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9594 - loss: 0.0873 - val_accuracy: 0.6800 - val_loss: 0.9523 Epoch 273/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9793 - loss: 0.1129 - val_accuracy: 0.6800 - val_loss: 0.9951 Epoch 274/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9876 - loss: 0.0748 - val_accuracy: 0.6800 - val_loss: 1.0009 Epoch 275/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9878 - loss: 0.0679 - val_accuracy: 0.6800 - val_loss: 0.9950 Epoch 276/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9878 - loss: 0.0949 - val_accuracy: 0.6800 - val_loss: 0.9799 Epoch 277/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 72ms/step - accuracy: 0.9793 - loss: 0.0665 - val_accuracy: 0.6800 - val_loss: 0.9296 Epoch 278/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9714 - loss: 0.1172 - val_accuracy: 0.6800 - val_loss: 0.9079 Epoch 279/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9532 - loss: 0.1376 - val_accuracy: 0.6800 - val_loss: 0.9363 Epoch 280/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9830 - loss: 0.0943 - val_accuracy: 0.6800 - val_loss: 0.9509 Epoch 281/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9794 - loss: 0.0784 - val_accuracy: 0.6800 - val_loss: 0.9197 Epoch 282/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 
0s 26ms/step - accuracy: 0.9643 - loss: 0.1283 - val_accuracy: 0.6800 - val_loss: 0.9123 Epoch 283/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9876 - loss: 0.1319 - val_accuracy: 0.6800 - val_loss: 0.9163 Epoch 284/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9160 - loss: 0.1717 - val_accuracy: 0.6800 - val_loss: 0.9543 Epoch 285/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9918 - loss: 0.0358 - val_accuracy: 0.6400 - val_loss: 0.9807 Epoch 286/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.9850 - loss: 0.0692 - val_accuracy: 0.6400 - val_loss: 0.9801 Epoch 287/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9656 - loss: 0.1161 - val_accuracy: 0.6800 - val_loss: 0.9725 Epoch 288/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - accuracy: 0.9534 - loss: 0.1510 - val_accuracy: 0.6800 - val_loss: 0.9662 Epoch 289/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - accuracy: 0.9781 - loss: 0.0988 - val_accuracy: 0.7200 - val_loss: 0.9126 Epoch 290/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9768 - loss: 0.0931 - val_accuracy: 0.7200 - val_loss: 0.9003 Epoch 291/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9484 - loss: 0.1763 - val_accuracy: 0.7200 - val_loss: 0.9239 Epoch 292/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 1.0000 - loss: 0.0597 - val_accuracy: 0.6400 - val_loss: 1.1420 Epoch 293/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9788 - loss: 0.0933 - val_accuracy: 0.6400 - val_loss: 1.2226 Epoch 294/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9615 - loss: 0.1950 - val_accuracy: 0.6400 - val_loss: 1.2819 Epoch 295/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9656 - loss: 0.0880 - val_accuracy: 0.6400 - val_loss: 1.3293 Epoch 296/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - accuracy: 0.9733 - loss: 0.1183 - val_accuracy: 0.6000 - val_loss: 1.3543 Epoch 297/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - accuracy: 0.9788 - loss: 0.0733 - val_accuracy: 0.6400 - 
val_loss: 1.3087 Epoch 298/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - accuracy: 0.9503 - loss: 0.1295 - val_accuracy: 0.6400 - val_loss: 1.2352 Epoch 299/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.9407 - loss: 0.1359 - val_accuracy: 0.6400 - val_loss: 1.1712 Epoch 300/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.9614 - loss: 0.1265 - val_accuracy: 0.6400 - val_loss: 1.1698 CPU times: user 54.6 s, sys: 9.17 s, total: 1min 3s Wall time: 56.2 s 5/5 [==============================] - 0s 11ms/step - loss: 0.6206 - accuracy: 0.8182 - val_loss: 1.1899 - val_accuracy: 0.5200 Epoch 60/300 5/5 [==============================] - 0s 10ms/step - loss: 0.6564 - accuracy: 0.7576 - val_loss: 1.1962 - val_accuracy: 0.5200 Epoch 61/300 5/5 [==============================] - 0s 30ms/step - loss: 0.7866 - accuracy: 0.7172 - val_loss: 1.1957 - val_accuracy: 0.5200 Epoch 62/300 5/5 [==============================] - 0s 10ms/step - loss: 0.6576 - accuracy: 0.7273 - val_loss: 1.2127 - val_accuracy: 0.5200 Epoch 63/300 5/5 [==============================] - 0s 10ms/step - loss: 0.5989 - accuracy: 0.7879 - val_loss: 1.2209 - val_accuracy: 0.5200 Epoch 64/300 5/5 [==============================] - 0s 10ms/step - loss: 0.5561 - accuracy: 0.8283 - val_loss: 1.2014 - val_accuracy: 0.5200 Epoch 65/300 5/5 [==============================] - 0s 10ms/step - loss: 0.5866 - accuracy: 0.8485 - val_loss: 1.1845 - val_accuracy: 0.5200 Epoch 66/300 5/5 [==============================] - 0s 9ms/step - loss: 0.5624 - accuracy: 0.7879 - val_loss: 1.1667 - val_accuracy: 0.5200 Epoch 67/300 5/5 [==============================] - 0s 10ms/step - loss: 0.5939 - accuracy: 0.8182 - val_loss: 1.1534 - val_accuracy: 0.5200 Epoch 68/300 5/5 [==============================] - 0s 10ms/step - loss: 0.6402 - accuracy: 0.7576 - val_loss: 1.1429 - val_accuracy: 0.5200 Epoch 69/300 5/5 [==============================] - 0s 9ms/step - loss: 0.6282 - accuracy: 0.8182 - val_loss: 1.1326 - val_accuracy: 
0.5600 Epoch 70/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4443 - accuracy: 0.8788 - val_loss: 1.1170 - val_accuracy: 0.5600 Epoch 71/300 5/5 [==============================] - 0s 9ms/step - loss: 0.4632 - accuracy: 0.8485 - val_loss: 1.1055 - val_accuracy: 0.5600 Epoch 72/300 5/5 [==============================] - 0s 10ms/step - loss: 0.5103 - accuracy: 0.8485 - val_loss: 1.0968 - val_accuracy: 0.5600 Epoch 73/300 5/5 [==============================] - 0s 15ms/step - loss: 0.4771 - accuracy: 0.8889 - val_loss: 1.0730 - val_accuracy: 0.6000 Epoch 74/300 5/5 [==============================] - 0s 10ms/step - loss: 0.5252 - accuracy: 0.8485 - val_loss: 1.0790 - val_accuracy: 0.6400 Epoch 75/300 5/5 [==============================] - 0s 28ms/step - loss: 0.5182 - accuracy: 0.8384 - val_loss: 1.0909 - val_accuracy: 0.6000 Epoch 76/300 5/5 [==============================] - 0s 11ms/step - loss: 0.3781 - accuracy: 0.9091 - val_loss: 1.1057 - val_accuracy: 0.6000 Epoch 77/300 5/5 [==============================] - 0s 16ms/step - loss: 0.4580 - accuracy: 0.8384 - val_loss: 1.1166 - val_accuracy: 0.6000 Epoch 78/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4712 - accuracy: 0.8788 - val_loss: 1.1395 - val_accuracy: 0.6000 Epoch 79/300 5/5 [==============================] - 0s 18ms/step - loss: 0.4269 - accuracy: 0.8586 - val_loss: 1.1721 - val_accuracy: 0.5200 Epoch 80/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4910 - accuracy: 0.7980 - val_loss: 1.1820 - val_accuracy: 0.5200 Epoch 81/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4699 - accuracy: 0.8687 - val_loss: 1.1861 - val_accuracy: 0.5600 Epoch 82/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4194 - accuracy: 0.8788 - val_loss: 1.1940 - val_accuracy: 0.5600 Epoch 83/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4095 - accuracy: 0.8687 - val_loss: 1.1971 - val_accuracy: 0.5600 Epoch 84/300 5/5 
[==============================] - 0s 9ms/step - loss: 0.4048 - accuracy: 0.8788 - val_loss: 1.1910 - val_accuracy: 0.5600 Epoch 85/300 5/5 [==============================] - 0s 11ms/step - loss: 0.5509 - accuracy: 0.8081 - val_loss: 1.1904 - val_accuracy: 0.4800 Epoch 86/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4884 - accuracy: 0.8283 - val_loss: 1.1950 - val_accuracy: 0.4800 Epoch 87/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4196 - accuracy: 0.8889 - val_loss: 1.1899 - val_accuracy: 0.5200 Epoch 88/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4157 - accuracy: 0.8485 - val_loss: 1.2238 - val_accuracy: 0.5200 Epoch 89/300 5/5 [==============================] - 0s 9ms/step - loss: 0.4900 - accuracy: 0.8485 - val_loss: 1.2246 - val_accuracy: 0.5200 Epoch 90/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3359 - accuracy: 0.9091 - val_loss: 1.2065 - val_accuracy: 0.5200 Epoch 91/300 5/5 [==============================] - 0s 9ms/step - loss: 0.4218 - accuracy: 0.8788 - val_loss: 1.2021 - val_accuracy: 0.5200 Epoch 92/300 5/5 [==============================] - 0s 9ms/step - loss: 0.5559 - accuracy: 0.8081 - val_loss: 1.2235 - val_accuracy: 0.5200 Epoch 93/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4268 - accuracy: 0.8485 - val_loss: 1.2393 - val_accuracy: 0.5200 Epoch 94/300 5/5 [==============================] - 0s 9ms/step - loss: 0.3674 - accuracy: 0.9293 - val_loss: 1.2544 - val_accuracy: 0.5200 Epoch 95/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3332 - accuracy: 0.9192 - val_loss: 1.2941 - val_accuracy: 0.5200 Epoch 96/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3505 - accuracy: 0.8990 - val_loss: 1.3192 - val_accuracy: 0.5200 Epoch 97/300 5/5 [==============================] - 0s 9ms/step - loss: 0.3442 - accuracy: 0.9091 - val_loss: 1.3132 - val_accuracy: 0.5200 Epoch 98/300 5/5 
[==============================] - 0s 10ms/step - loss: 0.2849 - accuracy: 0.9495 - val_loss: 1.2946 - val_accuracy: 0.5200 Epoch 99/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4299 - accuracy: 0.8687 - val_loss: 1.2577 - val_accuracy: 0.5200 Epoch 100/300 5/5 [==============================] - 0s 36ms/step - loss: 0.3711 - accuracy: 0.8788 - val_loss: 1.2366 - val_accuracy: 0.5200 Epoch 101/300 5/5 [==============================] - 0s 10ms/step - loss: 0.4541 - accuracy: 0.8485 - val_loss: 1.2062 - val_accuracy: 0.5200 Epoch 102/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3570 - accuracy: 0.8687 - val_loss: 1.1883 - val_accuracy: 0.5200 Epoch 103/300 5/5 [==============================] - 0s 9ms/step - loss: 0.3579 - accuracy: 0.8990 - val_loss: 1.1813 - val_accuracy: 0.5200 Epoch 104/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2861 - accuracy: 0.9293 - val_loss: 1.1807 - val_accuracy: 0.5200 Epoch 105/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3789 - accuracy: 0.8586 - val_loss: 1.1829 - val_accuracy: 0.5200 Epoch 106/300 5/5 [==============================] - 0s 9ms/step - loss: 0.3457 - accuracy: 0.8990 - val_loss: 1.2079 - val_accuracy: 0.5200 Epoch 107/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2590 - accuracy: 0.9394 - val_loss: 1.2311 - val_accuracy: 0.5200 Epoch 108/300 5/5 [==============================] - 0s 11ms/step - loss: 0.3527 - accuracy: 0.9091 - val_loss: 1.2570 - val_accuracy: 0.5200 Epoch 109/300 5/5 [==============================] - 0s 9ms/step - loss: 0.3358 - accuracy: 0.8889 - val_loss: 1.2947 - val_accuracy: 0.5200 Epoch 110/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2534 - accuracy: 0.9394 - val_loss: 1.3408 - val_accuracy: 0.5200 Epoch 111/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2463 - accuracy: 0.9394 - val_loss: 1.3770 - val_accuracy: 0.5200 Epoch 112/300 5/5 
[==============================] - 0s 10ms/step - loss: 0.3498 - accuracy: 0.9091 - val_loss: 1.3980 - val_accuracy: 0.5200 Epoch 113/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3142 - accuracy: 0.9192 - val_loss: 1.3979 - val_accuracy: 0.5200 Epoch 114/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2258 - accuracy: 0.9596 - val_loss: 1.3949 - val_accuracy: 0.5200 Epoch 115/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3503 - accuracy: 0.9192 - val_loss: 1.3928 - val_accuracy: 0.5200 Epoch 116/300 5/5 [==============================] - 0s 9ms/step - loss: 0.3959 - accuracy: 0.8990 - val_loss: 1.3936 - val_accuracy: 0.5200 Epoch 117/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3461 - accuracy: 0.9091 - val_loss: 1.4017 - val_accuracy: 0.5200 Epoch 118/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2749 - accuracy: 0.9293 - val_loss: 1.4229 - val_accuracy: 0.5200 Epoch 119/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2717 - accuracy: 0.9293 - val_loss: 1.4275 - val_accuracy: 0.5200 Epoch 120/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2813 - accuracy: 0.9293 - val_loss: 1.4212 - val_accuracy: 0.5200 Epoch 121/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2984 - accuracy: 0.8889 - val_loss: 1.4083 - val_accuracy: 0.5200 Epoch 122/300 5/5 [==============================] - 0s 12ms/step - loss: 0.3248 - accuracy: 0.9192 - val_loss: 1.3932 - val_accuracy: 0.5200 Epoch 123/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2410 - accuracy: 0.9394 - val_loss: 1.3802 - val_accuracy: 0.5200 Epoch 124/300 5/5 [==============================] - 0s 10ms/step - loss: 0.3152 - accuracy: 0.8889 - val_loss: 1.3707 - val_accuracy: 0.5200 Epoch 125/300 5/5 [==============================] - 0s 11ms/step - loss: 0.3380 - accuracy: 0.8889 - val_loss: 1.3422 - val_accuracy: 0.5200 Epoch 126/300 5/5 
[==============================] - 0s 11ms/step - loss: 0.2511 - accuracy: 0.9192 - val_loss: 1.3290 - val_accuracy: 0.5200 Epoch 127/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2840 - accuracy: 0.9192 - val_loss: 1.3449 - val_accuracy: 0.5200 Epoch 128/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2495 - accuracy: 0.9495 - val_loss: 1.3590 - val_accuracy: 0.5200 Epoch 129/300 5/5 [==============================] - 0s 14ms/step - loss: 0.2076 - accuracy: 0.9495 - val_loss: 1.3600 - val_accuracy: 0.5200 Epoch 130/300 5/5 [==============================] - 0s 14ms/step - loss: 0.2681 - accuracy: 0.9394 - val_loss: 1.3473 - val_accuracy: 0.5200 Epoch 131/300 5/5 [==============================] - 0s 16ms/step - loss: 0.2488 - accuracy: 0.9293 - val_loss: 1.3310 - val_accuracy: 0.5200 Epoch 132/300 5/5 [==============================] - 0s 17ms/step - loss: 0.2132 - accuracy: 0.9596 - val_loss: 1.3315 - val_accuracy: 0.5200 Epoch 133/300 5/5 [==============================] - 0s 14ms/step - loss: 0.2855 - accuracy: 0.9192 - val_loss: 1.3284 - val_accuracy: 0.5200 Epoch 134/300 5/5 [==============================] - 0s 56ms/step - loss: 0.2401 - accuracy: 0.9495 - val_loss: 1.3312 - val_accuracy: 0.5200 Epoch 135/300 5/5 [==============================] - 0s 17ms/step - loss: 0.3005 - accuracy: 0.8990 - val_loss: 1.3492 - val_accuracy: 0.5200 Epoch 136/300 5/5 [==============================] - 0s 16ms/step - loss: 0.2757 - accuracy: 0.9192 - val_loss: 1.3810 - val_accuracy: 0.5200 Epoch 137/300 5/5 [==============================] - 0s 31ms/step - loss: 0.1777 - accuracy: 0.9697 - val_loss: 1.4349 - val_accuracy: 0.5200 Epoch 138/300 5/5 [==============================] - 0s 15ms/step - loss: 0.2364 - accuracy: 0.9596 - val_loss: 1.4826 - val_accuracy: 0.5200 Epoch 139/300 5/5 [==============================] - 0s 16ms/step - loss: 0.1990 - accuracy: 0.9798 - val_loss: 1.5212 - val_accuracy: 0.5200 Epoch 140/300 5/5 
[==============================] - 0s 11ms/step - loss: 0.2061 - accuracy: 0.9293 - val_loss: 1.5476 - val_accuracy: 0.5200 Epoch 141/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2731 - accuracy: 0.9192 - val_loss: 1.5597 - val_accuracy: 0.4800 Epoch 142/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2449 - accuracy: 0.9394 - val_loss: 1.5239 - val_accuracy: 0.4800 Epoch 143/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1948 - accuracy: 0.9596 - val_loss: 1.5008 - val_accuracy: 0.4800 Epoch 144/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1962 - accuracy: 0.9596 - val_loss: 1.4953 - val_accuracy: 0.4800 Epoch 145/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2167 - accuracy: 0.9192 - val_loss: 1.4841 - val_accuracy: 0.4800 Epoch 146/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2214 - accuracy: 0.9495 - val_loss: 1.4630 - val_accuracy: 0.4800 Epoch 147/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2975 - accuracy: 0.9192 - val_loss: 1.4711 - val_accuracy: 0.4800 Epoch 148/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2330 - accuracy: 0.9495 - val_loss: 1.4720 - val_accuracy: 0.4800 Epoch 149/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2274 - accuracy: 0.9192 - val_loss: 1.4679 - val_accuracy: 0.4800 Epoch 150/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1811 - accuracy: 0.9697 - val_loss: 1.4592 - val_accuracy: 0.4800 Epoch 151/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2102 - accuracy: 0.9596 - val_loss: 1.4627 - val_accuracy: 0.5200 Epoch 152/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2380 - accuracy: 0.9495 - val_loss: 1.4665 - val_accuracy: 0.5200 Epoch 153/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1704 - accuracy: 0.9596 - val_loss: 1.4623 - val_accuracy: 0.5200 Epoch 154/300 5/5 
[==============================] - 0s 10ms/step - loss: 0.1881 - accuracy: 0.9495 - val_loss: 1.4365 - val_accuracy: 0.5200 Epoch 155/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1582 - accuracy: 0.9697 - val_loss: 1.4168 - val_accuracy: 0.5200 Epoch 156/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1781 - accuracy: 0.9394 - val_loss: 1.4047 - val_accuracy: 0.5200 Epoch 157/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2442 - accuracy: 0.8990 - val_loss: 1.3924 - val_accuracy: 0.5200 Epoch 158/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2896 - accuracy: 0.9091 - val_loss: 1.3655 - val_accuracy: 0.5200 Epoch 159/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1905 - accuracy: 0.9596 - val_loss: 1.3695 - val_accuracy: 0.5200 Epoch 160/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1716 - accuracy: 0.9495 - val_loss: 1.4077 - val_accuracy: 0.5200 Epoch 161/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1918 - accuracy: 0.9495 - val_loss: 1.4592 - val_accuracy: 0.5200 Epoch 162/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1977 - accuracy: 0.9495 - val_loss: 1.4918 - val_accuracy: 0.5200 Epoch 163/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1562 - accuracy: 0.9697 - val_loss: 1.4947 - val_accuracy: 0.5200 Epoch 164/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2337 - accuracy: 0.9394 - val_loss: 1.4922 - val_accuracy: 0.5200 Epoch 165/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2967 - accuracy: 0.8788 - val_loss: 1.4810 - val_accuracy: 0.5200 Epoch 166/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2716 - accuracy: 0.9091 - val_loss: 1.4867 - val_accuracy: 0.5200 Epoch 167/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1742 - accuracy: 0.9697 - val_loss: 1.4898 - val_accuracy: 0.5200 Epoch 168/300 5/5 
[==============================] - 0s 11ms/step - loss: 0.1361 - accuracy: 0.9798 - val_loss: 1.5008 - val_accuracy: 0.5200 Epoch 169/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1318 - accuracy: 0.9697 - val_loss: 1.4963 - val_accuracy: 0.5200 Epoch 170/300 5/5 [==============================] - 0s 9ms/step - loss: 0.2049 - accuracy: 0.9697 - val_loss: 1.5031 - val_accuracy: 0.5200 Epoch 171/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1784 - accuracy: 0.9596 - val_loss: 1.5263 - val_accuracy: 0.5200 Epoch 172/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1218 - accuracy: 0.9697 - val_loss: 1.5500 - val_accuracy: 0.5200 Epoch 173/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2701 - accuracy: 0.9091 - val_loss: 1.5800 - val_accuracy: 0.5200 Epoch 174/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1305 - accuracy: 0.9899 - val_loss: 1.5986 - val_accuracy: 0.5200 Epoch 175/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1078 - accuracy: 0.9899 - val_loss: 1.5982 - val_accuracy: 0.4800 Epoch 176/300 5/5 [==============================] - 0s 31ms/step - loss: 0.1492 - accuracy: 0.9495 - val_loss: 1.5742 - val_accuracy: 0.4800 Epoch 177/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2203 - accuracy: 0.9293 - val_loss: 1.5666 - val_accuracy: 0.4800 Epoch 178/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1498 - accuracy: 0.9596 - val_loss: 1.5672 - val_accuracy: 0.4800 Epoch 179/300 5/5 [==============================] - 0s 12ms/step - loss: 0.2119 - accuracy: 0.9394 - val_loss: 1.5702 - val_accuracy: 0.4800 Epoch 180/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2052 - accuracy: 0.9394 - val_loss: 1.6045 - val_accuracy: 0.4800 Epoch 181/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1686 - accuracy: 0.9596 - val_loss: 1.6179 - val_accuracy: 0.4800 Epoch 182/300 5/5 
[==============================] - 0s 11ms/step - loss: 0.1190 - accuracy: 0.9899 - val_loss: 1.6063 - val_accuracy: 0.4800 Epoch 183/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1796 - accuracy: 0.9697 - val_loss: 1.6009 - val_accuracy: 0.4800 Epoch 184/300 5/5 [==============================] - 0s 11ms/step - loss: 0.2071 - accuracy: 0.9394 - val_loss: 1.6223 - val_accuracy: 0.4800 Epoch 185/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1907 - accuracy: 0.9495 - val_loss: 1.6199 - val_accuracy: 0.4800 Epoch 186/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1701 - accuracy: 0.9697 - val_loss: 1.5943 - val_accuracy: 0.4800 Epoch 187/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1242 - accuracy: 0.9798 - val_loss: 1.5534 - val_accuracy: 0.4800 Epoch 188/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1074 - accuracy: 0.9899 - val_loss: 1.5380 - val_accuracy: 0.4800 Epoch 189/300 5/5 [==============================] - 0s 9ms/step - loss: 0.1133 - accuracy: 0.9798 - val_loss: 1.5176 - val_accuracy: 0.4800 Epoch 190/300 5/5 [==============================] - 0s 10ms/step - loss: 0.2135 - accuracy: 0.9192 - val_loss: 1.5239 - val_accuracy: 0.4800 Epoch 191/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1153 - accuracy: 1.0000 - val_loss: 1.5022 - val_accuracy: 0.4800 Epoch 192/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1657 - accuracy: 0.9697 - val_loss: 1.4831 - val_accuracy: 0.4800 Epoch 193/300 5/5 [==============================] - 0s 12ms/step - loss: 0.1709 - accuracy: 0.9596 - val_loss: 1.5013 - val_accuracy: 0.4800 Epoch 194/300 5/5 [==============================] - 0s 12ms/step - loss: 0.1047 - accuracy: 0.9697 - val_loss: 1.5316 - val_accuracy: 0.4800 Epoch 195/300 5/5 [==============================] - 0s 11ms/step - loss: 0.1890 - accuracy: 0.9394 - val_loss: 1.5205 - val_accuracy: 0.4800 Epoch 196/300 5/5 
[==============================] - 0s 15ms/step - loss: 0.1211 - accuracy: 0.9899 - val_loss: 1.4993 - val_accuracy: 0.4800 Epoch 197/300 5/5 [output for epochs 197-299 truncated] Epoch 300/300 5/5 [==============================] - 0s 10ms/step - loss: 0.1005 - accuracy: 0.9798 - val_loss: 2.0018 - val_accuracy: 0.4800 CPU times: user 20.7 s, sys: 3.82 s, total: 24.5 s Wall time: 17.3 s
That's it! After some quick exploration of the learning success:
import plotly.graph_objects as go
epoch = np.arange(300) + 1
fig = go.Figure()
# Add traces
fig.add_trace(go.Scatter(x=epoch, y=fit.history['accuracy'],
                         mode='lines+markers',
                         name='training set'))
fig.add_trace(go.Scatter(x=epoch, y=fit.history['val_accuracy'],
                         mode='lines+markers',
                         name='validation set'))
fig.update_layout(title="Accuracy in training and validation set",
                  template='plotly_white')
fig.update_xaxes(title_text='Epoch')
fig.update_yaxes(title_text='Accuracy')
fig.show()
#plot(fig, filename = 'acc_eyes.html')
#display(HTML('acc_eyes.html'))
you evaluate it on the test set:
score_ann, acc_ann = model.evaluate(X_test, y_test,
                                    batch_size=2)
print('Test score:', score_ann)
print('Test accuracy:', acc_ann)
16/16 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.5005 - loss: 1.7891 Test score: 1.9560185670852661 Test accuracy: 0.4838709533214569
Amazing, you did it, everything is good... or is it?
Your PI is so happy about these fantastic results that now everyone in the lab should run machine learning analyses. Thus, you are ordered to give everyone in the lab your analysis script, and a few colleagues immediately try it out using the exact same version of the script and data. (I actually asked 6 real people to run it.)
However, all of them get different results...

for the random forest:
Accuracy: 0.52, 0.53, 0.55, 0.58, 0.55
MAE: 1.11, 1.10, 1.12, 1.05, 1.04

and for the deep learning analysis, i.e. the ANN:
Test score: 1.55, 1.44, 1.65, 1.54, 0.83
Test accuracy: 0.55, 0.45, 0.55, 0.55, 0.74
Shocked, you rerun your old analyses and it gets worse: your results also differ from the previous run.
print('previous Accuracy = {}, previous MAE = {}, previous Chance = {}'.format(np.round(np.mean(acc_rf), 3),
                                                                               np.round(np.mean(-mae_rf), 3),
                                                                               np.round(1/len(labels.unique()), 3)))
acc = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=10)
mae = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=10,
                      scoring='neg_mean_absolute_error')
print('Accuracy = {}, MAE = {}, Chance = {}'.format(np.round(np.mean(acc), 3),
                                                    np.round(np.mean(-mae), 3),
                                                    np.round(1/len(labels.unique()), 3)))
previous Accuracy = 0.529, previous MAE = 1.055, previous Chance = 0.167 Accuracy = 0.542, MAE = 1.007, Chance = 0.167
# score, acc = model.evaluate(X_test, y_test,
#                             batch_size=2)
print('previous Test score:', score_ann)
print('previous Test accuracy:', acc_ann)
fit = model.fit(X_train, y_train, epochs=300, batch_size=20, validation_split=0.2, verbose=0)
score, acc = model.evaluate(X_test, y_test,
                            batch_size=2)
print('Test score:', score)
print('Test accuracy:', acc)
previous Test score: 1.9560185670852661 previous Test accuracy: 0.4838709533214569 16/16 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.5308 - loss: 2.4836 Test score: 2.6159985065460205 Test accuracy: 0.5161290168762207
What is going on...
via https://c.tenor.com/oKay8GcV660AAAAC/ted-dancon-evil-laugh.gif
The inconvenient truth is: as with every other analysis you might run within the field of neuroimaging (or any other research field), there is a substantial number of factors that contribute to the reproducibility of your machine learning analyses. Even more so, a lot of the problems are actually elevated due to the complex nature of the analyses.

But why care about reproducibility in machine learning at all, and how can some of the underlying problems be addressed? Let's have a look...

Why care about reproducibility in machine learning?

We all know the reproducibility crisis in neuroimaging... but we also know that neuroimaging is by far no exception to the rule, as many other, basically all, research fields have comparable problems. This also includes machine learning...
adapted from [Martina Vilas](https://doi.org/10.5281/zenodo.4740053)</small>
Besides obviously being a major shortcoming, as indicated by this quote from Popper (The Logic of Scientific Discovery):

"Non-reproducible single occurrences are of no significance to science."

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
What are crucial aspects when talking about reproducibility in machine learning (with a focus on neuroimaging)?

adapted from [Suneeta Mall](https://suneeta-mall.github.io/talks/KubeCon_US_2019)
Reproducibility supports understanding, explaining, and debugging, and is crucial to reverse engineering.
Why is this important?
machine learning is very difficult to understand, explain and debug

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
Our example:
- our model provided us with the obtained accuracy
- our model "learned" from our data
- some features are more important than others

Our problem(s):
- we get a different accuracy every time we run our model
- we can't tell what our model learned from the data
- we can't tell which features are more important than others

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
Why is this important?
- machine learning is used to make predictions about humans (e.g. group clustering, treatment/therapy plans, disease propagation, etc.) and unreproducible model predictions can lead to devastating errors

Our example:
- our model actually "learned" a way to reproducibly predict the age of participants from their resting state connectome

Our problem(s):
- we can't tell whether our model actually "learned" in a reproducible manner, as the outcome varies from run to run of our model

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
reproducible (machine learning) results are more credible

Why is this important?
- we need FAIR, verifiable, reliable, unbiased & ethical analyses and results
- this equally holds for results from machine learning analyses

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
Our example:
- age can be predicted from resting state connectomes in a verifiable and unbiased way that is ethical and FAIR

Our problem(s):
- as our results vary between runs, they can't be verified

reproducibility helps with extensibility of successive steps in a machine learning analysis
Why is this important?
- successive steps of machine learning analyses (e.g. feature engineering, layers of an ANN, etc.) need to be reproducible so that the respective analyses (or pipeline) can be extended
- without reproducible steps, subsequent steps like different post-processing options and augmentation are not feasible/possible

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
Our example:
- we could extract the features important for the prediction of age from the model
- we could augment our model with new features from a different modality (e.g. DTI, behavior, etc.)

Our problem(s):
- we can't tell whether the set of important features is actually reliable
- we can't sensibly augment our model because its current state produces unreproducible results

reproducibility helps with model training through data generation
Why is this important?
- machine learning analyses need tremendous amounts of data to be trained, evaluated and tested on
- despite better data sharing practices and standardization, as well as quality control, there's actually not enough data for most planned (or even conducted) analyses
- some data is especially rare (e.g. disorders/diseases with a low prevalence and/or broad spectra)
- the generation of synthetic data via machine learning (e.g. GANs) offers amazing possibilities to address this problem
- the generation of synthetic data, and thus the state of the model, needs to be reproducible

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
Our example:
- if we wanted to generate more data based on the one we have (e.g. all data, important features, etc.), the generation of this data should be reproducible
- if we wanted to use our data for synthesizing, the derivation of this data should be reproducible

Our problem(s):
- our model performance, and thus also the derivation of data, is not reproducible
At this point you might think:
via https://c.tenor.com/LOuJtZ-WL3kAAAAC/holy-forking-shirt-shocked.gif
and you would be right: there are so many things to consider and so much that can go wrong.
However, we shouldn't give up quite yet and instead have a look at the underlying problems and how we can address them!
Reproducible machine learning

Here's another inconvenient truth: every single aspect involved in a machine learning analysis creates variability and thus poses a major hurdle towards achieving reproducibility. This concerns not only the computational infrastructure one is working with, but also the data, as well as common practices during the application of these analyses.
adapted from [Suneeta Mall](https://suneeta-mall.github.io/talks/KubeCon_US_2019)
- machine learning algorithms require intensive computation and thus can take a very long time to run
- to speed things up, one relies on parallelism & multiple processing units (CPU), graphics processing units (GPU), tensor processing units (TPU), etc.

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
- reproducibility problems based on the underlying floating-point computations appear independent of the hardware
- parallelism on CPUs, both intra-op & inter-op (within one operation/across multiple operations), can produce different outcomes when run repeatedly
- streaming multiprocessor (SM) units in GPUs, i.e. asynchronous computation, can lead to variable outcomes during repeated runs
- different architectures can also change model outcomes

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
- frameworks for machine learning analyses actually don't guarantee 100% reproducibility, including cudnn, pytorch and tensorflow
- they can also contain bugs which can be hard to find based on their high-level APIs

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
- computation requires a complex stack of software that quite often interacts and has cross-dependencies: operating system, drivers, python, python packages, etc.
- numerical errors & instabilities of the software also matter, e.g. concerning local minima/maxima & floating-point precision again
- such errors may lead unstable functions towards distinct local minima

via https://brilliant.org/wiki/extrema/
print(sum([0.001 for _ in range(1000)]))
1.0000000000000007
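The error above accumulates because each intermediate addition is rounded; an exactly rounded summation such as `math.fsum` avoids it. The non-associativity of floating-point addition is also why summation order (and hence the order of a parallel reduction) can change a result:

```python
import math

vals = [0.001] * 1000

# naive left-to-right summation accumulates rounding error
print(sum(vals))        # 1.0000000000000007

# math.fsum tracks the lost low-order bits and returns the exactly rounded sum
print(math.fsum(vals))  # 1.0

# floating-point addition is not associative, so a different
# (e.g. parallel) summation order can yield a different result:
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))  # False
```

This is one reason why the parallel/asynchronous execution schemes mentioned above can produce run-to-run differences even on identical hardware.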
- machine learning algorithms introduce further problems
- randomness can lead to unreproducible outcomes
- asynchronous computation can lead to non-deterministic results

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
Besides software and algorithms, practices and processes in machine learning also heavily influence the reproducibility of results. A central one is randomness, because it's basically everywhere:
- random initializations
- random augmentations
- random noise introduction
- data shuffles

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
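A minimal sketch of how one might pin these sources of randomness at the top of an analysis script — the seed value is a hypothetical choice, and the tensorflow lines (commented out) assume a recent tensorflow version, to be adapted to the libraries actually in use:

```python
import os
import random

import numpy as np

SEED = 42  # hypothetical seed; any fixed integer works

os.environ['PYTHONHASHSEED'] = str(SEED)  # hash-based operations
random.seed(SEED)                         # python's built-in RNG
np.random.seed(SEED)                      # numpy's global RNG (also used by
                                          # scikit-learn when no explicit
                                          # random_state is passed)

# for the ANN part, tensorflow keeps its own RNGs:
#   import tensorflow as tf
#   tf.keras.utils.set_random_seed(SEED)  # seeds python, numpy & tensorflow at once
#   tf.config.experimental.enable_op_determinism()  # request deterministic ops
```

Seeding removes run-to-run variability on one machine, but note that it does not by itself guarantee identical results across different hardware or library versions.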
Here's just a brief example concerning data shuffling in training sets:
for i in range(100):
    X_train, X_test, y_train, y_test = train_test_split(data, pd.Categorical(labels).codes, test_size=0.2, shuffle=True)
    pd.Series(y_train).plot(kind='hist', stacked=True)
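Fixing the split itself is straightforward: `train_test_split` accepts a `random_state` argument, which makes the shuffle (and hence the class distribution in the training set) identical across runs. The same idea in a numpy-only sketch, with a hypothetical stand-in for the labels:

```python
import numpy as np

# hypothetical stand-in for the 6 age-group labels of the 155 participants
labels = np.repeat(np.arange(6), 26)[:155]

# an unseeded generator shuffles differently on every run ...
unseeded = np.random.default_rng().permutation(labels)

# ... while generators created from the same seed produce the same shuffle
split_a = np.random.default_rng(seed=0).permutation(labels)
split_b = np.random.default_rng(seed=0).permutation(labels)
print((split_a == split_b).all())  # True
```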
- machine learning analyses need tremendous amounts of data, the buzz word is "big data"
- data standardization & management are still not frequently applied even in small datasets, and "big data" poses additional major problems
- for data, especially "big data", various aspects need to be addressed:
  - data management
  - data provenance
  - data poisoning
  - under-/over-represented data

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
- this also concerns other parts of machine learning analyses, e.g. feature engineering, augmentation, etc.
- usually a model is used more than once, which is grounded in two main factors (among others)
- training models can take a very long time as previously outlined and additionally consumes large amounts of resources
- sharing trained models or weights thereof is necessary to reduce these costs and further validate the models in terms of reproducibility, reliability, robustness and generalization
- machine learning is more or less affected by concept drift
- the concept of the world that surrounds us and the things within is constantly changing (e.g. disease/disorder subtypes and progressions, etc.)
- as biological agents we need to make artificial agents aware of this
- in machine learning this process is referred to as continual learning

adapted from [Suneeta Mall](https://suneeta-mall.github.io/2019/12/21/Reproducible-ml-research-n-industry.html)
At this point you might think:
via [GIPHY](https://media2.giphy.com/media/cRMGqNpvm9XS2gRcpL/giphy.gif)
But again: there's something we can do to make it at least slightly better. Thus, let's get to it!
Reproducibility in machine learning

- reproducibility approaches from neuroimaging can also be utilized to increase reproducibility in machine learning
- this thus also entails tools and resources provided by ReproNim and its members, as well as adjacent initiatives
- in more detail we will have a look at the following challenges and respectively helpful aspects:
  - software: virtualization of computing environments using neurodocker
  - algorithm/practices/processes: dealing with randomness
  - data: standardization (BIDS), tracking everything (git/github & datalad), sharing
- ReproNim doesn't offer hardware resources yet, so we have to keep that aside for now

Virtualization of computing environments using neurodocker

A computing environment is composed of many different aspects — operating system, drivers, python, python packages, etc. — and the versions thereof. Obviously, we can't send around our machines via post. So what can we do to address this?
We can make use of virtualization technologies that aim to:
Overall, there are several types of virtualization that are situated across different levels:
- software containers like docker & singularity
- virtual machines like Virtualbox & VMware
As discussed before, machine learning analyses need a lot of computational resources to run in a feasible amount of time. Thus, usually we want to make use of HPCs, which excludes virtual machines from the suitable options, as their handling, management and resource allocation don't really scale.
In contrast, both python virtualization and software containers work reasonably well on HPCs (regarding utilization, management and resource allocation).
The creation, combination and management of both is made incredibly easy and even reproducible via neurodocker!
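As an aside: even without containers, the python level alone can be pinned by exporting the environment to a file that others can recreate with `conda env create -f environment.yml`. A hypothetical `environment.yml`, mirroring the package versions pinned in the neurodocker call below, might look like:

```yaml
# hypothetical environment file; versions match the neurodocker example
name: repronim_ml
channels:
  - conda-forge
dependencies:
  - python=3.11.0
  - numpy=1.26.4
  - pandas=2.3.0
  - scikit-learn=1.7.0
  - seaborn=0.13.2
  - pip
  - pip:
      - tensorflow==2.16.2
```

This pins the python stack, but not the system libraries underneath it — which is exactly the gap containers close.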
- neurodocker is containerized software aimed at the reproducible generation of other containerized software
- it supports python virtualization using conda, docker and singularity, as well as neuroimaging-specific software
You might wonder: "why do we need both"?
- python virtualization does not account for often required libraries and binaries on other levels (beyond python)
- OS-level virtualization, i.e. software containers that share the host system's kernel, can include everything from libraries and binaries upwards

In order to create a software container, here docker, that entails a python environment with all packages in the versions that were used to run the initial analyses, we only need to do the following:
%%bash
docker run kaczmarj/neurodocker:0.6.0 generate docker --base=ubuntu:18.04 \
--pkg-manager=apt \
--install git nano unzip \
--miniconda \
version=latest \
create_env='repronim_ml' \
activate=true \
conda_install="python=3.11.0 numpy=1.26.4 pandas=2.3.0 scikit-learn=1.7.0 seaborn=0.13.2" \
pip_install="tensorflow==2.16.2 datalad[full]==1.2.0" \
--add-to-entrypoint "source activate repronim_ml" \
--entrypoint "/neurodocker/startup.sh python" > Dockerfile
docker build -t peerherholz/repronim_ml:0.2 .
As this takes a while, we can also just grab the generated container from dockerhub:
%%bash
docker pull peerherholz/repronim_ml:0.2
0.2: Pulling from peerherholz/repronim_ml Digest: sha256:6fc095cd8c0904383bdc9d76610d6c366010fa7f7db27a1ea14b3babe10912f2 Status: Image is up to date for peerherholz/repronim_ml:0.2 docker.io/peerherholz/repronim_ml:0.2
With that, we already have our software container ready to go and can run our machine learning analyses in it:
%%bash
docker images
REPOSITORY TAG IMAGE ID CREATED SIZE peerherholz/repronim_ml 0.2 35867f73b6df 4 days ago 3.51GB kaczmarj/neurodocker 0.6.0 7029883696dd 5 years ago 79.3MB
Here, the software container is set up in a way that it automatically runs python, specifically the version from the conda environment we created. Thus, we can simply provide a python script as input.
%%bash
docker run -i --rm -v /Users/peerherholz/google_drive/GitHub/repronim_ML/:/data peerherholz/repronim_ml:0.2 /data/code/ml_reproducibility_data.py
2025-06-17 09:50:22.628547: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used. 2025-06-17 09:50:22.809540: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used. 2025-06-17 09:50:23.030043: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:479] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered 2025-06-17 09:50:23.264731: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:10575] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered 2025-06-17 09:50:23.266191: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1442] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered 2025-06-17 09:50:23.587105: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. 2025-06-17 09:50:28.685347: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT /opt/miniconda-latest/envs/repronim_ml/lib/python3.11/site-packages/keras/src/layers/core/dense.py:93: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead. super().__init__(activity_regularizer=activity_regularizer, **kwargs)
Epoch 1/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 4s 118ms/step - accuracy: 0.1682 - loss: 3.1353 - val_accuracy: 0.1600 - val_loss: 1.7578 Epoch 2/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.2005 - loss: 2.9439 - val_accuracy: 0.2000 - val_loss: 1.7421 [output for further epochs truncated]
0s 33ms/step - accuracy: 0.7455 - loss: 0.6909 - val_accuracy: 0.4800 - val_loss: 1.3741 Epoch 64/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8012 - loss: 0.6807 - val_accuracy: 0.4400 - val_loss: 1.3727 Epoch 65/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8234 - loss: 0.6263 - val_accuracy: 0.4400 - val_loss: 1.3652 Epoch 66/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8446 - loss: 0.6406 - val_accuracy: 0.4800 - val_loss: 1.3648 Epoch 67/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.7894 - loss: 0.6237 - val_accuracy: 0.4800 - val_loss: 1.3598 Epoch 68/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.7871 - loss: 0.6525 - val_accuracy: 0.4800 - val_loss: 1.3772 Epoch 69/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.7192 - loss: 0.6362 - val_accuracy: 0.4800 - val_loss: 1.3707 Epoch 70/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.7759 - loss: 0.6441 - val_accuracy: 0.4400 - val_loss: 1.3630 Epoch 71/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.7707 - loss: 0.6915 - val_accuracy: 0.4400 - val_loss: 1.3684 Epoch 72/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.7626 - loss: 0.6144 - val_accuracy: 0.4400 - val_loss: 1.3707 Epoch 73/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.7994 - loss: 0.6160 - val_accuracy: 0.5200 - val_loss: 1.3724 Epoch 74/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.8162 - loss: 0.5677 - val_accuracy: 0.5200 - val_loss: 1.3842 Epoch 75/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.7634 - loss: 0.6128 - val_accuracy: 0.5200 - val_loss: 1.3903 Epoch 76/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8559 - loss: 0.5351 - val_accuracy: 0.4800 - val_loss: 1.3947 Epoch 77/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.8635 - loss: 0.5372 - val_accuracy: 0.4800 - val_loss: 1.4054 Epoch 78/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.8022 - loss: 0.5249 - val_accuracy: 0.4400 - 
val_loss: 1.4068 Epoch 79/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.7984 - loss: 0.6246 - val_accuracy: 0.4400 - val_loss: 1.3882 Epoch 80/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8045 - loss: 0.6031 - val_accuracy: 0.4400 - val_loss: 1.3914 Epoch 81/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.7959 - loss: 0.5813 - val_accuracy: 0.4400 - val_loss: 1.3988 Epoch 82/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.8330 - loss: 0.5760 - val_accuracy: 0.4400 - val_loss: 1.4185 Epoch 83/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8401 - loss: 0.5935 - val_accuracy: 0.4000 - val_loss: 1.4369 Epoch 84/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8221 - loss: 0.4579 - val_accuracy: 0.4000 - val_loss: 1.4545 Epoch 85/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8813 - loss: 0.4539 - val_accuracy: 0.4800 - val_loss: 1.4667 Epoch 86/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8930 - loss: 0.3409 - val_accuracy: 0.5200 - val_loss: 1.4703 Epoch 87/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8191 - loss: 0.5205 - val_accuracy: 0.5600 - val_loss: 1.4694 Epoch 88/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.8657 - loss: 0.4383 - val_accuracy: 0.5600 - val_loss: 1.4628 Epoch 89/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.8451 - loss: 0.5128 - val_accuracy: 0.5200 - val_loss: 1.4436 Epoch 90/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8497 - loss: 0.4191 - val_accuracy: 0.4800 - val_loss: 1.4380 Epoch 91/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.7938 - loss: 0.5034 - val_accuracy: 0.5200 - val_loss: 1.4310 Epoch 92/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.8123 - loss: 0.5140 - val_accuracy: 0.5600 - val_loss: 1.4223 Epoch 93/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8127 - loss: 0.6025 - val_accuracy: 0.5200 - val_loss: 1.4278 Epoch 94/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - 
accuracy: 0.8427 - loss: 0.3996 - val_accuracy: 0.5200 - val_loss: 1.4311 Epoch 95/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8847 - loss: 0.4000 - val_accuracy: 0.5600 - val_loss: 1.4091 Epoch 96/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9227 - loss: 0.3516 - val_accuracy: 0.5200 - val_loss: 1.3729 Epoch 97/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.8636 - loss: 0.4561 - val_accuracy: 0.5600 - val_loss: 1.3430 Epoch 98/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.8509 - loss: 0.4467 - val_accuracy: 0.5600 - val_loss: 1.3440 Epoch 99/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8350 - loss: 0.4824 - val_accuracy: 0.5600 - val_loss: 1.3528 Epoch 100/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9037 - loss: 0.3843 - val_accuracy: 0.5600 - val_loss: 1.3508 Epoch 101/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8866 - loss: 0.3577 - val_accuracy: 0.5600 - val_loss: 1.3487 Epoch 102/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9031 - loss: 0.3693 - val_accuracy: 0.5200 - val_loss: 1.3680 Epoch 103/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.8684 - loss: 0.4625 - val_accuracy: 0.5200 - val_loss: 1.3880 Epoch 104/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 60ms/step - accuracy: 0.9286 - loss: 0.3267 - val_accuracy: 0.4800 - val_loss: 1.3967 Epoch 105/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.8628 - loss: 0.4724 - val_accuracy: 0.4400 - val_loss: 1.4042 Epoch 106/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.7921 - loss: 0.5085 - val_accuracy: 0.5200 - val_loss: 1.3996 Epoch 107/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9617 - loss: 0.3162 - val_accuracy: 0.5200 - val_loss: 1.3962 Epoch 108/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.8653 - loss: 0.3951 - val_accuracy: 0.5200 - val_loss: 1.3851 Epoch 109/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9519 - loss: 0.3160 - val_accuracy: 0.5200 - val_loss: 
1.3953 Epoch 110/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9395 - loss: 0.2870 - val_accuracy: 0.5200 - val_loss: 1.4044 Epoch 111/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8633 - loss: 0.4702 - val_accuracy: 0.5200 - val_loss: 1.4095 Epoch 112/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9076 - loss: 0.3220 - val_accuracy: 0.5200 - val_loss: 1.4114 Epoch 113/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9416 - loss: 0.3107 - val_accuracy: 0.5200 - val_loss: 1.4087 Epoch 114/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9121 - loss: 0.3656 - val_accuracy: 0.5200 - val_loss: 1.4265 Epoch 115/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9292 - loss: 0.3068 - val_accuracy: 0.5600 - val_loss: 1.4402 Epoch 116/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9109 - loss: 0.2903 - val_accuracy: 0.5600 - val_loss: 1.4421 Epoch 117/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9473 - loss: 0.2135 - val_accuracy: 0.5600 - val_loss: 1.4479 Epoch 118/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9432 - loss: 0.2962 - val_accuracy: 0.5600 - val_loss: 1.4484 Epoch 119/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9199 - loss: 0.2740 - val_accuracy: 0.5600 - val_loss: 1.4531 Epoch 120/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8376 - loss: 0.4256 - val_accuracy: 0.5600 - val_loss: 1.4615 Epoch 121/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8548 - loss: 0.3672 - val_accuracy: 0.5600 - val_loss: 1.4791 Epoch 122/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9335 - loss: 0.2492 - val_accuracy: 0.5600 - val_loss: 1.5080 Epoch 123/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9125 - loss: 0.3021 - val_accuracy: 0.5600 - val_loss: 1.5161 Epoch 124/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9272 - loss: 0.3089 - val_accuracy: 0.5600 - val_loss: 1.4875 Epoch 125/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - 
accuracy: 0.8879 - loss: 0.3350 - val_accuracy: 0.5600 - val_loss: 1.4810 Epoch 126/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9453 - loss: 0.2478 - val_accuracy: 0.6000 - val_loss: 1.5065 Epoch 127/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9067 - loss: 0.3434 - val_accuracy: 0.5600 - val_loss: 1.5503 Epoch 128/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9741 - loss: 0.2326 - val_accuracy: 0.5600 - val_loss: 1.5893 Epoch 129/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9263 - loss: 0.2967 - val_accuracy: 0.5600 - val_loss: 1.6183 Epoch 130/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.8853 - loss: 0.3330 - val_accuracy: 0.5600 - val_loss: 1.6401 Epoch 131/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9230 - loss: 0.2707 - val_accuracy: 0.5600 - val_loss: 1.6607 Epoch 132/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9590 - loss: 0.2739 - val_accuracy: 0.5600 - val_loss: 1.6751 Epoch 133/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 29ms/step - accuracy: 0.8976 - loss: 0.2240 - val_accuracy: 0.5200 - val_loss: 1.6823 Epoch 134/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8217 - loss: 0.4352 - val_accuracy: 0.5200 - val_loss: 1.6856 Epoch 135/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9017 - loss: 0.3107 - val_accuracy: 0.5200 - val_loss: 1.6879 Epoch 136/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9012 - loss: 0.3038 - val_accuracy: 0.5200 - val_loss: 1.7095 Epoch 137/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9672 - loss: 0.2260 - val_accuracy: 0.4800 - val_loss: 1.7371 Epoch 138/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9747 - loss: 0.2218 - val_accuracy: 0.4400 - val_loss: 1.7528 Epoch 139/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9114 - loss: 0.3196 - val_accuracy: 0.4400 - val_loss: 1.7737 Epoch 140/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9366 - loss: 0.2356 - val_accuracy: 0.4400 - 
val_loss: 1.7818 Epoch 141/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8721 - loss: 0.3389 - val_accuracy: 0.4400 - val_loss: 1.7636 Epoch 142/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9594 - loss: 0.2210 - val_accuracy: 0.4400 - val_loss: 1.7596 Epoch 143/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9466 - loss: 0.2648 - val_accuracy: 0.4000 - val_loss: 1.7570 Epoch 144/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9492 - loss: 0.2704 - val_accuracy: 0.4000 - val_loss: 1.7824 Epoch 145/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 97ms/step - accuracy: 0.8825 - loss: 0.2816 - val_accuracy: 0.4000 - val_loss: 1.7971 Epoch 146/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9073 - loss: 0.3248 - val_accuracy: 0.4000 - val_loss: 1.7860 Epoch 147/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8970 - loss: 0.2725 - val_accuracy: 0.4000 - val_loss: 1.7618 Epoch 148/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8860 - loss: 0.3315 - val_accuracy: 0.4000 - val_loss: 1.7436 Epoch 149/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9604 - loss: 0.2108 - val_accuracy: 0.4400 - val_loss: 1.7253 Epoch 150/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9513 - loss: 0.2220 - val_accuracy: 0.4400 - val_loss: 1.7156 Epoch 151/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9114 - loss: 0.2938 - val_accuracy: 0.4400 - val_loss: 1.6996 Epoch 152/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9395 - loss: 0.2412 - val_accuracy: 0.4000 - val_loss: 1.7315 Epoch 153/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9355 - loss: 0.2394 - val_accuracy: 0.4000 - val_loss: 1.7532 Epoch 154/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9693 - loss: 0.1798 - val_accuracy: 0.4400 - val_loss: 1.7673 Epoch 155/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9436 - loss: 0.2258 - val_accuracy: 0.4000 - val_loss: 1.7779 Epoch 156/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 
33ms/step - accuracy: 0.9569 - loss: 0.2305 - val_accuracy: 0.4400 - val_loss: 1.7794 Epoch 157/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9154 - loss: 0.2359 - val_accuracy: 0.4800 - val_loss: 1.7817 Epoch 158/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9532 - loss: 0.1770 - val_accuracy: 0.4400 - val_loss: 1.7789 Epoch 159/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9596 - loss: 0.1705 - val_accuracy: 0.4400 - val_loss: 1.7752 Epoch 160/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9677 - loss: 0.2104 - val_accuracy: 0.4000 - val_loss: 1.7733 Epoch 161/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9231 - loss: 0.2743 - val_accuracy: 0.4800 - val_loss: 1.7401 Epoch 162/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9191 - loss: 0.2555 - val_accuracy: 0.4800 - val_loss: 1.7156 Epoch 163/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9491 - loss: 0.1929 - val_accuracy: 0.4800 - val_loss: 1.7147 Epoch 164/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9499 - loss: 0.2054 - val_accuracy: 0.5200 - val_loss: 1.7278 Epoch 165/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9037 - loss: 0.2433 - val_accuracy: 0.4800 - val_loss: 1.7229 Epoch 166/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9561 - loss: 0.2309 - val_accuracy: 0.4800 - val_loss: 1.7322 Epoch 167/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9562 - loss: 0.1805 - val_accuracy: 0.4800 - val_loss: 1.7309 Epoch 168/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9108 - loss: 0.2556 - val_accuracy: 0.4800 - val_loss: 1.7441 Epoch 169/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9506 - loss: 0.2279 - val_accuracy: 0.4800 - val_loss: 1.7353 Epoch 170/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9128 - loss: 0.2868 - val_accuracy: 0.4400 - val_loss: 1.7180 Epoch 171/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9803 - loss: 0.2242 - val_accuracy: 
0.4400 - val_loss: 1.6997 Epoch 172/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9495 - loss: 0.2058 - val_accuracy: 0.4400 - val_loss: 1.6965 Epoch 173/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9438 - loss: 0.1777 - val_accuracy: 0.4400 - val_loss: 1.6893 Epoch 174/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9884 - loss: 0.1455 - val_accuracy: 0.4400 - val_loss: 1.7074 Epoch 175/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9292 - loss: 0.2527 - val_accuracy: 0.4400 - val_loss: 1.7490 Epoch 176/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9555 - loss: 0.2185 - val_accuracy: 0.4400 - val_loss: 1.7617 Epoch 177/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 58ms/step - accuracy: 0.9338 - loss: 0.2598 - val_accuracy: 0.4400 - val_loss: 1.7649 Epoch 178/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 43ms/step - accuracy: 0.9188 - loss: 0.2335 - val_accuracy: 0.4800 - val_loss: 1.7374 Epoch 179/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9222 - loss: 0.3004 - val_accuracy: 0.4800 - val_loss: 1.7207 Epoch 180/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9463 - loss: 0.2106 - val_accuracy: 0.5200 - val_loss: 1.7087 Epoch 181/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9651 - loss: 0.1502 - val_accuracy: 0.5200 - val_loss: 1.7053 Epoch 182/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9210 - loss: 0.2235 - val_accuracy: 0.5200 - val_loss: 1.6973 Epoch 183/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9196 - loss: 0.1936 - val_accuracy: 0.5200 - val_loss: 1.6886 Epoch 184/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9409 - loss: 0.2254 - val_accuracy: 0.5200 - val_loss: 1.6655 Epoch 185/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9876 - loss: 0.1707 - val_accuracy: 0.5200 - val_loss: 1.6473 Epoch 186/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9669 - loss: 0.1621 - val_accuracy: 0.5200 - val_loss: 1.6206 Epoch 187/300 5/5 
━━━��━━━━━━━━━━━━━━━━ 0s 52ms/step - accuracy: 0.9511 - loss: 0.1728 - val_accuracy: 0.5200 - val_loss: 1.6003 Epoch 188/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 53ms/step - accuracy: 0.9477 - loss: 0.2683 - val_accuracy: 0.5200 - val_loss: 1.6101 Epoch 189/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9678 - loss: 0.1462 - val_accuracy: 0.4400 - val_loss: 1.6247 Epoch 190/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9532 - loss: 0.1809 - val_accuracy: 0.4400 - val_loss: 1.6527 Epoch 191/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9863 - loss: 0.1381 - val_accuracy: 0.4400 - val_loss: 1.6660 Epoch 192/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9590 - loss: 0.1500 - val_accuracy: 0.4400 - val_loss: 1.6612 Epoch 193/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9479 - loss: 0.2039 - val_accuracy: 0.4800 - val_loss: 1.6578 Epoch 194/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9614 - loss: 0.1535 - val_accuracy: 0.5600 - val_loss: 1.6446 Epoch 195/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9575 - loss: 0.1891 - val_accuracy: 0.5600 - val_loss: 1.6032 Epoch 196/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9718 - loss: 0.1692 - val_accuracy: 0.5200 - val_loss: 1.6021 Epoch 197/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9423 - loss: 0.1646 - val_accuracy: 0.5200 - val_loss: 1.6133 Epoch 198/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9836 - loss: 0.1705 - val_accuracy: 0.5200 - val_loss: 1.6002 Epoch 199/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 53ms/step - accuracy: 0.9479 - loss: 0.2228 - val_accuracy: 0.4800 - val_loss: 1.6044 Epoch 200/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9711 - loss: 0.1249 - val_accuracy: 0.4800 - val_loss: 1.6158 Epoch 201/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9540 - loss: 0.1728 - val_accuracy: 0.4800 - val_loss: 1.6074 Epoch 202/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 46ms/step - accuracy: 0.9933 - loss: 
0.0841 - val_accuracy: 0.4800 - val_loss: 1.5881 Epoch 203/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9726 - loss: 0.1865 - val_accuracy: 0.4800 - val_loss: 1.6038 Epoch 204/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9553 - loss: 0.1842 - val_accuracy: 0.4800 - val_loss: 1.5920 Epoch 205/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9493 - loss: 0.1863 - val_accuracy: 0.4800 - val_loss: 1.5847 Epoch 206/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9402 - loss: 0.1633 - val_accuracy: 0.4800 - val_loss: 1.5903 Epoch 207/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9622 - loss: 0.1467 - val_accuracy: 0.4800 - val_loss: 1.6174 Epoch 208/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 0.9698 - loss: 0.1341 - val_accuracy: 0.4800 - val_loss: 1.6443 Epoch 209/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9504 - loss: 0.1336 - val_accuracy: 0.5200 - val_loss: 1.6680 Epoch 210/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9857 - loss: 0.1032 - val_accuracy: 0.5200 - val_loss: 1.6796 Epoch 211/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9291 - loss: 0.1857 - val_accuracy: 0.5200 - val_loss: 1.6872 Epoch 212/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9590 - loss: 0.1716 - val_accuracy: 0.5200 - val_loss: 1.7035 Epoch 213/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9760 - loss: 0.1141 - val_accuracy: 0.5200 - val_loss: 1.7204 Epoch 214/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9284 - loss: 0.2288 - val_accuracy: 0.5200 - val_loss: 1.7312 Epoch 215/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9527 - loss: 0.1930 - val_accuracy: 0.5200 - val_loss: 1.7652 Epoch 216/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9738 - loss: 0.1549 - val_accuracy: 0.5200 - val_loss: 1.8075 Epoch 217/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9699 - loss: 0.1520 - val_accuracy: 0.5200 - val_loss: 1.8445 Epoch 218/300 
5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9945 - loss: 0.1088 - val_accuracy: 0.4800 - val_loss: 1.8559 Epoch 219/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9535 - loss: 0.1899 - val_accuracy: 0.4800 - val_loss: 1.8550 Epoch 220/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9788 - loss: 0.1247 - val_accuracy: 0.4800 - val_loss: 1.8441 Epoch 221/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 60ms/step - accuracy: 0.9739 - loss: 0.1227 - val_accuracy: 0.4800 - val_loss: 1.8228 Epoch 222/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 53ms/step - accuracy: 0.9835 - loss: 0.1408 - val_accuracy: 0.4800 - val_loss: 1.8083 Epoch 223/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 57ms/step - accuracy: 0.9698 - loss: 0.1506 - val_accuracy: 0.4800 - val_loss: 1.7870 Epoch 224/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9759 - loss: 0.1435 - val_accuracy: 0.4800 - val_loss: 1.7989 Epoch 225/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9678 - loss: 0.1286 - val_accuracy: 0.4800 - val_loss: 1.8306 Epoch 226/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9449 - loss: 0.1851 - val_accuracy: 0.4800 - val_loss: 1.8477 Epoch 227/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9760 - loss: 0.1054 - val_accuracy: 0.4800 - val_loss: 1.8515 Epoch 228/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9747 - loss: 0.1880 - val_accuracy: 0.4800 - val_loss: 1.8326 Epoch 229/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9918 - loss: 0.1185 - val_accuracy: 0.4800 - val_loss: 1.8428 Epoch 230/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9739 - loss: 0.1384 - val_accuracy: 0.4800 - val_loss: 1.8525 Epoch 231/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9664 - loss: 0.1139 - val_accuracy: 0.4800 - val_loss: 1.8439 Epoch 232/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 43ms/step - accuracy: 0.9753 - loss: 0.1252 - val_accuracy: 0.4800 - val_loss: 1.8617 Epoch 233/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 46ms/step - accuracy: 0.9256 - 
loss: 0.1978 - val_accuracy: 0.4800 - val_loss: 1.8797 Epoch 234/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9554 - loss: 0.1903 - val_accuracy: 0.4800 - val_loss: 1.8780 Epoch 235/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 54ms/step - accuracy: 0.9574 - loss: 0.2129 - val_accuracy: 0.4800 - val_loss: 1.8709 Epoch 236/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9325 - loss: 0.1728 - val_accuracy: 0.4800 - val_loss: 1.8570 Epoch 237/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9458 - loss: 0.1701 - val_accuracy: 0.4800 - val_loss: 1.9068 Epoch 238/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 0.9463 - loss: 0.1629 - val_accuracy: 0.5200 - val_loss: 1.9738 Epoch 239/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9711 - loss: 0.1547 - val_accuracy: 0.5200 - val_loss: 2.0086 Epoch 240/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 78ms/step - accuracy: 0.9739 - loss: 0.0884 - val_accuracy: 0.5200 - val_loss: 2.0243 Epoch 241/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 51ms/step - accuracy: 0.9307 - loss: 0.1824 - val_accuracy: 0.5200 - val_loss: 2.0102 Epoch 242/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9884 - loss: 0.1170 - val_accuracy: 0.5200 - val_loss: 1.9741 Epoch 243/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9635 - loss: 0.1071 - val_accuracy: 0.5200 - val_loss: 1.9609 Epoch 244/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9629 - loss: 0.1138 - val_accuracy: 0.5200 - val_loss: 1.9346 Epoch 245/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9863 - loss: 0.1080 - val_accuracy: 0.4800 - val_loss: 1.9111 Epoch 246/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9196 - loss: 0.1743 - val_accuracy: 0.4800 - val_loss: 1.8706 Epoch 247/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9381 - loss: 0.2220 - val_accuracy: 0.4800 - val_loss: 1.8619 Epoch 248/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9863 - loss: 0.0968 - val_accuracy: 0.4800 - val_loss: 1.9104 Epoch 
249/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9876 - loss: 0.0914 - val_accuracy: 0.4800 - val_loss: 1.9224 Epoch 250/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9344 - loss: 0.1362 - val_accuracy: 0.4800 - val_loss: 1.9127 Epoch 251/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9842 - loss: 0.0840 - val_accuracy: 0.4800 - val_loss: 1.8962 Epoch 252/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 46ms/step - accuracy: 0.9594 - loss: 0.1806 - val_accuracy: 0.4800 - val_loss: 1.9164 Epoch 253/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9710 - loss: 0.1100 - val_accuracy: 0.4800 - val_loss: 1.9299 Epoch 254/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 1.0000 - loss: 0.0979 - val_accuracy: 0.4800 - val_loss: 1.9448 Epoch 255/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9945 - loss: 0.1164 - val_accuracy: 0.4800 - val_loss: 1.9665 Epoch 256/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9637 - loss: 0.1083 - val_accuracy: 0.4400 - val_loss: 2.0186 Epoch 257/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 0.9602 - loss: 0.1259 - val_accuracy: 0.4400 - val_loss: 2.0721 Epoch 258/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 55ms/step - accuracy: 0.9822 - loss: 0.1219 - val_accuracy: 0.4400 - val_loss: 2.1088 Epoch 259/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 43ms/step - accuracy: 0.9945 - loss: 0.0757 - val_accuracy: 0.4400 - val_loss: 2.1283 Epoch 260/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9340 - loss: 0.1783 - val_accuracy: 0.4400 - val_loss: 2.1650 Epoch 261/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 66ms/step - accuracy: 0.9554 - loss: 0.1607 - val_accuracy: 0.4400 - val_loss: 2.2219 Epoch 262/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 64ms/step - accuracy: 0.9705 - loss: 0.1281 - val_accuracy: 0.4800 - val_loss: 2.2508 Epoch 263/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9706 - loss: 0.1552 - val_accuracy: 0.4400 - val_loss: 2.2681 Epoch 264/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 
0.9747 - loss: 0.1009 - val_accuracy: 0.4400 - val_loss: 2.2611 Epoch 265/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9644 - loss: 0.1111 - val_accuracy: 0.4400 - val_loss: 2.2453 Epoch 266/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9491 - loss: 0.1557 - val_accuracy: 0.4400 - val_loss: 2.2341 Epoch 267/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9615 - loss: 0.1102 - val_accuracy: 0.4400 - val_loss: 2.2281 Epoch 268/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9842 - loss: 0.0795 - val_accuracy: 0.4400 - val_loss: 2.2207 Epoch 269/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9292 - loss: 0.1738 - val_accuracy: 0.4000 - val_loss: 2.2206 Epoch 270/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 0.9713 - loss: 0.1195 - val_accuracy: 0.4000 - val_loss: 2.2123 Epoch 271/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9945 - loss: 0.0831 - val_accuracy: 0.4000 - val_loss: 2.2044 Epoch 272/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 1.0000 - loss: 0.0836 - val_accuracy: 0.4000 - val_loss: 2.1910 Epoch 273/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 55ms/step - accuracy: 0.9918 - loss: 0.0675 - val_accuracy: 0.4000 - val_loss: 2.2084 Epoch 274/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9726 - loss: 0.0856 - val_accuracy: 0.4400 - val_loss: 2.2216 Epoch 275/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9863 - loss: 0.0882 - val_accuracy: 0.4400 - val_loss: 2.2113 Epoch 276/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9918 - loss: 0.0969 - val_accuracy: 0.4400 - val_loss: 2.2084 Epoch 277/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9918 - loss: 0.0748 - val_accuracy: 0.4400 - val_loss: 2.1908 Epoch 278/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 55ms/step - accuracy: 0.9752 - loss: 0.0963 - val_accuracy: 0.4400 - val_loss: 2.1743 Epoch 279/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 0.9657 - loss: 0.1162 - val_accuracy: 0.4400 - val_loss: 2.1678 
Epoch 280/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9793 - loss: 0.1745 - val_accuracy: 0.4400 - val_loss: 2.1788
...
Epoch 299/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9863 - loss: 0.0953 - val_accuracy: 0.5200 - val_loss: 2.2400
Epoch 300/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - accuracy: 0.9822 - loss: 0.1217 - val_accuracy: 0.5200 - val_loss: 2.2763
16/16 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.6417 - loss: 1.2759
WARNING:absl:You are saving your model as an HDF5 file via `model.save()` or `keras.saving.save_model(model)`. This file format is considered legacy. We recommend using instead the native Keras format, e.g. `model.save('my_model.keras')` or `keras.saving.save_model(model, 'my_model.keras')`.
Results - random forest
Accuracy = 0.548, MAE = 0.955, Chance = 0.167

Results - ANN
Test score: 1.1399775743484497
Test accuracy: 0.7419354915618896
Fantastic, now we can run and further test our machine learning analyses
in a dedicated computing environment
that is isolated and shareable!
Now that a large portion of the software dependencies
are addressed, we can continue with the next part: algorithms/practices/processes!
algorithms/practices/processes: dealing with randomness¶
Randomness is basically everywhere in the world of machine learning and, left uncontrolled, leads to unreproducible results. So what can we do?
Simply put, we need to seed the randomness in our machine learning analyses!
Most software has the option to set the seed for randomness, one way or another. Unfortunately, how the seed can be set varies between software packages, and the seed often needs to be set at several different instances.
For example, in our machine learning analyses using a random forest, we didn't seed the randomness so far, resulting in different outcomes at every run, because e.g. the random forest's internal randomness (bootstrap samples, feature subsets) and, once shuffling is enabled, the training and test set splits would vary:
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

cv = StratifiedKFold()
pipe = make_pipeline(
    StandardScaler(),
    RandomForestClassifier()
)
for i in range(10):
    acc = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=cv)
    mae = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=cv,
                          scoring='neg_mean_absolute_error')
    print('Accuracy run {} = {}, MAE run {} = {}, Chance run {} = {}'.format(
        i, np.round(np.mean(acc), 3),
        i, np.round(np.mean(-mae), 3),
        i, np.round(1/len(labels.unique()), 3)))
Accuracy run 0 = 0.535, MAE run 0 = 1.032, Chance run 0 = 0.167
Accuracy run 1 = 0.568, MAE run 1 = 1.142, Chance run 1 = 0.167
Accuracy run 2 = 0.51, MAE run 2 = 1.032, Chance run 2 = 0.167
Accuracy run 3 = 0.535, MAE run 3 = 1.097, Chance run 3 = 0.167
Accuracy run 4 = 0.49, MAE run 4 = 1.026, Chance run 4 = 0.167
Accuracy run 5 = 0.535, MAE run 5 = 0.961, Chance run 5 = 0.167
Accuracy run 6 = 0.529, MAE run 6 = 1.129, Chance run 6 = 0.167
Accuracy run 7 = 0.568, MAE run 7 = 1.09, Chance run 7 = 0.167
Accuracy run 8 = 0.581, MAE run 8 = 1.11, Chance run 8 = 0.167
Accuracy run 9 = 0.568, MAE run 9 = 1.006, Chance run 9 = 0.167
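The variability above comes from the unseeded RandomForestClassifier itself: each run draws different bootstrap samples and feature subsets. A minimal sketch on synthetic data (hypothetical, generated with make_classification, not the connectome features) makes this seed dependence explicit:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# synthetic stand-in for the connectome features (hypothetical data)
X, y = make_classification(n_samples=100, n_features=10, random_state=0)

# two forests with different seeds draw different bootstrap samples and
# feature subsets, so the fitted ensembles (e.g. their importances) differ
rf1 = RandomForestClassifier(n_estimators=10, random_state=1).fit(X, y)
rf2 = RandomForestClassifier(n_estimators=10, random_state=2).fit(X, y)
assert not np.allclose(rf1.feature_importances_, rf2.feature_importances_)
```

With the same random_state, by contrast, refitting reproduces the forest exactly.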
Using the random_state
argument
in all functions
/classes
that deal with randomness
to set the seed
results in reproducible
outcomes at every run:
cv = StratifiedKFold(random_state=42, shuffle=True)
pipe = make_pipeline(
    StandardScaler(),
    RandomForestClassifier(random_state=42)
)
for i in range(10):
    acc = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=cv)
    mae = cross_val_score(pipe, data, pd.Categorical(labels).codes, cv=cv,
                          scoring='neg_mean_absolute_error')
    print('Accuracy run {} = {}, MAE run {} = {}, Chance run {} = {}'.format(
        i, np.round(np.mean(acc), 3),
        i, np.round(np.mean(-mae), 3),
        i, np.round(1/len(labels.unique()), 3)))
Accuracy run 0 = 0.548, MAE run 0 = 0.955, Chance run 0 = 0.167
Accuracy run 1 = 0.548, MAE run 1 = 0.955, Chance run 1 = 0.167
Accuracy run 2 = 0.548, MAE run 2 = 0.955, Chance run 2 = 0.167
Accuracy run 3 = 0.548, MAE run 3 = 0.955, Chance run 3 = 0.167
Accuracy run 4 = 0.548, MAE run 4 = 0.955, Chance run 4 = 0.167
Accuracy run 5 = 0.548, MAE run 5 = 0.955, Chance run 5 = 0.167
Accuracy run 6 = 0.548, MAE run 6 = 0.955, Chance run 6 = 0.167
Accuracy run 7 = 0.548, MAE run 7 = 0.955, Chance run 7 = 0.167
Accuracy run 8 = 0.548, MAE run 8 = 0.955, Chance run 8 = 0.167
Accuracy run 9 = 0.548, MAE run 9 = 0.955, Chance run 9 = 0.167
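The same determinism can be checked end-to-end on a small synthetic problem (hypothetical data generated with make_classification, not the connectome features): repeated runs of the fully seeded pipeline return identical fold-wise scores.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# synthetic stand-in for features & age groups (hypothetical data)
X, y = make_classification(n_samples=120, n_features=20, n_informative=5,
                           n_classes=3, random_state=0)

def run_cv(seed):
    """Run the seeded pipeline once and return the per-fold accuracies."""
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    pipe = make_pipeline(StandardScaler(),
                         RandomForestClassifier(random_state=seed))
    return cross_val_score(pipe, X, y, cv=cv)

# identical seeds -> identical fold-wise scores on every repetition
assert np.allclose(run_cv(42), run_cv(42))
```

Note that the seed goes into both the splitter and the estimator: seeding only one of them would still leave a source of run-to-run variation.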
Regarding our ANN
, we would need to set the random_state
argument
of the train_test_split
function
to create training
and test sets
in a reproducible
manner:
for i in range(100):
    X_train, X_test, y_train, y_test = train_test_split(data, pd.Categorical(labels).codes,
                                                        test_size=0.2, shuffle=True)
    pd.Series(y_train).plot(kind='hist', stacked=True)

for i in range(100):
    X_train, X_test, y_train, y_test = train_test_split(data, pd.Categorical(labels).codes,
                                                        test_size=0.2, shuffle=True,
                                                        random_state=42)
    pd.Series(y_train).plot(kind='hist', stacked=True)
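Instead of eyeballing histograms, the split determinism can also be verified directly on toy arrays (hypothetical data, standing in for the features and age groups):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# toy features and balanced labels (hypothetical data)
X = np.arange(100).reshape(50, 2)
y = np.repeat(np.arange(5), 10)

# two calls with the same random_state yield identical splits
Xa, _, ya, _ = train_test_split(X, y, test_size=0.2, shuffle=True, random_state=42)
Xb, _, yb, _ = train_test_split(X, y, test_size=0.2, shuffle=True, random_state=42)
assert np.array_equal(Xa, Xb) and np.array_equal(ya, yb)
```

Without random_state, each call would generally shuffle the samples differently, so the training set (and everything downstream) would change from run to run.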
Additionally, we would need to set the seed
across various parts of our computational environment
. Without that, we get different results at every run:
for i in range(10):
    model = keras.Sequential()
    model.add(layers.Input(shape=data[1].shape))
    model.add(layers.Dense(100, activation="relu", kernel_initializer='he_normal', bias_initializer='zeros'))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(50, activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(25, activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(len(labels.unique()), activation='softmax'))
    model.compile(loss='sparse_categorical_crossentropy',
                  optimizer='adam',
                  metrics=['accuracy'])
    fit = model.fit(X_train, y_train, epochs=30, batch_size=20, validation_split=0.2, verbose=0)
    score, acc = model.evaluate(X_test, y_test,
                                batch_size=2, verbose=0)
    print('Test score run {}: {} , Test accuracy run {}: {}'.format(i, score, i, acc))
Test score run 0: 1.2065298557281494 , Test accuracy run 0: 0.6129032373428345
Test score run 1: 1.299221158027649 , Test accuracy run 1: 0.5161290168762207
Test score run 2: 1.126314640045166 , Test accuracy run 2: 0.6129032373428345
Test score run 3: 1.611088514328003 , Test accuracy run 3: 0.3870967626571655
Test score run 4: 1.2121350765228271 , Test accuracy run 4: 0.6129032373428345
Test score run 5: 1.1756813526153564 , Test accuracy run 5: 0.5806451439857483
Test score run 6: 1.3319497108459473 , Test accuracy run 6: 0.35483869910240173
Test score run 7: 1.2700525522232056 , Test accuracy run 7: 0.6129032373428345
Test score run 8: 1.2562886476516724 , Test accuracy run 8: 0.6774193644523621
Test score run 9: 1.2847849130630493 , Test accuracy run 9: 0.6129032373428345
After setting the seeds, for example the general Python and NumPy random seeds
and the tensorflow seed
, we get the same output at every run:
import os, random

for i in range(10):
    # re-set all seeds at the beginning of every run
    os.environ['PYTHONHASHSEED'] = str(42)
    random.seed(42)
    np.random.seed(42)
    tf.random.set_seed(42)
    model = keras.Sequential()
    model.add(layers.Input(shape=data[1].shape))
    model.add(layers.Dense(100, activation="relu", kernel_initializer='he_normal', bias_initializer='zeros'))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(50, activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(25, activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(len(labels.unique()), activation='softmax'))
    model.compile(loss='sparse_categorical_crossentropy',
                  optimizer='adam',
                  metrics=['accuracy'])
    fit = model.fit(X_train, y_train, epochs=30, batch_size=20, validation_split=0.2, verbose=0)
    score, acc = model.evaluate(X_test, y_test, batch_size=2, verbose=0)
    print('Test score run {}: {} , Test accuracy run {}: {}'.format(i, score, i, acc))
Test score run 0: 1.2624965906143188 , Test accuracy run 0: 0.5806451439857483
Test score run 1: 1.2624965906143188 , Test accuracy run 1: 0.5806451439857483
Test score run 2: 1.2624965906143188 , Test accuracy run 2: 0.5806451439857483
Test score run 3: 1.2624965906143188 , Test accuracy run 3: 0.5806451439857483
Test score run 4: 1.2624965906143188 , Test accuracy run 4: 0.5806451439857483
Test score run 5: 1.2624965906143188 , Test accuracy run 5: 0.5806451439857483
Test score run 6: 1.2624965906143188 , Test accuracy run 6: 0.5806451439857483
Test score run 7: 1.2624965906143188 , Test accuracy run 7: 0.5806451439857483
Test score run 8: 1.2624965906143188 , Test accuracy run 8: 0.5806451439857483
Test score run 9: 1.2624965906143188 , Test accuracy run 9: 0.5806451439857483
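Since these seed calls tend to pile up, one common pattern (a hypothetical helper, not from the original notebook) is to bundle them into a single function that is invoked at the start of every run. The sketch below is restricted to the Python/NumPy seeds; for the ANN above you would additionally call tf.random.set_seed(seed) inside it.

```python
import os
import random
import numpy as np

def set_seeds(seed=42):
    """Re-seed the common sources of randomness before a run.

    Minimal, framework-agnostic sketch; extend it with
    tf.random.set_seed(seed) (or the equivalent of your framework)
    when training neural networks.
    """
    os.environ['PYTHONHASHSEED'] = str(seed)
    random.seed(seed)
    np.random.seed(seed)

# re-seeding makes repeated draws identical
set_seeds(42)
a = np.random.rand(3)
set_seeds(42)
b = np.random.rand(3)
assert np.array_equal(a, b)
```

Calling set_seeds at the top of each loop iteration mirrors what the cell above does inline.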
Please note that this appears to be sufficient for our minimal example, but more realistic ANNs would potentially require way more steps to achieve reproducibility, including:
restricting inter/intra op parallelism:
from tensorflow.python.eager import context
context._context = None
context._create_context()
tf.config.threading.set_inter_op_parallelism_threads(1)
tf.config.threading.set_intra_op_parallelism_threads(1)
enforcing determinism via os environment variables and/or utilizing packages, like framework determinism; starting with tensorflow version 2.8, you can set it directly:
# os.environ['TF_DETERMINISTIC_OPS'] = '1'
# from tfdeterminism import patch
# patch()
tf.config.experimental.enable_op_determinism()
The respective solution will depend on your computational environment
, model architecture
, data
, etc. and might sometimes not even exist (at least within a reasonable implementation).
This ongoing concern itself became a new field of study, with fascinating and very helpful outputs, e.g.
Bouthillier et al. (2021): Accounting for variability in machine learning benchmarks
adapted from [Martina Vilas](https://doi.org/10.5281/zenodo.4740053)
data: standardization (BIDS), tracking everything (git/github & datalad), sharing¶
Data is crucial & thus everything adjacent to it is crucial as well:
standardization of data & models to facilitate management, QC and FAIR-ness
version control
sharing of data & models to maximize reproducibility & FAIR-ness, as well as to reduce computational effort
data & models¶
Standardizing data & models, as well as their description, is tremendously helpful & important concerning reproducibility and other aspects, independent of the data at hand. Big data, however, profits even more from this process, as the combination of human/machine readability and meta-data allows one to efficiently handle data management, evaluation, QC, augmentation, etc. One of the best options to tackle this is of course the Brain Imaging Data Structure or BIDS:
version control¶
Machine learning analyses tend to be very complex and thus a lot can happen along the way: analysis code, as well as data input and output, might change frequently. Without keeping track of these changes, reproducibility is basically not possible. One commonly known tool to address this is version control, which itself can take different forms, depending on the precise data at hand.
Placing everything one does under version control
via these tools
& resources
can help a great deal to achieve reproducibility
, as the respective aspects, e.g. code
& data
, are constantly monitored
and changes tracked
.
For example, starting with our machine learning analyses python script
, what if we apply certain changes and the outcomes vary as a result? Using version control
, it's "easy" to evaluate what happened and restore previous versions!
We actually adapted our python script
so that the analyses
are more reproducible
. With git
, we can see that:
While code version control using git and GitHub is fortunately more and more common, version control of other forms of data, e.g. datasets, is not.
For example, what happens when we run multiple machine learning analyses
on the same dataset
that yield different outcomes? How do we keep track of that?
One up-and-coming tool that allows one to "easily" track changes in all kinds of data is DataLad.
With it, one can create version-controlled datasets of basically any form and monitor every change that appears along the way, including inputs and outputs, files in general, utilized software, etc.
Applied to our machine learning analyses
, this could look as follows:
We create a new dataset
:
%%bash
datalad create repronim_ml_showcase
create(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase (dataset)
And download the data, installing it as a dataset:
%%bash
cd repronim_ml_showcase
datalad download-url \
--archive \
--message "Download dataset" \
'https://www.dropbox.com/s/v48f8pjfw4u2bxi/MAIN_BASC064_subsamp_features.npz?dl=1'
[INFO] Downloading 'https://www.dropbox.com/s/v48f8pjfw4u2bxi/MAIN_BASC064_subsamp_features.npz?dl=1' into '/Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/' [INFO] Adding content of the archive /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/MAIN_BASC064_subsamp_features.npz into annex AnnexRepo(/Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase) [INFO] Initializing special remote datalad-archives [INFO] Extracting archive [INFO] Finished adding /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/MAIN_BASC064_subsamp_features.npz: Files processed: 1, +annex: 1 [INFO] Finished extraction
download_url(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/MAIN_BASC064_subsamp_features.npz (file) add(ok): MAIN_BASC064_subsamp_features.npz (file) save(ok): . (dataset) add-archive-content(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase (dataset) action summary: add (ok: 1) add-archive-content (ok: 1) download_url (ok: 1) save (ok: 1)
%%bash
cd repronim_ml_showcase
datalad download-url \
--message "Download labels" \
'https://www.dropbox.com/s/ofsqdcukyde4lke/participants.csv?dl=1'
[INFO] Downloading 'https://www.dropbox.com/s/ofsqdcukyde4lke/participants.csv?dl=1' into '/Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/'
download_url(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/participants.csv (file) add(ok): participants.csv (file) save(ok): . (dataset) action summary: add (ok: 1) download_url (ok: 1) save (ok: 1)
We then can use DataLad
's YODA principles to create a directory structure for our project, including folders for data
and code
:
%%bash
cd repronim_ml_showcase
datalad create -c text2git -c yoda ml_showcase
[INFO] Running procedure cfg_text2git [INFO] == Command start (output follows) ===== [INFO] == Command exit (modification check follows) ===== [INFO] Running procedure cfg_yoda [INFO] == Command start (output follows) ===== [INFO] == Command exit (modification check follows) =====
run(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase (dataset) [/Users/peerherholz/anaconda3/envs/repron...] run(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase (dataset) [/Users/peerherholz/anaconda3/envs/repron...] create(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase (dataset) action summary: create (ok: 1) run (ok: 2)
and subsequently clone the installed datasets into the data folder, indicating that this is our raw data:
%%bash
cd repronim_ml_showcase/ml_showcase
mkdir -p data
datalad clone -d . ../ data/raw
[INFO] Attempting a clone into /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase/data/raw [INFO] Attempting to clone from ../ to /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase/data/raw [INFO] Completed clone attempts for Dataset(/Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase/data/raw)
install(ok): data/raw (dataset) add(ok): data/raw (dataset) add(ok): .gitmodules (file) save(ok): . (dataset) add(ok): .gitmodules (file) save(ok): . (dataset) action summary: add (ok: 3) install (ok: 1) save (ok: 2)
To make things as reproducible as possible, we also add our software container from above to the dataset, so that the computational environment and its application are also version controlled:
%%bash
cd repronim_ml_showcase/ml_showcase
datalad containers-add repronim-ml-container --url dhub://peerherholz/repronim_ml:0.2
0.2: Pulling from peerherholz/repronim_ml Digest: sha256:6fc095cd8c0904383bdc9d76610d6c366010fa7f7db27a1ea14b3babe10912f2 Status: Image is up to date for peerherholz/repronim_ml:0.2 docker.io/peerherholz/repronim_ml:0.2
[INFO] Saved peerherholz/repronim_ml:0.2 to /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase/.datalad/environments/repronim-ml-container/image
[add(ok) lines for the 19 container image files (layer blobs, index.json, manifest.json, oci-layout, repositories, .datalad/config) truncated]
save(ok): . (dataset) containers_add(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase/.datalad/environments/repronim-ml-container/image (file) action summary: add (ok: 19) containers_add (ok: 1) save (ok: 1)
The only thing that's missing is the python script
that performs the machine learning analyses
, which can simply be added to the dataset
as well.
%%bash
cd repronim_ml_showcase/ml_showcase/
cp ../../code/ml_reproducibility.py code/
datalad save -m "add random forest & ANN script" code/ml_reproducibility.py
add(ok): code/ml_reproducibility.py (file) save(ok): . (dataset) action summary: add (ok: 1) save (ok: 1)
With that, we already have everything in place to run a (fully) reproducible machine learning analysis that tracks the datasets, the computational environment and the code, and also their interaction, via monitoring the analyses' inputs and outputs:
%%bash
cd repronim_ml_showcase/ml_showcase/
datalad containers-run -n repronim-ml-container \
-m "First run of ML analyses" \
--input 'data/raw/' \
--output 'metrics.json' \
--output 'random_forest.joblib' \
--output 'ANN.h5' \
"code/ml_reproducibility.py"
[INFO] Making sure inputs are available (this may take some time) [INFO] == Command start (output follows) ===== whoami: cannot find name for user ID 501 2025-06-17 10:05:22.821481: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used. 2025-06-17 10:05:23.076039: I external/local_tsl/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used. 2025-06-17 10:05:23.323356: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:479] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered 2025-06-17 10:05:23.517340: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:10575] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered 2025-06-17 10:05:23.520665: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1442] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered 2025-06-17 10:05:23.866079: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. 2025-06-17 10:05:27.561405: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT mkdir -p failed for path /.config/matplotlib: [Errno 13] Permission denied: '/.config' Matplotlib created a temporary cache directory at /tmp/matplotlib-halwfd9k because there was an issue with the default path (/.config/matplotlib); it is highly recommended to set the MPLCONFIGDIR environment variable to a writable directory, in particular to speed up the import of Matplotlib and to better support multiprocessing. 
/opt/miniconda-latest/envs/repronim_ml/lib/python3.11/site-packages/keras/src/layers/core/dense.py:93: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead. super().__init__(activity_regularizer=activity_regularizer, **kwargs)
Epoch 1/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 4s 135ms/step - accuracy: 0.1682 - loss: 3.1353 - val_accuracy: 0.1600 - val_loss: 1.7578
Epoch 2/300 5/5 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.2005 - loss: 2.9439 - val_accuracy: 0.2000 - val_loss: 1.7421
[per-epoch Keras training log truncated; training accuracy rises from ~0.17 toward ~0.80 and val_loss falls from ~1.76 toward ~1.41 through Epoch 78/300]
Epoch 79/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.7984 - loss: 0.6246 - val_accuracy: 0.4400 - val_loss: 1.3882 Epoch 80/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8045 - loss: 0.6031 - val_accuracy: 0.4400 - val_loss: 1.3914 Epoch 81/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.7959 - loss: 0.5813 - val_accuracy: 0.4400 - val_loss: 1.3988 Epoch 82/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8330 - loss: 0.5760 - val_accuracy: 0.4400 - val_loss: 1.4185 Epoch 83/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8401 - loss: 0.5935 - val_accuracy: 0.4000 - val_loss: 1.4369 Epoch 84/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8221 - loss: 0.4579 - val_accuracy: 0.4000 - val_loss: 1.4545 Epoch 85/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.8813 - loss: 0.4539 - val_accuracy: 0.4800 - val_loss: 1.4667 Epoch 86/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8930 - loss: 0.3409 - val_accuracy: 0.5200 - val_loss: 1.4703 Epoch 87/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.8191 - loss: 0.5205 - val_accuracy: 0.5600 - val_loss: 1.4694 Epoch 88/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8657 - loss: 0.4383 - val_accuracy: 0.5600 - val_loss: 1.4628 Epoch 89/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8451 - loss: 0.5128 - val_accuracy: 0.5200 - val_loss: 1.4436 Epoch 90/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 61ms/step - accuracy: 0.8497 - loss: 0.4191 - val_accuracy: 0.4800 - val_loss: 1.4380 Epoch 91/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 50ms/step - accuracy: 0.7938 - loss: 0.5034 - val_accuracy: 0.5200 - val_loss: 1.4310 Epoch 92/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.8123 - loss: 0.5140 - val_accuracy: 0.5600 - val_loss: 1.4223 Epoch 93/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.8127 - loss: 0.6025 - val_accuracy: 0.5200 - val_loss: 1.4278 Epoch 94/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8427 - 
loss: 0.3996 - val_accuracy: 0.5200 - val_loss: 1.4311 Epoch 95/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.8847 - loss: 0.4000 - val_accuracy: 0.5600 - val_loss: 1.4091 Epoch 96/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9227 - loss: 0.3516 - val_accuracy: 0.5200 - val_loss: 1.3729 Epoch 97/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8636 - loss: 0.4561 - val_accuracy: 0.5600 - val_loss: 1.3430 Epoch 98/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8509 - loss: 0.4467 - val_accuracy: 0.5600 - val_loss: 1.3440 Epoch 99/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8350 - loss: 0.4824 - val_accuracy: 0.5600 - val_loss: 1.3528 Epoch 100/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9037 - loss: 0.3843 - val_accuracy: 0.5600 - val_loss: 1.3508 Epoch 101/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8866 - loss: 0.3577 - val_accuracy: 0.5600 - val_loss: 1.3487 Epoch 102/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9031 - loss: 0.3693 - val_accuracy: 0.5200 - val_loss: 1.3680 Epoch 103/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8684 - loss: 0.4625 - val_accuracy: 0.5200 - val_loss: 1.3880 Epoch 104/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9286 - loss: 0.3267 - val_accuracy: 0.4800 - val_loss: 1.3967 Epoch 105/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8628 - loss: 0.4724 - val_accuracy: 0.4400 - val_loss: 1.4042 Epoch 106/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.7921 - loss: 0.5085 - val_accuracy: 0.5200 - val_loss: 1.3996 Epoch 107/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9617 - loss: 0.3162 - val_accuracy: 0.5200 - val_loss: 1.3962 Epoch 108/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8653 - loss: 0.3951 - val_accuracy: 0.5200 - val_loss: 1.3851 Epoch 109/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9519 - loss: 0.3160 - val_accuracy: 0.5200 - val_loss: 1.3953 Epoch 110/300 
5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9395 - loss: 0.2870 - val_accuracy: 0.5200 - val_loss: 1.4044 Epoch 111/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.8633 - loss: 0.4702 - val_accuracy: 0.5200 - val_loss: 1.4095 Epoch 112/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9076 - loss: 0.3220 - val_accuracy: 0.5200 - val_loss: 1.4114 Epoch 113/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9416 - loss: 0.3107 - val_accuracy: 0.5200 - val_loss: 1.4087 Epoch 114/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9121 - loss: 0.3656 - val_accuracy: 0.5200 - val_loss: 1.4265 Epoch 115/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9292 - loss: 0.3068 - val_accuracy: 0.5600 - val_loss: 1.4402 Epoch 116/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9109 - loss: 0.2903 - val_accuracy: 0.5600 - val_loss: 1.4421 Epoch 117/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9473 - loss: 0.2135 - val_accuracy: 0.5600 - val_loss: 1.4479 Epoch 118/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9432 - loss: 0.2962 - val_accuracy: 0.5600 - val_loss: 1.4484 Epoch 119/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9199 - loss: 0.2740 - val_accuracy: 0.5600 - val_loss: 1.4531 Epoch 120/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.8376 - loss: 0.4256 - val_accuracy: 0.5600 - val_loss: 1.4615 Epoch 121/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8548 - loss: 0.3672 - val_accuracy: 0.5600 - val_loss: 1.4791 Epoch 122/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9335 - loss: 0.2492 - val_accuracy: 0.5600 - val_loss: 1.5080 Epoch 123/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9125 - loss: 0.3021 - val_accuracy: 0.5600 - val_loss: 1.5161 Epoch 124/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9272 - loss: 0.3089 - val_accuracy: 0.5600 - val_loss: 1.4875 Epoch 125/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8879 - 
loss: 0.3350 - val_accuracy: 0.5600 - val_loss: 1.4810 Epoch 126/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9453 - loss: 0.2478 - val_accuracy: 0.6000 - val_loss: 1.5065 Epoch 127/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9067 - loss: 0.3434 - val_accuracy: 0.5600 - val_loss: 1.5503 Epoch 128/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9741 - loss: 0.2326 - val_accuracy: 0.5600 - val_loss: 1.5893 Epoch 129/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9263 - loss: 0.2967 - val_accuracy: 0.5600 - val_loss: 1.6183 Epoch 130/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8853 - loss: 0.3330 - val_accuracy: 0.5600 - val_loss: 1.6401 Epoch 131/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9230 - loss: 0.2707 - val_accuracy: 0.5600 - val_loss: 1.6607 Epoch 132/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9590 - loss: 0.2739 - val_accuracy: 0.5600 - val_loss: 1.6751 Epoch 133/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.8976 - loss: 0.2240 - val_accuracy: 0.5200 - val_loss: 1.6823 Epoch 134/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8217 - loss: 0.4352 - val_accuracy: 0.5200 - val_loss: 1.6856 Epoch 135/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 71ms/step - accuracy: 0.9017 - loss: 0.3107 - val_accuracy: 0.5200 - val_loss: 1.6879 Epoch 136/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9012 - loss: 0.3038 - val_accuracy: 0.5200 - val_loss: 1.7095 Epoch 137/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9672 - loss: 0.2260 - val_accuracy: 0.4800 - val_loss: 1.7371 Epoch 138/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9747 - loss: 0.2218 - val_accuracy: 0.4400 - val_loss: 1.7528 Epoch 139/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9114 - loss: 0.3196 - val_accuracy: 0.4400 - val_loss: 1.7737 Epoch 140/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9366 - loss: 0.2356 - val_accuracy: 0.4400 - val_loss: 1.7818 Epoch 
141/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.8721 - loss: 0.3389 - val_accuracy: 0.4400 - val_loss: 1.7636 Epoch 142/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9594 - loss: 0.2210 - val_accuracy: 0.4400 - val_loss: 1.7596 Epoch 143/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9466 - loss: 0.2648 - val_accuracy: 0.4000 - val_loss: 1.7570 Epoch 144/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9492 - loss: 0.2704 - val_accuracy: 0.4000 - val_loss: 1.7824 Epoch 145/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.8825 - loss: 0.2816 - val_accuracy: 0.4000 - val_loss: 1.7971 Epoch 146/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9073 - loss: 0.3248 - val_accuracy: 0.4000 - val_loss: 1.7860 Epoch 147/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.8970 - loss: 0.2725 - val_accuracy: 0.4000 - val_loss: 1.7618 Epoch 148/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.8860 - loss: 0.3315 - val_accuracy: 0.4000 - val_loss: 1.7436 Epoch 149/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9604 - loss: 0.2108 - val_accuracy: 0.4400 - val_loss: 1.7253 Epoch 150/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9513 - loss: 0.2220 - val_accuracy: 0.4400 - val_loss: 1.7156 Epoch 151/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9114 - loss: 0.2938 - val_accuracy: 0.4400 - val_loss: 1.6996 Epoch 152/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9395 - loss: 0.2412 - val_accuracy: 0.4000 - val_loss: 1.7315 Epoch 153/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9355 - loss: 0.2394 - val_accuracy: 0.4000 - val_loss: 1.7532 Epoch 154/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9693 - loss: 0.1798 - val_accuracy: 0.4400 - val_loss: 1.7673 Epoch 155/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9436 - loss: 0.2258 - val_accuracy: 0.4000 - val_loss: 1.7779 Epoch 156/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 
0.9569 - loss: 0.2305 - val_accuracy: 0.4400 - val_loss: 1.7794 Epoch 157/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9154 - loss: 0.2359 - val_accuracy: 0.4800 - val_loss: 1.7817 Epoch 158/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9532 - loss: 0.1770 - val_accuracy: 0.4400 - val_loss: 1.7789 Epoch 159/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9596 - loss: 0.1705 - val_accuracy: 0.4400 - val_loss: 1.7752 Epoch 160/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9677 - loss: 0.2104 - val_accuracy: 0.4000 - val_loss: 1.7733 Epoch 161/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9231 - loss: 0.2743 - val_accuracy: 0.4800 - val_loss: 1.7401 Epoch 162/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9191 - loss: 0.2555 - val_accuracy: 0.4800 - val_loss: 1.7156 Epoch 163/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9491 - loss: 0.1929 - val_accuracy: 0.4800 - val_loss: 1.7147 Epoch 164/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9499 - loss: 0.2054 - val_accuracy: 0.5200 - val_loss: 1.7278 Epoch 165/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 43ms/step - accuracy: 0.9037 - loss: 0.2433 - val_accuracy: 0.4800 - val_loss: 1.7229 Epoch 166/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 69ms/step - accuracy: 0.9561 - loss: 0.2309 - val_accuracy: 0.4800 - val_loss: 1.7322 Epoch 167/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 53ms/step - accuracy: 0.9562 - loss: 0.1805 - val_accuracy: 0.4800 - val_loss: 1.7309 Epoch 168/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9108 - loss: 0.2556 - val_accuracy: 0.4800 - val_loss: 1.7441 Epoch 169/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9506 - loss: 0.2279 - val_accuracy: 0.4800 - val_loss: 1.7353 Epoch 170/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9128 - loss: 0.2868 - val_accuracy: 0.4400 - val_loss: 1.7180 Epoch 171/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9803 - loss: 0.2242 - val_accuracy: 0.4400 - val_loss: 1.6997 
Epoch 172/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9495 - loss: 0.2058 - val_accuracy: 0.4400 - val_loss: 1.6965 Epoch 173/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9438 - loss: 0.1777 - val_accuracy: 0.4400 - val_loss: 1.6893 Epoch 174/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9884 - loss: 0.1455 - val_accuracy: 0.4400 - val_loss: 1.7074 Epoch 175/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9292 - loss: 0.2527 - val_accuracy: 0.4400 - val_loss: 1.7490 Epoch 176/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9555 - loss: 0.2185 - val_accuracy: 0.4400 - val_loss: 1.7617 Epoch 177/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9338 - loss: 0.2598 - val_accuracy: 0.4400 - val_loss: 1.7649 Epoch 178/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9188 - loss: 0.2335 - val_accuracy: 0.4800 - val_loss: 1.7374 Epoch 179/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9222 - loss: 0.3004 - val_accuracy: 0.4800 - val_loss: 1.7207 Epoch 180/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9463 - loss: 0.2106 - val_accuracy: 0.5200 - val_loss: 1.7087 Epoch 181/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9651 - loss: 0.1502 - val_accuracy: 0.5200 - val_loss: 1.7053 Epoch 182/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9210 - loss: 0.2235 - val_accuracy: 0.5200 - val_loss: 1.6973 Epoch 183/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9196 - loss: 0.1936 - val_accuracy: 0.5200 - val_loss: 1.6886 Epoch 184/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9409 - loss: 0.2254 - val_accuracy: 0.5200 - val_loss: 1.6655 Epoch 185/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 31ms/step - accuracy: 0.9876 - loss: 0.1707 - val_accuracy: 0.5200 - val_loss: 1.6473 Epoch 186/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 32ms/step - accuracy: 0.9669 - loss: 0.1621 - val_accuracy: 0.5200 - val_loss: 1.6206 Epoch 187/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - 
accuracy: 0.9511 - loss: 0.1728 - val_accuracy: 0.5200 - val_loss: 1.6003 Epoch 188/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9477 - loss: 0.2683 - val_accuracy: 0.5200 - val_loss: 1.6101 Epoch 189/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9678 - loss: 0.1462 - val_accuracy: 0.4400 - val_loss: 1.6247 Epoch 190/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9532 - loss: 0.1809 - val_accuracy: 0.4400 - val_loss: 1.6527 Epoch 191/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9863 - loss: 0.1381 - val_accuracy: 0.4400 - val_loss: 1.6660 Epoch 192/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 73ms/step - accuracy: 0.9590 - loss: 0.1500 - val_accuracy: 0.4400 - val_loss: 1.6612 Epoch 193/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 1s 96ms/step - accuracy: 0.9479 - loss: 0.2039 - val_accuracy: 0.4800 - val_loss: 1.6578 Epoch 194/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 57ms/step - accuracy: 0.9614 - loss: 0.1535 - val_accuracy: 0.5600 - val_loss: 1.6446 Epoch 195/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9575 - loss: 0.1891 - val_accuracy: 0.5600 - val_loss: 1.6032 Epoch 196/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9718 - loss: 0.1692 - val_accuracy: 0.5200 - val_loss: 1.6021 Epoch 197/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9423 - loss: 0.1646 - val_accuracy: 0.5200 - val_loss: 1.6133 Epoch 198/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9836 - loss: 0.1705 - val_accuracy: 0.5200 - val_loss: 1.6002 Epoch 199/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9479 - loss: 0.2228 - val_accuracy: 0.4800 - val_loss: 1.6044 Epoch 200/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9711 - loss: 0.1249 - val_accuracy: 0.4800 - val_loss: 1.6158 Epoch 201/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9540 - loss: 0.1728 - val_accuracy: 0.4800 - val_loss: 1.6074 Epoch 202/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9933 - loss: 0.0841 - val_accuracy: 0.4800 - 
val_loss: 1.5881 Epoch 203/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9726 - loss: 0.1865 - val_accuracy: 0.4800 - val_loss: 1.6038 Epoch 204/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9553 - loss: 0.1842 - val_accuracy: 0.4800 - val_loss: 1.5920 Epoch 205/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9493 - loss: 0.1863 - val_accuracy: 0.4800 - val_loss: 1.5847 Epoch 206/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9402 - loss: 0.1633 - val_accuracy: 0.4800 - val_loss: 1.5903 Epoch 207/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9622 - loss: 0.1467 - val_accuracy: 0.4800 - val_loss: 1.6174 Epoch 208/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9698 - loss: 0.1341 - val_accuracy: 0.4800 - val_loss: 1.6443 Epoch 209/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9504 - loss: 0.1336 - val_accuracy: 0.5200 - val_loss: 1.6680 Epoch 210/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9857 - loss: 0.1032 - val_accuracy: 0.5200 - val_loss: 1.6796 Epoch 211/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9291 - loss: 0.1857 - val_accuracy: 0.5200 - val_loss: 1.6872 Epoch 212/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9590 - loss: 0.1716 - val_accuracy: 0.5200 - val_loss: 1.7035 Epoch 213/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 81ms/step - accuracy: 0.9760 - loss: 0.1141 - val_accuracy: 0.5200 - val_loss: 1.7204 Epoch 214/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9284 - loss: 0.2288 - val_accuracy: 0.5200 - val_loss: 1.7312 Epoch 215/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 51ms/step - accuracy: 0.9527 - loss: 0.1930 - val_accuracy: 0.5200 - val_loss: 1.7652 Epoch 216/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9738 - loss: 0.1549 - val_accuracy: 0.5200 - val_loss: 1.8075 Epoch 217/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9699 - loss: 0.1520 - val_accuracy: 0.5200 - val_loss: 1.8445 Epoch 218/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 
51ms/step - accuracy: 0.9945 - loss: 0.1088 - val_accuracy: 0.4800 - val_loss: 1.8559 Epoch 219/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 54ms/step - accuracy: 0.9535 - loss: 0.1899 - val_accuracy: 0.4800 - val_loss: 1.8550 Epoch 220/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 66ms/step - accuracy: 0.9788 - loss: 0.1247 - val_accuracy: 0.4800 - val_loss: 1.8441 Epoch 221/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 52ms/step - accuracy: 0.9739 - loss: 0.1227 - val_accuracy: 0.4800 - val_loss: 1.8228 Epoch 222/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 50ms/step - accuracy: 0.9835 - loss: 0.1408 - val_accuracy: 0.4800 - val_loss: 1.8083 Epoch 223/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 60ms/step - accuracy: 0.9698 - loss: 0.1506 - val_accuracy: 0.4800 - val_loss: 1.7870 Epoch 224/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 48ms/step - accuracy: 0.9759 - loss: 0.1435 - val_accuracy: 0.4800 - val_loss: 1.7989 Epoch 225/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9678 - loss: 0.1286 - val_accuracy: 0.4800 - val_loss: 1.8306 Epoch 226/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9449 - loss: 0.1851 - val_accuracy: 0.4800 - val_loss: 1.8477 Epoch 227/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9760 - loss: 0.1054 - val_accuracy: 0.4800 - val_loss: 1.8515 Epoch 228/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9747 - loss: 0.1880 - val_accuracy: 0.4800 - val_loss: 1.8326 Epoch 229/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9918 - loss: 0.1185 - val_accuracy: 0.4800 - val_loss: 1.8428 Epoch 230/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9739 - loss: 0.1384 - val_accuracy: 0.4800 - val_loss: 1.8525 Epoch 231/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 50ms/step - accuracy: 0.9664 - loss: 0.1139 - val_accuracy: 0.4800 - val_loss: 1.8439 Epoch 232/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 52ms/step - accuracy: 0.9753 - loss: 0.1252 - val_accuracy: 0.4800 - val_loss: 1.8617 Epoch 233/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9256 - loss: 0.1978 - val_accuracy: 
0.4800 - val_loss: 1.8797 Epoch 234/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 43ms/step - accuracy: 0.9554 - loss: 0.1903 - val_accuracy: 0.4800 - val_loss: 1.8780 Epoch 235/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9574 - loss: 0.2129 - val_accuracy: 0.4800 - val_loss: 1.8709 Epoch 236/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9325 - loss: 0.1728 - val_accuracy: 0.4800 - val_loss: 1.8570 Epoch 237/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 57ms/step - accuracy: 0.9458 - loss: 0.1701 - val_accuracy: 0.4800 - val_loss: 1.9068 Epoch 238/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 64ms/step - accuracy: 0.9463 - loss: 0.1629 - val_accuracy: 0.5200 - val_loss: 1.9738 Epoch 239/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 53ms/step - accuracy: 0.9711 - loss: 0.1547 - val_accuracy: 0.5200 - val_loss: 2.0086 Epoch 240/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9739 - loss: 0.0884 - val_accuracy: 0.5200 - val_loss: 2.0243 Epoch 241/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9307 - loss: 0.1824 - val_accuracy: 0.5200 - val_loss: 2.0102 Epoch 242/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 0.9884 - loss: 0.1170 - val_accuracy: 0.5200 - val_loss: 1.9741 Epoch 243/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9635 - loss: 0.1071 - val_accuracy: 0.5200 - val_loss: 1.9609 Epoch 244/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9629 - loss: 0.1138 - val_accuracy: 0.5200 - val_loss: 1.9346 Epoch 245/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9863 - loss: 0.1080 - val_accuracy: 0.4800 - val_loss: 1.9111 Epoch 246/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9196 - loss: 0.1743 - val_accuracy: 0.4800 - val_loss: 1.8706 Epoch 247/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9381 - loss: 0.2220 - val_accuracy: 0.4800 - val_loss: 1.8619 Epoch 248/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9863 - loss: 0.0968 - val_accuracy: 0.4800 - val_loss: 1.9104 Epoch 249/300 5/5 
━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9876 - loss: 0.0914 - val_accuracy: 0.4800 - val_loss: 1.9224 Epoch 250/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 81ms/step - accuracy: 0.9344 - loss: 0.1362 - val_accuracy: 0.4800 - val_loss: 1.9127 Epoch 251/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 50ms/step - accuracy: 0.9842 - loss: 0.0840 - val_accuracy: 0.4800 - val_loss: 1.8962 Epoch 252/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9594 - loss: 0.1806 - val_accuracy: 0.4800 - val_loss: 1.9164 Epoch 253/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9710 - loss: 0.1100 - val_accuracy: 0.4800 - val_loss: 1.9299 Epoch 254/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 1.0000 - loss: 0.0979 - val_accuracy: 0.4800 - val_loss: 1.9448 Epoch 255/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9945 - loss: 0.1164 - val_accuracy: 0.4800 - val_loss: 1.9665 Epoch 256/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9637 - loss: 0.1083 - val_accuracy: 0.4400 - val_loss: 2.0186 Epoch 257/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9602 - loss: 0.1259 - val_accuracy: 0.4400 - val_loss: 2.0721 Epoch 258/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 35ms/step - accuracy: 0.9822 - loss: 0.1219 - val_accuracy: 0.4400 - val_loss: 2.1088 Epoch 259/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9945 - loss: 0.0757 - val_accuracy: 0.4400 - val_loss: 2.1283 Epoch 260/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 38ms/step - accuracy: 0.9340 - loss: 0.1783 - val_accuracy: 0.4400 - val_loss: 2.1650 Epoch 261/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9554 - loss: 0.1607 - val_accuracy: 0.4400 - val_loss: 2.2219 Epoch 262/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9705 - loss: 0.1281 - val_accuracy: 0.4800 - val_loss: 2.2508 Epoch 263/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 33ms/step - accuracy: 0.9706 - loss: 0.1552 - val_accuracy: 0.4400 - val_loss: 2.2681 Epoch 264/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9747 - loss: 
0.1009 - val_accuracy: 0.4400 - val_loss: 2.2611 Epoch 265/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9644 - loss: 0.1111 - val_accuracy: 0.4400 - val_loss: 2.2453 Epoch 266/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9491 - loss: 0.1557 - val_accuracy: 0.4400 - val_loss: 2.2341 Epoch 267/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 39ms/step - accuracy: 0.9615 - loss: 0.1102 - val_accuracy: 0.4400 - val_loss: 2.2281 Epoch 268/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 66ms/step - accuracy: 0.9842 - loss: 0.0795 - val_accuracy: 0.4400 - val_loss: 2.2207 Epoch 269/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 52ms/step - accuracy: 0.9292 - loss: 0.1738 - val_accuracy: 0.4000 - val_loss: 2.2206 Epoch 270/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 62ms/step - accuracy: 0.9713 - loss: 0.1195 - val_accuracy: 0.4000 - val_loss: 2.2123 Epoch 271/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 57ms/step - accuracy: 0.9945 - loss: 0.0831 - val_accuracy: 0.4000 - val_loss: 2.2044 Epoch 272/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 2s 442ms/step - accuracy: 1.0000 - loss: 0.0836 - val_accuracy: 0.4000 - val_loss: 2.1910 Epoch 273/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 1s 49ms/step - accuracy: 0.9918 - loss: 0.0675 - val_accuracy: 0.4000 - val_loss: 2.2084 Epoch 274/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 70ms/step - accuracy: 0.9726 - loss: 0.0856 - val_accuracy: 0.4400 - val_loss: 2.2216 Epoch 275/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 77ms/step - accuracy: 0.9863 - loss: 0.0882 - val_accuracy: 0.4400 - val_loss: 2.2113 Epoch 276/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 1s 101ms/step - accuracy: 0.9918 - loss: 0.0969 - val_accuracy: 0.4400 - val_loss: 2.2084 Epoch 277/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 58ms/step - accuracy: 0.9918 - loss: 0.0748 - val_accuracy: 0.4400 - val_loss: 2.1908 Epoch 278/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 46ms/step - accuracy: 0.9752 - loss: 0.0963 - val_accuracy: 0.4400 - val_loss: 2.1743 Epoch 279/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 51ms/step - accuracy: 0.9657 - loss: 0.1162 - val_accuracy: 0.4400 - val_loss: 2.1678 Epoch 
280/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 74ms/step - accuracy: 0.9793 - loss: 0.1745 - val_accuracy: 0.4400 - val_loss: 2.1788 Epoch 281/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 1s 63ms/step - accuracy: 0.9918 - loss: 0.0517 - val_accuracy: 0.4400 - val_loss: 2.1730 Epoch 282/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 41ms/step - accuracy: 0.9768 - loss: 0.0833 - val_accuracy: 0.4400 - val_loss: 2.1782 Epoch 283/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 45ms/step - accuracy: 0.9793 - loss: 0.1135 - val_accuracy: 0.4800 - val_loss: 2.1915 Epoch 284/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 42ms/step - accuracy: 0.9945 - loss: 0.0743 - val_accuracy: 0.4800 - val_loss: 2.1888 Epoch 285/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 34ms/step - accuracy: 0.9739 - loss: 0.0826 - val_accuracy: 0.4800 - val_loss: 2.2076 Epoch 286/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9623 - loss: 0.1132 - val_accuracy: 0.4800 - val_loss: 2.2208 Epoch 287/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 0.9635 - loss: 0.1063 - val_accuracy: 0.4400 - val_loss: 2.1842 Epoch 288/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9685 - loss: 0.1178 - val_accuracy: 0.4400 - val_loss: 2.1927 Epoch 289/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9739 - loss: 0.0678 - val_accuracy: 0.4400 - val_loss: 2.2109 Epoch 290/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 59ms/step - accuracy: 0.9793 - loss: 0.1054 - val_accuracy: 0.4400 - val_loss: 2.2429 Epoch 291/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 48ms/step - accuracy: 1.0000 - loss: 0.0708 - val_accuracy: 0.4800 - val_loss: 2.2582 Epoch 292/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 40ms/step - accuracy: 0.9490 - loss: 0.1120 - val_accuracy: 0.4800 - val_loss: 2.2430 Epoch 293/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 64ms/step - accuracy: 0.9568 - loss: 0.1357 - val_accuracy: 0.4800 - val_loss: 2.2467 Epoch 294/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 70ms/step - accuracy: 1.0000 - loss: 0.0650 - val_accuracy: 0.4800 - val_loss: 2.2414 Epoch 295/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 71ms/step - accuracy: 
0.9966 - loss: 0.0592 - val_accuracy: 0.4800 - val_loss: 2.2279 Epoch 296/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 90ms/step - accuracy: 0.9670 - loss: 0.1184 - val_accuracy: 0.5200 - val_loss: 2.1991 Epoch 297/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 46ms/step - accuracy: 1.0000 - loss: 0.0659 - val_accuracy: 0.5200 - val_loss: 2.1816 Epoch 298/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 48ms/step - accuracy: 0.9586 - loss: 0.1058 - val_accuracy: 0.5200 - val_loss: 2.1999 Epoch 299/300 5/5 ━━━��━━━━━━━━━━━━━━━━ 0s 37ms/step - accuracy: 0.9863 - loss: 0.0953 - val_accuracy: 0.5200 - val_loss: 2.2400 Epoch 300/300 5/5 ━━━���━━━━━━━━━━━━━━━━ 0s 36ms/step - accuracy: 0.9822 - loss: 0.1217 - val_accuracy: 0.5200 - val_loss: 2.2763 16/16 ━━��━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.6417 - loss: 1.2759
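The log shows training accuracy approaching 1.0 while validation loss keeps climbing after the first few dozen epochs, i.e. the network overfits. Keras offers `keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)` to stop training in that situation; its core idea can be sketched in plain Python (the function name is ours, not the callback's actual implementation):

```python
def early_stop_index(val_losses, patience=5):
    """Return the epoch index at which training would stop:
    the first epoch that is `patience` epochs past the best
    validation loss seen so far (or the last epoch if the
    loss keeps improving)."""
    best, best_i = float("inf"), 0
    for i, loss in enumerate(val_losses):
        if loss < best:
            best, best_i = loss, i          # new best: reset the counter
        elif i - best_i >= patience:
            return i                        # no improvement for `patience` epochs
    return len(val_losses) - 1

# toy validation curve: improves for 3 epochs, then degrades
print(early_stop_index([1.0, 0.8, 0.7, 0.75, 0.9, 1.1, 1.2, 1.3], patience=5))  # → 7
```

With a curve like the one in the log above (best validation loss around epoch 97, rising afterwards), early stopping would have ended training far before epoch 300.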
WARNING:absl:You are saving your model as an HDF5 file via `model.save()` or `keras.saving.save_model(model)`. This file format is considered legacy. We recommend using instead the native Keras format, e.g. `model.save('my_model.keras')` or `keras.saving.save_model(model, 'my_model.keras')`.
Results - random forest
Accuracy = 0.548, MAE = 0.955, Chance = 0.167
Results - ANN
Test score: 1.1399775743484497
Test accuracy: 0.7419354915618896
[INFO] == Command exit (modification check follows) =====
get(ok): data/raw/a.npy (file) [from origin...]
get(ok): data/raw/participants.csv (file) [from origin...]
run(ok): /Users/peerherholz/google_drive/GitHub/repronim_ML/repronim_ml_showcase/ml_showcase (dataset) [/Users/peerherholz/anaconda3/envs/repron...]
add(ok): .keras/keras.json (file)
add(ok): ANN.h5 (file)
add(ok): metrics.json (file)
add(ok): random_forest.joblib (file)
save(ok): . (dataset)
action summary:
  add (ok: 4)
  get (notneeded: 3, ok: 2)
  run (ok: 1)
  save (notneeded: 1, ok: 1)
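As a quick sanity check on the numbers above: the reported chance level of 0.167 is what we would expect if the age labels were binned into six balanced classes, since a random guesser then gets each prediction right one time in six. A minimal sketch (the six-class assumption is inferred from the 0.167 value, not stated explicitly in the output):

```python
# Hypothetical check: assuming the age labels form 6 balanced classes,
# a random guess is correct with probability 1/6.
n_classes = 6
chance = 1 / n_classes
print(round(chance, 3))  # 0.167 - matches the "Chance" value reported above
```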
Not only can we obtain reproducible results via the combination of software containers and seeding:
%%bash
cd repronim_ml_showcase/ml_showcase/
cat metrics.json
{"accuracy": 0.548, "MAE": 0.955, "Chance": 0.167, "Test score": 1.1399775743484497, "Test accuracy": 0.7419354915618896}
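The seeding half of that combination can be sketched as follows: fixing the seeds of all random number generators involved makes stochastic steps (weight initialization, data shuffling, train/test splits, etc.) repeatable across runs. A minimal sketch using the stdlib and numpy; the seed value is arbitrary, and a real analysis script would typically also seed the deep learning framework in use (e.g. tensorflow):

```python
import random

import numpy as np

SEED = 42  # hypothetical seed value; any fixed integer works


def seeded_draw(seed):
    # Fix the seeds of the random number generators in use
    random.seed(seed)
    np.random.seed(seed)
    # Any stochastic computation after this point yields
    # identical results across runs with the same seed
    return np.random.permutation(10)


run_1 = seeded_draw(SEED)
run_2 = seeded_draw(SEED)
print(np.array_equal(run_1, run_2))  # True: same seed, same result
```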
but we also get a full log of everything that happened in and to our dataset.
%%bash
cd repronim_ml_showcase/ml_showcase/
git log
commit 4df3c2dc52f0eeaa032bc511734ad5b07f3db3c7
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 18:07:06 2025 +0800

    [DATALAD RUNCMD] First run of ML analyses

    === Do not change lines below ===
    {
     "chain": [],
     "cmd": "/Users/peerherholz/anaconda3/envs/repronim_ml/bin/python3.1 -m datalad_container.adapters.docker run .datalad/environments/repronim-ml-container/image code/ml_reproducibility.py",
     "dsid": "a43e7287-28c4-49d7-b21a-6e56835f7276",
     "exit": 0,
     "extra_inputs": [
      ".datalad/environments/repronim-ml-container/image"
     ],
     "inputs": [
      "data/raw/"
     ],
     "outputs": [
      "metrics.json",
      "random_forest.joblib",
      "ANN.h5"
     ],
     "pwd": "."
    }
    ^^^ Do not change lines above ^^^

commit 93986fdf93b5132b6d0862d01c573963f9466c53
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 18:04:58 2025 +0800

    add random forest & ANN script

commit 3e890457c28d947720d5b7dc7246896b8962ef4d
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 18:04:53 2025 +0800

    [DATALAD] Configure containerized environment 'repronim-ml-container'

commit d60ea93b3117e19e10efda6be7a5450335933216
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 17:59:45 2025 +0800

    [DATALAD] Added subdataset

commit 5315b9c5dec9f8f2543b813ba6fed6f59f9688b7
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 17:59:40 2025 +0800

    Apply YODA dataset setup

commit 1a23084f7cd2a238d9d1e73903e7f84949718bdf
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 17:59:38 2025 +0800

    Instruct annex to add text files to Git

commit 469332868379d11b986f0b752667eaa9a326dadf
Author: Peer Herholz <herholz.peer@gmail.com>
Date:   Tue Jun 17 17:59:35 2025 +0800

    [DATALAD] new dataset
This would allow us to rerun a specific analysis using the same computational environment and data, as well as track quite a bit of the variability introduced by changes!
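What makes this possible is that the run record DataLad embeds in the commit message (between the "Do not change lines" markers) is plain JSON: `datalad rerun <commit>` reads it back and re-executes the stored command with the recorded inputs and outputs. The record can also be inspected programmatically; a minimal sketch parsing a shortened copy of the record above (the `cmd` value is abbreviated here for readability):

```python
import json

# Shortened copy of the DataLad run record shown in the git log above;
# the "cmd" value is abbreviated ("...") for readability.
record = json.loads("""
{
  "cmd": "python -m datalad_container.adapters.docker run ... code/ml_reproducibility.py",
  "inputs": ["data/raw/"],
  "outputs": ["metrics.json", "random_forest.joblib", "ANN.h5"],
  "exit": 0
}
""")

# Everything needed to re-execute the analysis is machine-readable
print(record["cmd"])
print(record["outputs"])
```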
There are also other tools out there, more tailored towards machine learning. For example, mlflow:
https://mlflow.org/docs/latest/tracking.html
adapted from [Martina Vilas](https://doi.org/10.5281/zenodo.4740053)
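The core idea behind mlflow's tracking component is to persist parameters, metrics and artifacts per run so that experiments stay comparable. As a rough, library-free sketch of that idea (this is not mlflow's actual API; `log_run` and its arguments are made-up names for illustration):

```python
import json
import time
from pathlib import Path


def log_run(run_dir, params, metrics):
    """Persist one experiment run: its parameters, metrics and a timestamp.

    A stdlib-only sketch of what tracking tools like mlflow automate;
    `run_dir`, `params` and `metrics` are hypothetical names.
    """
    run_dir = Path(run_dir)
    run_dir.mkdir(parents=True, exist_ok=True)
    record = {"timestamp": time.time(), "params": params, "metrics": metrics}
    (run_dir / "run.json").write_text(json.dumps(record, indent=2))
    return record


# Log one run with the random forest results from the analysis above
record = log_run(
    "runs/run_001",
    params={"model": "random_forest"},
    metrics={"accuracy": 0.548, "MAE": 0.955},
)
print(sorted(record["metrics"]))  # ['MAE', 'accuracy']
```

mlflow adds run comparison, a web UI and artifact storage on top of this pattern, which is why it is worth reaching for once an analysis involves more than a handful of runs.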
https://mlflow.org/docs/latest/projects.html
adapted from [Martina Vilas](https://doi.org/10.5281/zenodo.4740053)
Now that we have addressed a good amount of the introduced challenges to reproducible machine learning, including software, algorithms/practices/processes and data, we have already achieved a fair level of reproducibility. However, we can do even more!
data & models¶
To achieve reproducibility and FAIR-ness, the data and models, basically everything involved in the analyses, need to be shared:
- this allows others to reproduce the analyses or adapt them, and reduces computational costs
- there are various options to share data and different types thereof, both openly and with restricted access
- which option fits best depends on the data at hand

For example, we can store all things on GitHub, including the software/information re computing environments:
with the computing environment being stored/made available on dockerhub:
the code, i.e. the machine learning analyses scripts:
and the pre-trained models:
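In the analysis above the random forest was stored via joblib (`random_forest.joblib`) and the ANN as `ANN.h5`; sharing such files lets others reuse a model without retraining it. A minimal, stdlib-only sketch of the save/load round trip using pickle (joblib works analogously for scikit-learn estimators; the dict here is only a stand-in for a fitted model object):

```python
import pickle

# Stand-in for a trained model; a real analysis would dump the fitted
# estimator (e.g. a RandomForestClassifier) via joblib instead.
model = {"name": "random_forest", "n_estimators": 100}

# Save the "model" to disk so it can be shared (e.g. via GitHub or OSF)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Anyone with the file can restore the exact same object
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored == model)  # True
```

Note that pickle/joblib files should only be loaded from sources you trust, since unpickling can execute arbitrary code.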
Speaking of which: there are amazing projects out there that bring this to the next level, e.g. nobrainer:
Adjacent to that, it's of course required to share as much about the analyses as possible when including them in a publication, etc. There are cool & helpful checklists out there:
https://www.cs.mcgill.ca/~jpineau/ReproducibilityChecklist.pdf
adapted from [Martina Vilas](https://doi.org/10.5281/zenodo.4740053)
See? There are many things we can do to make our machine learning analyses more reproducible. Unfortunately, all of this won't guarantee full reproducibility. Thus, always make sure to check things constantly!
However, for now let's briefly summarize what we talked about!
Reproducible machine learning - a summary¶
There are also many other very useful resources out there!
Isdahl & Gunderson (2019)
https://www.cs.mcgill.ca/~ksinha4/practices_for_reproducibility/
adapted from [Martina Vilas](https://doi.org/10.5281/zenodo.4740053)
via [GIPHY](https://media3.giphy.com/media/iJ2cRDeQkcPXZiHh53/giphy.gif)
Thank you all for your attention! If you have any questions, please don't hesitate to ask or contact me via herholz dot peer at gmail dot com or via "social media":
@peerherholz
Make sure to also check the ReproNim
website: https://www.repronim.org/