I've made a Keras LSTM model that reads in binary target values and is supposed to output binary predictions. However, the predictions aren't binary. A sample of my X and Y values is below:
X Y
5.06 0
4.09 1
4.72 0
4.57 0
4.44 1
6.98 1
What I'm trying to predict is if Xt+1 is going to be higher or lower than Xt. The Y value for Xt is 1 if Xt+1 is greater than Xt. My training X values are in the shape (932, 100, 1) for 932 samples, 100 for the "look back" sequence, and 1 for the features. The predictions I get look like:
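The labeling rule described above can be sketched in a few lines (illustrative only, not the asker's actual preprocessing code):

```python
import numpy as np

# Sample X values from the table above
x = np.array([5.06, 4.09, 4.72, 4.57, 4.44, 6.98])

# y[t] = 1 if x[t+1] > x[t], else 0
y = (x[1:] > x[:-1]).astype(int)
print(y)  # [0 1 0 0 1]
```

This reproduces the Y column for the sample rows (the last X value has no successor, so it gets no label).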
Predictions
.512
.514
.513
I'm thinking these might be probabilities as my model accuracy is around 51%. Any ideas as to how to get them to be binary? Full model code is below:
# Defining network architecture
import time

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Activation

def build_model(layers):
    model = Sequential()
    model.add(LSTM(
        layers[1],                        # units (was output_dim, deprecated)
        input_shape=(None, layers[0]),    # (timesteps, features); was input_dim
        return_sequences=True))
    model.add(Dropout(0.2))
    model.add(LSTM(
        layers[2],
        return_sequences=False))
    model.add(Dropout(0.2))
    model.add(Dense(layers[3]))           # was output_dim, deprecated
    model.add(Activation("sigmoid"))

    start = time.time()
    # class_mode is not a valid compile() argument and has been removed
    model.compile(loss="binary_crossentropy",
                  optimizer="rmsprop",
                  metrics=["accuracy"])
    print("> Compilation Time : ", time.time() - start)
    return model

# Compiling model
model = build_model([1, 100, 500, 1])
This is normal behavior.
There is no "binary" inside a neural network, only continuous functions bounded within limits.
Only with continuous (differentiable) functions can a model train and learn via stochastic gradient descent.
To approximate binary results, we use the sigmoid activation, which ranges from 0 to 1. But an untrained model has its weights initialised more or less randomly, so its outputs tend toward the middle of that range, which for a sigmoid is 0.5.
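A quick way to see why untrained outputs cluster near 0.5 (a minimal sketch, assuming NumPy):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# With near-zero pre-activations, as produced by freshly initialised
# weights, the sigmoid sits at the midpoint of its (0, 1) range:
print(sigmoid(0.0))  # 0.5
```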
All you need is to train your model with enough data for enough epochs; the outputs will then gradually approach (but never exactly reach) 0 or 1, or whatever targets "y" you have in your training data. If you need hard 0/1 predictions right now, threshold the sigmoid outputs at 0.5.
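Thresholding can be done directly on the array returned by model.predict() (a minimal sketch; the probability values here are just the sample outputs from the question plus two made-up ones):

```python
import numpy as np

# Hypothetical raw sigmoid outputs, e.g. probs = model.predict(X_test)
probs = np.array([0.512, 0.514, 0.513, 0.87, 0.12])

# Anything above 0.5 becomes class 1, everything else class 0
binary_preds = (probs > 0.5).astype(int)
print(binary_preds)  # [1 1 1 1 0]
```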