I'm new to neural nets (just a disclaimer).
I have a regression problem: predicting the strength of concrete from 8 features. The first thing I did was rescale the data using min-max normalization:
# Normalize all columns to the [0, 1] range
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

min_max = MinMaxScaler()
dataframe2 = pd.DataFrame(min_max.fit_transform(dataframe), columns=dataframe.columns)
I then converted the DataFrame to a NumPy array and split it into X_train, y_train, X_test and y_test, along these lines (the 80/20 ratio is arbitrary; the target is the last column in my data):
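import numpy as np
from sklearn.model_selection import train_test_split

data = dataframe2.to_numpy()   # scaled DataFrame -> NumPy array
X = data[:, :-1]               # the 8 features
y = data[:, -1]                # the target (concrete strength)

# hold out 20% of the rows for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

Now here is the Keras code for the network itself: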
from keras.models import Sequential
from keras.layers import Dense, Activation
# Set the parameters of the neural network
batch_size = 64
num_of_epochs = 40
hidden_layer_size = 256
model = Sequential()
model.add(Dense(hidden_layer_size, input_shape=(8, )))
model.add(Activation('relu'))
model.add(Dense(hidden_layer_size))
model.add(Activation('relu'))
model.add(Dense(hidden_layer_size))
model.add(Activation('relu'))
model.add(Dense(1))
model.add(Activation('linear'))
model.compile(loss='mean_squared_error',  # minimise mean squared error
              optimizer='adam',           # using the Adam optimiser
              metrics=['mae', 'mse'])     # track mean absolute error and mean squared error
model.fit(X_train, y_train, # Train the model using the training set...
batch_size=batch_size, epochs=num_of_epochs,
verbose=0, validation_split=0.1)
# All predictions in one array
predictions = model.predict(X_test)
Questions:
The predictions array will contain values in the scaled format (between 0 and 1), but obviously I need the predictions in their real units. How can I rescale those outputs back to the original values?
Is min-max normalization or z-score standardization more appropriate for regression problems? And where does 'batch normalization' fit into this?
Thank you,
As per the docs, the MinMaxScaler class has an inverse_transform method which does what you want:

inverse_transform(X): Undo the scaling of X according to feature_range.
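One wrinkle: the scaler was fitted on all nine columns (8 features plus the target), so inverse_transform expects an array of that same width. A minimal sketch of one way to handle that, assuming the strength target is the last column of the original dataframe:

import numpy as np

# Build an array as wide as the one the scaler was fitted on,
# put the scaled predictions in the target column, invert the
# scaling, and read that column back out.
padded = np.zeros((len(predictions), dataframe.shape[1]))
padded[:, -1] = predictions.ravel()
real_predictions = min_max.inverse_transform(padded)[:, -1]

Equivalently, since min-max scaling is just a linear map, you can undo it by hand with the scaler's fitted attributes: predictions.ravel() * (min_max.data_max_[-1] - min_max.data_min_[-1]) + min_max.data_min_[-1].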