I have a problem that involves predicting two outputs from a vector of predictors. Assume a predictor vector looks like (x1, y1, att1, att2, ..., attn), where x1, y1 are coordinates and the att's are other attributes attached to the occurrence of those coordinates. Based on this predictor set I want to predict x2, y2. This is a time series problem, which I am trying to solve using multiple regression. My question is: how do I set up Keras so that it gives me 2 outputs in the final layer?
Neural network models can be configured for multi-output regression tasks.
Multi-output models predict two or more outputs simultaneously for each input; most standard models, by contrast, return only a single value per prediction.
Many machine learning algorithms are designed to predict a single numeric value, referred to simply as regression. Some algorithms, such as linear regression and decision trees, support multi-output regression inherently, as shown in the sketch below.
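As a point of reference outside Keras, here is a minimal sketch of inherent multi-output regression in scikit-learn. The synthetic data shapes are assumptions mirroring the question (10 predictors, 2 targets):

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# hypothetical data shaped like the question: [x1, y1, att1, ..., att8] -> [x2, y2]
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = rng.normal(size=(100, 2))  # two target columns: x2 and y2

# both estimators accept a two-column target directly
lin = LinearRegression().fit(X, y)
tree = DecisionTreeRegressor().fit(X, y)

print(lin.predict(X[:1]))   # shape (1, 2): predicted x2, y2
print(tree.predict(X[:1]))  # shape (1, 2)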
In Keras, the same idea is expressed by forking the layer graph into two output heads:

from keras.models import Model
from keras.layers import Input, Dense

# inp is a "tensor" that can be passed when calling other layers to produce an output
inp = Input((10,))  # assuming you have ten numeric values as input

# each Dense(...) call defines a layer, and calling it with (inp) produces the output tensor x
x = Dense(64, activation='relu')(inp)
# x is simply reassigned here, because the intermediate output is not interesting to keep
x = Dense(64, activation='relu')(x)

# keep the two different outputs for defining the model;
# notice that both heads are called with the same input x, creating a fork
out1 = Dense(1)(x)  # left side: predicts x2
out2 = Dense(1)(x)  # right side: predicts y2

# this defines which paths in the graph you've drawn with layers the model will follow;
# the two outputs are passed in a list, telling the model it has two outputs
model = Model(inp, [out1, out2])

# loss can be a single loss for both sides or a list with different loss functions for out1 and out2
model.compile(optimizer=..., loss=...)
model.fit(inputData, [outputYLeft, outputYRight], epochs=..., batch_size=...)
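If both targets share the same scale and loss, a single output layer with two units is often the simplest setup for the x2, y2 case. A self-contained sketch; the synthetic data, layer sizes, and hyperparameters are assumptions:

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense

# hypothetical data matching the question: 100 samples of [x1, y1, att1, ..., att8]
X = np.random.rand(100, 10)
Y = np.random.rand(100, 2)  # targets stacked as two columns: [x2, y2]

inp = Input((10,))
h = Dense(64, activation='relu')(inp)
out = Dense(2)(h)  # one head with two linear units: x2 and y2

model = Model(inp, out)
model.compile(optimizer='adam', loss='mse')
model.fit(X, Y, epochs=10, batch_size=16)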
You can make a model with multiple outputs either with the Functional API or by subclassing tf.keras.Model.
Here's an example of dual outputs (regression and classification) on the Iris Dataset, using the Functional API:
from sklearn.datasets import load_iris
from tensorflow.keras.layers import Dense
from tensorflow.keras import Input, Model
import tensorflow as tf

data, target = load_iris(return_X_y=True)
X = data[:, (0, 1, 2)]  # first three features as inputs
Y = data[:, 3]          # fourth feature as the regression target
Z = target              # class labels as the classification target

inputs = Input(shape=(3,), name='input')
x = Dense(16, activation='relu', name='16')(inputs)
x = Dense(32, activation='relu', name='32')(x)
output1 = Dense(1, name='cont_out')(x)
output2 = Dense(3, activation='softmax', name='cat_out')(x)

model = Model(inputs=inputs, outputs=[output1, output2])

model.compile(loss={'cont_out': 'mean_absolute_error',
                    'cat_out': 'sparse_categorical_crossentropy'},
              optimizer='adam',
              metrics={'cat_out': tf.metrics.SparseCategoricalAccuracy(name='acc')})

history = model.fit(X, {'cont_out': Y, 'cat_out': Z}, epochs=10, batch_size=8)
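For reference, predict on a multi-output Functional model returns one array per output, in the order they were passed to Model(...); a short usage sketch:

# one array per output head
cont_pred, cat_pred = model.predict(X)
print(cont_pred.shape)  # (150, 1): regression predictions
print(cat_pred.shape)   # (150, 3): class probabilities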
Here's a simplified version:

from sklearn.datasets import load_iris
from tensorflow.keras.layers import Dense
from tensorflow.keras import Input, Model

data, target = load_iris(return_X_y=True)
X = data[:, (0, 1, 2)]
Y = data[:, 3]
Z = target

inputs = Input(shape=(3,))
x = Dense(16, activation='relu')(inputs)
x = Dense(32, activation='relu')(x)
output1 = Dense(1)(x)
output2 = Dense(3, activation='softmax')(x)

model = Model(inputs=inputs, outputs=[output1, output2])
model.compile(loss=['mae', 'sparse_categorical_crossentropy'], optimizer='adam')

history = model.fit(X, [Y, Z], epochs=10, batch_size=8)
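When one objective should count for more than the other, compile also accepts a loss_weights argument; a small sketch, where the 0.5/1.0 weighting is an arbitrary assumption:

# each loss is scaled by its weight before being summed into the total training loss;
# the specific weights here are illustrative, not recommendations
model.compile(loss=['mae', 'sparse_categorical_crossentropy'],
              loss_weights=[0.5, 1.0],
              optimizer='adam')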
Here's the same example, subclassing tf.keras.Model and with a custom training loop:

import tensorflow as tf
from tensorflow.keras.layers import Dense
from tensorflow.keras import Model
from sklearn.datasets import load_iris

tf.keras.backend.set_floatx('float64')

iris, target = load_iris(return_X_y=True)
X = iris[:, :3]
y = iris[:, 3]
z = target

ds = tf.data.Dataset.from_tensor_slices((X, y, z)).shuffle(150).batch(8)

class MyModel(Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.d0 = Dense(16, activation='relu')
        self.d1 = Dense(32, activation='relu')
        self.d2 = Dense(1)                        # regression head
        self.d3 = Dense(3, activation='softmax')  # classification head

    def call(self, x, training=None, **kwargs):
        x = self.d0(x)
        x = self.d1(x)
        a = self.d2(x)
        b = self.d3(x)
        return a, b

model = MyModel()

loss_obj_reg = tf.keras.losses.MeanAbsoluteError()
loss_obj_cat = tf.keras.losses.SparseCategoricalCrossentropy()

optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)

loss_reg = tf.keras.metrics.Mean(name='regression_loss')
loss_cat = tf.keras.metrics.Mean(name='categorical_loss')

error_reg = tf.keras.metrics.MeanAbsoluteError()
error_cat = tf.keras.metrics.SparseCategoricalAccuracy()

@tf.function
def train_step(inputs, y_reg, y_cat):
    with tf.GradientTape() as tape:
        pred_reg, pred_cat = model(inputs)
        reg_loss = loss_obj_reg(y_reg, pred_reg)
        cat_loss = loss_obj_cat(y_cat, pred_cat)
    # gradients of the summed losses with respect to all trainable variables
    gradients = tape.gradient([reg_loss, cat_loss], model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    loss_reg(reg_loss)
    loss_cat(cat_loss)
    error_reg(y_reg, pred_reg)
    error_cat(y_cat, pred_cat)

for epoch in range(50):
    for xx, yy, zz in ds:
        train_step(xx, yy, zz)
    template = 'Epoch {:>2}, SCCE: {:>5.2f}, MAE: {:>4.2f}, SAcc: {:>5.1%}'
    print(template.format(epoch + 1,
                          loss_cat.result(),
                          error_reg.result(),
                          error_cat.result()))
    loss_reg.reset_states()
    loss_cat.reset_states()
    error_reg.reset_states()
    error_cat.reset_states()
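After training, calling the subclassed model directly returns both heads as the tuple defined in call(), so inference is a simple unpack:

# call() returns (a, b), so unpack the two heads directly
pred_reg, pred_cat = model(X[:5])
print(pred_reg.shape)  # (5, 1)
print(pred_cat.shape)  # (5, 3)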