I have the following function which is supposed to autoencode my data.
My data can be thought of as an image with length 100, width 2, and 2 channels, i.e. shape (100, 2, 2).
def construct_ae(input_shape):
    encoder_input = tf.placeholder(tf.float32, input_shape, name='x')
    with tf.variable_scope("encoder"):
        flattened = tf.layers.flatten(encoder_input)
        e_fc_1 = tf.layers.dense(flattened, units=150, activation=tf.nn.relu)
        encoded = tf.layers.dense(e_fc_1, units=75, activation=None)
    with tf.variable_scope("decoder"):
        d_fc_1 = tf.layers.dense(encoded, 150, activation=tf.nn.relu)
        d_fc_2 = tf.layers.dense(d_fc_1, 400, activation=None)
        decoded = tf.reshape(d_fc_2, input_shape)
    with tf.variable_scope('training'):
        loss = tf.losses.mean_squared_error(labels=encoder_input, predictions=decoded)
        cost = tf.reduce_mean(loss)
        optimizer = tf.train.AdamOptimizer(learning_rate=0.001).minimize(cost)
    return optimizer
I'm running into an issue where my cost is on the order of 1.1e9 and it doesn't decrease over time.
I visualized the gradients (I've removed that code since it would just clutter things) and I think something is wrong there, but I'm not sure.
Questions
1) Does anything in the construction of the network look incorrect?
2) Does the data need to be normalized between 0-1?
3) I hit NaNs sometimes when I try increasing the learning rate to 1. Is that indicative of anything?
4) I think I should probably use a CNN, but I ran into the same issues there, so I thought I'd move to a fully connected network since it's likely easier to debug.
5) I imagine I'm using the wrong loss function, but I can't really find any papers regarding the right loss to use. If anyone can direct me to one, I'd be very appreciative.
Answer
As far as the high starting error is concerned, it largely depends on how your parameters are initialized. A good initialization technique gives you starting errors that are not too far from a desired minimum, whereas a default random or zeros-based initialization almost always leads to scenarios like this.
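To make that concrete, here is a minimal sketch, assuming the same TF 1.x tf.layers API used in the question. dense_with_init is a hypothetical helper, and the He-style variance-scaling initializer is just one common choice for ReLU layers, not a prescription:

import tensorflow as tf

def dense_with_init(x, units, activation=None):
    # Illustrative helper: same as tf.layers.dense, but with an explicit
    # kernel initializer. He-style initialization (variance scaling with
    # scale=2.0) is a common choice for ReLU layers; Glorot/Xavier
    # (tf.glorot_uniform_initializer()) is another option to try.
    return tf.layers.dense(
        x, units,
        activation=activation,
        kernel_initializer=tf.variance_scaling_initializer(scale=2.0),
        bias_initializer=tf.zeros_initializer())

# e.g. inside the encoder scope:
# e_fc_1 = dense_with_init(flattened, 150, activation=tf.nn.relu)

The same kernel_initializer argument could instead be passed directly to each tf.layers.dense call inside construct_ae; no helper is required.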