 

How can neural networks learn functions with a variable number of inputs?

A simple example: given an input sequence, I want the neural network to output the median of the sequence. The problem is, if a neural network learnt to compute the median of n inputs, how can it compute the median of even more inputs? I know that recurrent neural networks can learn functions like max and parity over a sequence, but computing these functions only requires constant memory. What about functions whose memory requirement grows with the input size, like the median?
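To make the memory argument concrete: an exact streaming median must remember on the order of n values, unlike a streaming max, which needs only one. A minimal sketch of the standard two-heap technique (illustrative, not part of the original question):

```python
import heapq

class StreamingMedian:
    """Exact median over a stream. Memory grows as O(n),
    whereas a streaming max needs only O(1)."""

    def __init__(self):
        self.lo = []  # max-heap (values negated) for the lower half
        self.hi = []  # min-heap for the upper half

    def add(self, x):
        # Push into the lower half, then rebalance so that
        # len(lo) is always len(hi) or len(hi) + 1.
        heapq.heappush(self.lo, -x)
        heapq.heappush(self.hi, -heapq.heappop(self.lo))
        if len(self.hi) > len(self.lo):
            heapq.heappush(self.lo, -heapq.heappop(self.hi))

    def median(self):
        if len(self.lo) > len(self.hi):
            return -self.lo[0]
        return (-self.lo[0] + self.hi[0]) / 2
```

Every element must be retained, because any of them could become the median later; this is the growing-memory property the question asks about.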

This is a follow-up question to How are neural networks used when the number of inputs could be variable?.

Hanhan Li asked Jun 29 '15 18:06


2 Answers

One idea I had is the following: treat each weight as a function of the number of inputs instead of a fixed value. A weight would then have several parameters defining that function, and we would train those parameters. For example, if we want the neural network to compute the average of n inputs, we would like each weight function to behave like 1/n. Again, the average per se can be computed with a recurrent neural network or a hidden Markov model, but I was hoping this kind of approach could be generalized to problems where the memory requirement grows.
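A toy sketch of this weight-as-a-function-of-n idea for the averaging case. All names here are hypothetical, and closed-form least squares stands in for gradient training of the weight function's parameters:

```python
import numpy as np

# For averaging n inputs with one shared weight, the ideal weight
# is 1/n. Parameterize the weight as w(n) = a * (1/n) + b and fit
# the scalars a, b from examples of varying length.
ns = np.arange(2, 21)
targets = 1.0 / ns                         # desired weight for each n
features = np.column_stack([1.0 / ns, np.ones_like(ns, dtype=float)])
(a, b), *_ = np.linalg.lstsq(features, targets, rcond=None)

def average_net(x):
    """'Network' that averages an input of any length by applying
    the learned length-dependent weight w(n) = a/n + b."""
    n = len(x)
    w = a / n + b
    return w * np.sum(x)
```

The fit recovers a ≈ 1, b ≈ 0, so the same two trained parameters handle every input length, including lengths never seen during fitting. Whether this generalizes beyond functions with such a simple length dependence is exactly the open part of the question.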

Hanhan Li answered Nov 13 '22 12:11


If a neural network learnt to compute the median of n inputs, how can it compute the median of even more inputs?

First of all, you should understand when a neural network is useful. We generally use neural networks for problems where a direct mathematical solution is not available. For this problem, a closed-form procedure exists, so using a NN is not advisable.

There are other problems of this nature, such as forecasting, in which continuous data arrives over time.

One solution to such problems can be a Hidden Markov Model (HMM). But again, such models depend on correlation between inputs over a period of time, so they are not effective when the input is completely random.

So, if the input is completely random and the memory requirement grows:

There is not much you can do about it; one possible solution is simply to grow your memory size.

Just remember one thing: NNs and similar machine learning models aim to extract meaningful structure from the data. If the data is just random values, then every model will produce random output.

saurabh agarwal answered Nov 13 '22 11:11