Is there any way to get variable importance with Keras?

I am looking for a proper or best way to get variable importance in a neural network created with Keras. The way I currently do it is to take the weights (not the biases) of the variables in the first layer, on the assumption that more important variables will have higher first-layer weights. Is there another/better way of doing it?
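For reference, a minimal sketch of that first-layer-weights idea, assuming a small hypothetical Sequential model (the architecture and feature count are illustrative only, not from the question):

```python
import numpy as np
from tensorflow import keras

# Hypothetical model: 4 input features, one hidden Dense layer.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])

# get_weights() on a Dense layer returns [kernel, bias];
# the kernel has shape (n_inputs, n_units).
kernel, bias = model.layers[0].get_weights()

# One score per input feature: mean absolute outgoing weight from that input.
importance = np.abs(kernel).mean(axis=1)
print(importance)
```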

asked May 22 '17 by user1367204

People also ask

What is feature importance in machine learning?

Feature Importance refers to techniques that calculate a score for all the input features for a given model — the scores simply represent the “importance” of each feature. A higher score means that the specific feature will have a larger effect on the model that is being used to predict a certain variable.

What is permutation importance?

Feature permutation importance measures the predictive value of a feature for any black box estimator, classifier, or regressor. It does this by evaluating how the prediction error increases when a feature is not available. Any scoring metric can be used to measure the prediction error.
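A rough sketch of that idea, not taken from the answer below: it simulates "feature not available" by the usual trick of shuffling that feature's column and measuring how much the error grows. `model`, `X_val`, and `y_val` are placeholders for a fitted regressor and held-out data, and mean squared error stands in for whatever scoring metric you prefer.

```python
import numpy as np

def permutation_importance(model, X_val, y_val, n_repeats=5, seed=0):
    """Score each feature by the increase in MSE when its column is shuffled."""
    rng = np.random.default_rng(seed)
    baseline = np.mean((model.predict(X_val).ravel() - y_val) ** 2)
    scores = np.zeros(X_val.shape[1])
    for j in range(X_val.shape[1]):
        increases = []
        for _ in range(n_repeats):
            X_perm = X_val.copy()
            rng.shuffle(X_perm[:, j])           # break the link between feature j and the target
            error = np.mean((model.predict(X_perm).ravel() - y_val) ** 2)
            increases.append(error - baseline)  # larger increase => more important feature
        scores[j] = np.mean(increases)
    return scores
```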


1 Answer

Since everything gets mixed up along the network, the first layer alone can't tell you the importance of each variable. Later layers can increase or decrease a variable's importance, and can even make one variable affect the importance of another. On top of that, every neuron in the first layer assigns each variable a different weight, so it's not that straightforward.

I suggest you run model.predict(inputs) using inputs made of zeros, with only the variable you want to study set to 1.

That way, you see the result for each variable alone. Even so, this still won't help with cases where one variable increases the importance of another.
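A small sketch of that probing idea, as an illustration only: `model` and `n_features` are placeholders for your trained Keras model and its input width.

```python
import numpy as np

def probe_features(model, n_features):
    """Predict on one-hot probes: all-zero inputs with a single feature set to 1."""
    probes = np.eye(n_features)    # row i activates only feature i
    return model.predict(probes)   # one output row per isolated feature

# Example usage (placeholders):
# responses = probe_features(model, n_features=10)
# for i, r in enumerate(responses):
#     print(f"feature {i}: response {r}")
```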

answered Sep 19 '22 by Daniel Möller