I am using a pipeline very similar to the one given in this example:
>>> text_clf = Pipeline([('vect', CountVectorizer()),
...                      ('tfidf', TfidfTransformer()),
...                      ('clf', MultinomialNB()),
... ])
over which I use GridSearchCV to find the best estimator over a parameter grid. However, I would like to get the column names of my training set with the get_feature_names() method of CountVectorizer(). Is this possible without fitting CountVectorizer() separately, outside the pipeline?
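For reference, the grid search itself is set up roughly like this (the parameter values, and the X_train/y_train names, are just illustrative placeholders):
>>> from sklearn.model_selection import GridSearchCV
>>> parameters = {'vect__ngram_range': [(1, 1), (1, 2)],
...               'tfidf__use_idf': (True, False),
...               'clf__alpha': (1.0, 0.1)}
>>> gs_clf = GridSearchCV(text_clf, parameters, cv=5)
>>> gs_clf = gs_clf.fit(X_train, y_train)  # X_train / y_train are placeholder names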
Using the get_params() method, you can access the various parts of the pipeline and their respective internal parameters. Here's an example of accessing 'vect':
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline
text_clf = Pipeline([('vect', CountVectorizer()),
                     ('tfidf', TfidfTransformer()),
                     ('clf', MultinomialNB())])
print(text_clf.get_params()['vect'])
yields (for me)
CountVectorizer(analyzer=u'word', binary=False, decode_error=u'strict',
dtype=<type 'numpy.int64'>, encoding=u'utf-8', input=u'content',
lowercase=True, max_df=1.0, max_features=None, min_df=1,
ngram_range=(1, 1), preprocessor=None, stop_words=None,
strip_accents=None, token_pattern=u'(?u)\\b\\w\\w+\\b',
tokenizer=None, vocabulary=None)
I haven't fitted the pipeline to any data in this example, so calling get_feature_names() at this point will raise an error.
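As a rough sketch (X_train and y_train below are placeholder names for whatever training data you have), fitting the pipeline first makes the feature names available from the vectorizer step:
text_clf = text_clf.fit(X_train, y_train)    # fit the whole pipeline
vect = text_clf.get_params()['vect']         # the now-fitted CountVectorizer
feature_names = vect.get_feature_names()     # get_feature_names_out() in newer scikit-learn
print(feature_names[:10])                    # first few column names of the training matrix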
Just for reference:
The estimators of a pipeline are stored as a list in the steps attribute:
>>> clf.steps[0]
('reduce_dim', PCA(copy=True, n_components=None, whiten=False))
and as a dict in named_steps:
>>> clf.named_steps['reduce_dim']
PCA(copy=True, n_components=None, whiten=False)
from http://scikit-learn.org/stable/modules/pipeline.html
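And since the question uses GridSearchCV: after the search has been fitted, the refitted pipeline is exposed as best_estimator_, so (as a sketch, assuming gs_clf is the fitted search object) the same access pattern works there too:
best_pipeline = gs_clf.best_estimator_       # the pipeline refit on the full training set
vect = best_pipeline.named_steps['vect']     # fitted CountVectorizer from the best pipeline
feature_names = vect.get_feature_names()     # or get_feature_names_out() on newer versions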