Novice here. I'm working my way through the NLTK book and another Python intro book. I came across most_common() earlier in the NLTK book, and although I couldn't get it working or find a solution at the time, I wrote a little function that did the trick for that particular exercise and kept going. Now I need it again, but I don't think I can work around it as easily (the exercise is about the most common word lengths in a particular text). I also know it will come up again in later examples, and I'd like to be able to follow along, since, as I said, I'm a novice.
In theory, I should be able to do this:
fdist = FreqDist(len(w) for w in text1)
fdist.most_common()
[(3, 50223), (1, 47933), (4, 42345), (2, 38513) ...
However, Python tells me this:
AttributeError: 'FreqDist' object has no attribute 'most_common'
I found that most_common() is a method of Counter objects (http://docs.python.org/2/library/collections.html and http://docs.python.org/dev/library/collections#collections.Counter). I get the idea that perhaps I should be importing something (a module?), but everything I have tried importing either fails ("not defined" or "non-existent" messages) or does not contain it. I tried
import collections
No error there, but most_common() is not listed when I enter dir(collections) or dir(builtins).
I have both Python 2.7 and 3.0 installed (Windows most of the time, occasionally working in my virtual machine with Ubuntu). I will keep searching, but I will greatly appreciate your input. This feels kind of basic, but I'm learning and can't figure it out myself, at least for now. Again, thanks a lot.
nltk.probability.FreqDist is not collections.Counter. Use the items() method to get a list of items in sorted order (most frequent first):
>>> from nltk.probability import FreqDist
>>> dist = FreqDist([1, 2, 1, 2, 1])
>>> dist.items()
[(1, 3), (2, 2)]
Or just use collections.Counter
:
>>> from collections import Counter
>>> c = Counter([1, 2, 1, 2, 1])
>>> c.most_common()
[(1, 3), (2, 2)]
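Applying that to the original exercise (most common word lengths), the same one-liner works with Counter directly. A minimal sketch, where the short token list standing in for `text1` is just an illustrative placeholder (in the NLTK book, `text1` is a full corpus text):

```python
from collections import Counter

# Placeholder tokens; in the NLTK book, text1 is a corpus text (Moby Dick)
text1 = ["call", "me", "ishmael", "some", "years", "ago"]

# Count how often each word length occurs
length_dist = Counter(len(w) for w in text1)

# most_common() returns (length, count) pairs, most frequent first
print(length_dist.most_common())
```

Counter accepts any iterable, so the generator expression `len(w) for w in text1` feeds it the word lengths one at a time without building an intermediate list.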