I've seen a bunch of similar questions on SO/elsewhere but none of the answers quite satisfy my needs, so I don't think this is a dup.
Also, I totally know how to implement this myself, but I'm trying not to have to re-invent the wheel.
Does anyone know of any Python packages that can compute Levenshtein or other edit distances between two lists of words (I've found a few), but that also let you specify your own costs for insertion, deletion, substitution, and transposition?
Basically, I want the distance to count edits on the words of the sentences, not the number of characters by which the sentences differ.
I'm trying to replace a custom Python extension module written in C against Python 2's C API. I could rewrite it in pure Python or Cython, but I'd rather just add a dependency to the project. The only catch is that the existing code lets you specify your own costs for each operation, and I haven't found a package that allows this so far.
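For concreteness, here's a rough pure-Python sketch of the behaviour I'm after (the function name and default costs are just placeholders); this is more or less what the C module does:

def word_edit_distance(a, b, ins=1.0, dele=1.0, sub=1.0, trans=1.0):
    # Word-level Damerau-Levenshtein with user-supplied costs.
    # d[i][j] = cheapest way to turn a[:i] into b[:j].
    n, m = len(a), len(b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = d[i - 1][0] + dele
    for j in range(1, m + 1):
        d[0][j] = d[0][j - 1] + ins
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if a[i - 1] == b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + dele,      # delete a[i-1]
                          d[i][j - 1] + ins,       # insert b[j-1]
                          d[i - 1][j - 1] + cost)  # substitute or match
            # adjacent transposition (restricted Damerau-Levenshtein)
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + trans)
    return d[n][m]

So word_edit_distance('a b c'.split(), 'a c b'.split(), trans=0.5) should give 0.5: one half-priced transposition instead of two substitutions.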
NLTK has a function named edit_distance. It calculates the Levenshtein distance between two strings, but it works fine with lists of strings too:
import nltk

# edit_distance accepts any pair of sequences, so word lists work too
s1 = 'WAKA WAKA QB WTF BBBQ WAKA LOREM IPSUM WAKA'.split()
s2 = 'WAKA OMFG QB WTF WAKA WAKA LOREM IPSUM WAKA'.split()
print(s1)
print(s2)
print(nltk.edit_distance(s1, s2))
['WAKA', 'WAKA', 'QB', 'WTF', 'BBBQ', 'WAKA', 'LOREM', 'IPSUM', 'WAKA']
['WAKA', 'OMFG', 'QB', 'WTF', 'WAKA', 'WAKA', 'LOREM', 'IPSUM', 'WAKA']
2
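If you only need coarse control over the costs, recent NLTK versions also accept a couple of keyword arguments on the same function. It's not the fully general cost scheme you asked for, but it covers some cases (check your NLTK version; older releases may not accept these):

print(nltk.edit_distance(s1, s2, substitution_cost=2, transpositions=True))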
Here is one library that is said to be fast and computes various string distances, including Levenshtein:
https://pypi.org/project/python-Levenshtein/
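Note that python-Levenshtein works on strings, not on arbitrary sequences. One workaround (a sketch, not part of the library's API) is to map each distinct word to a unique character and run the string-level function on the encoded sentences:

import Levenshtein

def word_distance(words1, words2):
    # Encode each distinct word as one character from the Private Use
    # Area (so up to ~6400 distinct words) and compare the strings;
    # the character-level distance then equals the word-level distance.
    vocab = {}
    def encode(words):
        return ''.join(chr(0xE000 + vocab.setdefault(w, len(vocab)))
                       for w in words)
    return Levenshtein.distance(encode(words1), encode(words2))

This still doesn't give you custom costs, though.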
You could also try Hamming distance, which uses less memory and time than Levenshtein, but note it is only defined for sequences of equal length.
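A word-level version is trivial to write yourself:

def word_hamming(a, b):
    # Hamming distance: count positions where the two sequences differ.
    if len(a) != len(b):
        raise ValueError('sequences must be the same length')
    return sum(x != y for x, y in zip(a, b))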