
How to use word2vec2tensor in gensim?

I am following this gensim tutorial to convert my word2vec model to the TensorBoard tensor format: https://radimrehurek.com/gensim/scripts/word2vec2tensor.html

More specifically, I ran the following command:

python -m gensim.scripts.word2vec2tensor -i C:\Users\Emi\Desktop\word2vec\model_name -o C:\Users\Emi\Desktop\word2vec

However, this command fails with the following error:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 0: invalid start byte

When I instead save my model with model.wv.save_word2vec_format(model_name) (as suggested in https://github.com/RaRe-Technologies/gensim/issues/1847) and then run the same command, I get a different error:

ValueError: invalid vector on line 1 (is this really the text format?)

Just wondering if I have made any mistakes in the syntax of the commands. Please let me know how to resolve this issue.

I am happy to provide more details if needed.

asked Oct 16 '22 by EmJ


1 Answer

I was able to solve the issue by using the following code (TensorFlow 1.x, with the contrib TensorBoard projector API):

import os
import gensim
import numpy as np
import tensorflow as tf  # TensorFlow 1.x
from tensorflow.contrib.tensorboard.plugins import projector

# load the saved model (here a full Word2Vec model; for a bare
# KeyedVectors, use model.vector_size instead of model.layer1_size)
model = gensim.models.keyedvectors.KeyedVectors.load(file_name)

max_size = len(model.wv.vocab) - 1
w2v = np.zeros((max_size, model.layer1_size))

if not os.path.exists('projections'):
    os.makedirs('projections')

with open("projections/metadata.tsv", 'w+') as file_metadata:
    for i, word in enumerate(model.wv.index2word[:max_size]):
        # store the embedding of the word
        w2v[i] = model.wv[word]
        # write the word itself to the metadata file, one per line
        file_metadata.write(word + '\n')

# build a TF graph holding the embedding matrix and save a checkpoint
sess = tf.InteractiveSession()
with tf.device("/cpu:0"):
    embedding = tf.Variable(w2v, trainable=False, name='embedding')
tf.global_variables_initializer().run()

saver = tf.train.Saver()
writer = tf.summary.FileWriter('projections', sess.graph)

# point the TensorBoard projector at the tensor and its metadata
config = projector.ProjectorConfig()
embed = config.embeddings.add()
embed.tensor_name = 'embedding'
embed.metadata_path = 'metadata.tsv'
projector.visualize_embeddings(writer, config)
saver.save(sess, 'projections/model.ckpt', global_step=max_size)
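
One pitfall worth checking: the TensorBoard projector matches tensor rows to metadata lines by position, so the number of rows in w2v must equal the number of lines in metadata.tsv (note that len(model.wv.vocab) - 1 above drops the last vocabulary word). A standalone sketch of that invariant with a toy vocabulary (names are illustrative, not taken from the model above):

```python
import numpy as np

vocab = ["king", "queen", "apple"]   # toy vocabulary, for illustration only
dim = 4
w2v = np.zeros((len(vocab), dim))    # one row per word, no off-by-one

with open("metadata.tsv", "w") as f:
    for i, word in enumerate(vocab):
        w2v[i] = np.ones(dim) * i    # stand-in for model.wv[word]
        f.write(word + "\n")

with open("metadata.tsv") as f:
    n_lines = sum(1 for _ in f)

# rows and labels must line up, or the projector shows wrong labels
assert w2v.shape[0] == n_lines
```

Once the checkpoint and metadata are written, tensorboard --logdir projections should bring up the embedding under the Projector tab (assuming TensorBoard 1.x is installed).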
answered Oct 21 '22 by EmJ