I have got an output as a sparse matrix in Python, and I need to store this sparse matrix on my hard disk. How can I do it? If I should create a database, how should I do that? This is my code:
import nltk
import cPickle
import numpy
from scipy.sparse import lil_matrix
from nltk.corpus import wordnet as wn
from nltk.corpus import brown
f = open('spmatrix.pkl','wb')
def markov(L):
    count=0
    c=len(text1)
    for i in range(0,c-2):
        h=L.index(text1[i])
        k=L.index(text1[i+1])
        mat[h,k]=mat[h,k]+1  # matrix
    cPickle.dump(mat,f,-1)
text = [w for g in brown.categories() for w in brown.words(categories=g)]
text1=text[1:500]
arr=set(text1)
arr=list(arr)
mat=lil_matrix((len(arr),len(arr)))
markov(arr)
f.close()
I need to store this `mat` in a file and be able to access the values of the matrix by their coordinates. The output of the sparse matrix looks like this:
(173, 168)  2.0
(173, 169)  1.0
(173, 172)  1.0
(173, 237)  4.0
(174, 231)  1.0
(175, 141)  1.0
(176, 195)  1.0
but when I store it in a file and read it back, I get this:
(0, 68)  1.0
(0, 77)  1.0
(0, 95)  1.0
(0, 100)  1.0
(0, 103)  1.0
(0, 110)  1.0
(0, 112)  2.0
(0, 132)  1.0
(0, 133)  2.0
(0, 139)  1.0
(0, 146)  2.0
(0, 156)  1.0
(0, 157)  1.0
(0, 185)  1.0
Assuming you have a NumPy `matrix` or `ndarray`, which your question and tags imply, there is a `dump` method and a `numpy.load` function you can use:
your_matrix.dump('output.mat')
another_matrix = numpy.load('output.mat')
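Note, however, that `dump` is a method of NumPy arrays; a SciPy `lil_matrix` like the `mat` in the question does not have it. For a sparse matrix, one option is the Matrix Market format via `scipy.io`. A minimal sketch, assuming the hypothetical file name `spmatrix.mtx`:

from scipy.io import mmwrite, mmread

mmwrite('spmatrix.mtx', mat)             # write the sparse matrix in Matrix Market format
loaded = mmread('spmatrix.mtx').tolil()  # mmread returns a coo_matrix; convert to lil for indexing
print(loaded[173, 168])                  # look up a value by its (row, column) coordinates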
Note: This answer is in response to the revised question, which now provides code.

You should not call `cPickle.dump()` inside your function; that risks serializing the matrix before it is fully built. Build the sparse matrix first, then dump its contents to the file once. Also note that the loop should run over `range(c - 1)` so that the final word pair is counted.
Try:
def markov(L):
    c = len(text1)
    for i in range(c - 1):  # visit every consecutive pair of words
        h = L.index(text1[i])
        k = L.index(text1[i + 1])
        mat[h, k] += 1  # count the transition text1[i] -> text1[i+1]
text = [w for g in brown.categories() for w in brown.words(categories=g)]
text1=text[1:500]
arr=set(text1)
arr=list(arr)
mat=lil_matrix((len(arr),len(arr)))
markov(arr)
f = open('spmatrix.pkl','wb')
cPickle.dump(mat,f,-1)
f.close()
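To read the matrix back and look up values by their coordinates, a minimal sketch along the same lines (assuming the same `spmatrix.pkl` written above):

f = open('spmatrix.pkl','rb')
mat2 = cPickle.load(f)  # load() reads back the single matrix dumped above
f.close()
print(mat2[173, 168])   # lil_matrix supports indexing by (row, column)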