Pickling Python objects to Google Cloud Storage

I've been pickling objects to the filesystem and reading them back when I need to work with them. Currently I have this code for that purpose:

import os
import pickle

# Methods on the class whose instances are being persisted
def pickle(self, directory, filename):
    # Create the target directory if it doesn't exist yet
    if not os.path.exists(directory):
        os.makedirs(directory)
    with open(directory + '/' + filename, 'wb') as handle:
        pickle.dump(self, handle)

@staticmethod
def load(filename):
    with open(filename, 'rb') as handle:
        element = pickle.load(handle)
    return element

Now I'm moving my application (Django) to Google App Engine and discovered that App Engine does not allow me to write to the filesystem. Google Cloud Storage seems to be my only choice, but I can't figure out how to pickle my objects as Cloud Storage objects and read them back to recreate the original Python objects.

Asked by Jo Kachikaran on Jan 05 '23

1 Answer

For Python 3 users, you can use the gcsfs library (written by the creator of Dask) to solve your issue.

An example of reading:

import gcsfs

fs = gcsfs.GCSFileSystem(project='my-google-project')
fs.ls('my-bucket')
# ['my-file.txt']
with fs.open('my-bucket/my-file.txt', 'rb') as f:
    print(f.read())

Writing a pickle is basically identical; fs.open simply replaces the built-in open:

with fs.open(directory + '/' + filename, 'wb') as handle:
    pickle.dump(self, handle)
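
Adapted to the question's original method, the write path might look like the sketch below. This is only a sketch: the method is renamed save so it no longer shares a name with the pickle module, and the project name is a placeholder you would replace with your own.

import pickle

import gcsfs

def save(self, bucket, filename):
    # Buckets have a flat namespace, so unlike the filesystem version
    # there is no directory that needs to be created first
    fs = gcsfs.GCSFileSystem(project='my-google-project')  # placeholder project
    with fs.open(bucket + '/' + filename, 'wb') as handle:
        pickle.dump(self, handle)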

To read, it is similar: replace 'wb' with 'rb' and dump with load:

with fs.open(directory + '/' + filename, 'rb') as handle:
    element = pickle.load(handle)
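
And the matching load for the sketch above, again with the same placeholder project name:

import pickle

import gcsfs

@staticmethod
def load(bucket, filename):
    fs = gcsfs.GCSFileSystem(project='my-google-project')  # placeholder project
    with fs.open(bucket + '/' + filename, 'rb') as handle:
        # Unpickling reconstructs the original Python object from the bucket
        return pickle.load(handle)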
Answered by LaSul on Jan 16 '23