How to pickle functions/classes defined in __main__ (python)

Tags: python, pickle

I would like to be able to pickle a function or class from within __main__, with the obvious problem (mentioned in other posts) that the pickled function/class is in the __main__ namespace and unpickling in another script/module will fail.

I have the following solution, which works. Is there a reason this should not be done?

The following is in myscript.py:

import myscript   # import this file under its module name, not as __main__
import pickle

if __name__ == "__main__":
    # run as a script: pickle the class via its importable module path
    print pickle.dumps(myscript.myclass())
else:
    # imported as a module: the class lives in the myscript namespace
    class myclass:
        pass

edit: The unpickling would be done in a script/module that has access to myscript.py and can do an import myscript. The aim is to use a solution like Parallel Python to call functions remotely, and to be able to write a short, standalone script containing the functions/classes that can be accessed remotely.
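For illustration, here is a minimal sketch of the receiving side, assuming the pickled bytes produced by running myscript.py were saved to a file named obj.pkl (the consumer.py and obj.pkl names are hypothetical, not part of the question):

# consumer.py -- hypothetical receiving script
import pickle
import myscript          # must be importable so pickle can resolve myscript.myclass

with open("obj.pkl", "rb") as f:   # obj.pkl is an assumed file holding the pickled bytes
    obj = pickle.load(f)

# The class was pickled as myscript.myclass rather than __main__.myclass,
# so the lookup succeeds even though this script never defines myclass itself.
print(obj.__class__)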

Asked by andrew on Aug 08 '12



2 Answers

Pickle looks up classes and functions by module and name, so anything pickled from __main__ must be resolvable as an attribute of __main__ again at unpickling time. From inside the module you're unpickling in, try this:

import myscript
import __main__
__main__.myclass = myscript.myclass
#unpickle anywhere after this
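As a hedged, self-contained illustration of why this works (not part of the original answer): pickle stores only the reference __main__.myclass, so whatever object that name points to at load time is what gets used.

# demo.py -- hypothetical sketch; the class names are illustrative only
import pickle
import __main__

class myclass:                    # defined in __main__, as in the question
    pass

data = pickle.dumps(myclass())    # pickled by reference as __main__.myclass

del __main__.myclass              # simulate a process that lacks the definition
try:
    pickle.loads(data)
except AttributeError as e:
    print("fails without the alias: %s" % e)

class replacement:                # stands in for myscript.myclass on the receiving side
    pass

__main__.myclass = replacement    # the aliasing trick from the answer above
obj = pickle.loads(data)
print(obj.__class__)              # resolves to the replacement class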
Answered by Andrew

You can get a better handle on global objects by importing __main__ and using the methods available in that module. This is what dill does in order to serialize almost anything in Python. Basically, when dill serializes an interactively defined function, it uses some name mangling on __main__, on both the serialization and deserialization sides, so that __main__ behaves like a valid importable module.

>>> import dill
>>> 
>>> def bar(x):
...   return foo(x) + x
... 
>>> def foo(x):
...   return x**2
... 
>>> bar(3)
12
>>> 
>>> _bar = dill.loads(dill.dumps(bar))
>>> _bar(3)
12

Actually, dill registers its types in the pickle registry, so if you have some black-box code that uses pickle and you can't really edit it, just importing dill can magically make it work without monkeypatching the third-party code.
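A hedged way to probe that claim on your own setup (whether stock pickle picks up dill's registrations depends on the dill and Python versions, since dill extends the pure-Python pickler's dispatch table, which the C-accelerated pickle.dumps may bypass):

# probe.py -- hypothetical check, not from the original answer
import dill                  # imported only for its side effect on the pickle registry
import pickle

square = lambda x: x * x     # a lambda: something stock pickle normally rejects

try:
    restored = pickle.loads(pickle.dumps(square))
    print("stock pickle handled the lambda: %d" % restored(3))   # 9 if it worked
except Exception as exc:
    print("stock pickle refused (%s); use dill.dumps/dill.loads directly" % exc)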

Or, if you want the whole interpreter session sent over as a "Python image", dill can do that too.

>>> # continuing from above
>>> dill.dump_session('foobar.pkl')
>>>
>>> ^D
dude@sakurai>$ python
Python 2.7.5 (default, Sep 30 2013, 20:15:49) 
[GCC 4.2.1 (Apple Inc. build 5566)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import dill
>>> dill.load_session('foobar.pkl')
>>> _bar(3)
12

You can easily send the image across ssh to another computer and pick up where you left off, as long as the pickle versions are compatible and the usual caveats about Python versions and installed packages apply.

I actually use dill to serialize objects and send them across parallel resources with Parallel Python, multiprocessing, and mpi4py. I roll these up conveniently into the pathos package (and pyina for MPI), which provides a uniform map interface for different parallel batch-processing backends.

>>> # continued from above
>>> from pathos.multiprocessing import ProcessingPool as Pool
>>> Pool(4).map(foo, range(10))
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
>>>
>>> from pyina.launchers import MpiPool
>>> MpiPool(4).map(foo, range(10))
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

There are also non-blocking and iterative maps, as well as non-parallel pipe connections. I also have a pathos module for pp; however, it is somewhat unstable for functions defined in __main__, and I'm working on improving that. If you like, fork the code on GitHub and help make the pp support better for functions defined in __main__. The reason pp doesn't pickle well is that it does its serialization tricks by using temporary file objects and reading the interpreter session's history, so it doesn't serialize objects the same way multiprocessing or mpi4py do. I have a dill submodule, dill.source, that seamlessly does the same type of source-based pickling that pp uses, but it's rather new.
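For illustration, a hedged sketch of dill.source in action (assuming the interactive history is available via readline; exact output formatting may vary across versions):

>>> # continuing the session above: recover the source text of foo
>>> import dill.source
>>> print(dill.source.getsource(foo))
def foo(x):
  return x**2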

Answered by Mike McKerns