I have a Python instance running on a Linux server. I have created a global array using a global class. I want to pass an object of this class as a command-line argument to a Python function which I will run on a Windows VM. How do I pass the object as a command-line argument in Python? Or is there a better way to do it?
You can use json.dumps() and json.loads(), or pickle.dumps() and pickle.loads(), for this purpose:
>>> import json
>>> json.dumps(['Hi'])
'["Hi"]'
>>> json.loads(_)
['Hi']
>>> import pickle
>>> pickle.dumps(['Hi'])
b'\x80\x03]q\x00X\x02\x00\x00\x00Hiq\x01a.'
>>> pickle.loads(_)
['Hi']
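Applied to the question, a minimal round-trip sketch (the object contents here are illustrative): the sender passes the json.dumps() string as a single quoted command-line argument, and the receiving script reads it from sys.argv[1] and json.loads() it.

```python
import json

# Sender side (Linux server): serialize the object into one string,
# suitable for passing as a single quoted command-line argument.
obj = {'name': 'example', 'values': [1, 2, 3]}
arg = json.dumps(obj)

# Receiver side (Windows VM): the string would arrive as sys.argv[1];
# reusing `arg` here stands in for that to show the round trip.
received = json.loads(arg)
print(received)
```

Remember to quote the argument on the command line, since the JSON string contains characters the shell would otherwise split or interpret.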
Note that if you are trying to pass a custom class you will have to do some extra work: you'll need functions to convert to and from the JSON format, while pickle handles the conversion automatically but still needs access to the class on the receiving side.
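As a sketch of that extra work (the `Point` class and its converter functions are hypothetical): JSON needs explicit to/from-dict helpers, while pickle handles the instance automatically as long as the receiving process can import the class.

```python
import json
import pickle

class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

# JSON: supply converters to and from a plain dict.
def point_to_dict(p):
    return {'x': p.x, 'y': p.y}

def point_from_dict(d):
    return Point(d['x'], d['y'])

encoded = json.dumps(point_to_dict(Point(3, 4)))
decoded = point_from_dict(json.loads(encoded))

# pickle: no converters needed, but the receiving process
# must be able to import the Point class itself.
restored = pickle.loads(pickle.dumps(Point(3, 4)))
```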
However, I think you'd be best off running a task execution server in the VM. While the primary focus of these server options is to allow scalability, they're quite good at the remote aspects as well. This abstracts away all of the communication and serialization solutions that, as @J.F. Sebastian said, you really don't need to reinvent.
Celery is probably the most commonly used task execution server library. It takes some work to set up, but is simple to use once configured: mark your function with a Celery decorator to make it a task object, start the worker on the VM, import the module, and call the task's delay() method with the same arguments that you would pass to the function itself. Once everything is working, the Celery worker can be set up as a Windows service.
# app.py (adapted from examples in the Celery Getting Started tutorial)
from celery import Celery
app = Celery('tasks', broker='amqp://guest@localhost//')
@app.task
def my_function(a, b):
    return a * b
# main.py
import app
result = app.my_function.delay(4, 5)
print(result.get())
Sometimes, though, Celery is just too much hassle. If the function needs third-party libraries, you'll either have to import them inside the function body or have them installed on the Linux server as well, because Celery imports the task module on the client side too. And I've personally had trouble getting Celery set up in the first place.
A simpler alternative is TaskIt. (Full disclosure: I am the developer of TaskIt.) It uses a more traditional server-client connection style, so all that has to work is a standard TCP socket. By default it uses JSON to serialize objects, but pickle is also supported.
# server.py
from taskit.backend import BackEnd
def my_function(a, b):
    return a * b
backend = BackEnd(dict(my_function=my_function))
backend.main()
# client.py
from taskit.frontend import FrontEnd
backend_addr = '127.0.0.1'
frontend = FrontEnd([backend_addr])
print(frontend.work('my_function', 4, 5))