I have some pre-processing of data that I've done in Python that I want to pass to a C program. The amount of data isn't tremendously large: it's made of "packets" of the form (int, str), where the strings are of a uniform length (say ~80). I have maybe ~40 of these to pass. I might scale these numbers up eventually, but the string lengths can be assumed to be <= 200, and the number of packets can be assumed to be <= 4000.
I'm planning on using subprocess within Python to manage the execution of the C program, but I'm curious what a "good" way of getting the data to the C program would be. One solution would be temporary files, but this seems pretty hacky (and it seems like I might have "too little" data for this to make sense). I could alternatively pass all the data as command-line arguments to subprocess.run, but there it feels like I might have "too much" data for that to make sense (though I don't really know; I'm very much not a software engineer).
When dealing with this "intermediate" amount of data to pass to a subprocess, what is the recommended way of doing it?
You can serialize the data (using JSON or similar) and pass it on the subprocess's stdin, e.g. via subprocess.run(input=...).
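A minimal sketch of what that could look like on the Python side. The packet contents and the C binary path ("./process_packets") are placeholders, not something from your setup; it assumes the C program reads lines from stdin. One JSON object per line ("JSON Lines") keeps the C-side parsing simple, since each line is a small, self-contained record:

import json
import subprocess

# Hypothetical (int, str) packets standing in for your pre-processed data.
packets = [(i, "x" * 80) for i in range(40)]

# Serialize as one JSON object per line.
payload = "\n".join(json.dumps({"id": n, "data": s}) for n, s in packets)

result = subprocess.run(
    ["./process_packets"],   # the C program (placeholder path)
    input=payload,           # written to the child's stdin
    text=True,               # send/receive str instead of bytes
    capture_output=True,
    check=True,              # raise if the C program exits non-zero
)
print(result.stdout)

On the C side you'd then loop over fgets on stdin and parse each line. Given that your strings have a known bounded length, you could even skip JSON and use a plain line format like "<int> <string>", which is trivial to parse with sscanf; JSON just buys you flexibility if the packet structure grows later.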