I want to put a bunch of packed integers into a file, e.g.:
for i in int_list:
    fp.write(struct.pack('<I', i))
Now I'd like to read them out into int_list. I could do this, but it seems inefficient:
data = fp.read()
int_list = []
for i in xrange(0, len(data), 4):
    int_list.append(struct.unpack('<I', data[i:i+4])[0])
Is there a more efficient way to do this?
calcsize(format): returns the size in bytes of the struct described by the given format string.

struct.unpack(format, buffer): unpacks from buffer (presumably packed by pack(format, ...)) according to the format string format. The result is a tuple even if it contains exactly one item. The buffer's size in bytes must match the size required by the format, as reflected by calcsize().
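A quick illustration of those two calls (the format string and byte values here are chosen just for the example):

```python
import struct

# calcsize() reports how many bytes a format string describes.
size = struct.calcsize('<I')   # one little-endian unsigned 32-bit int

# unpack() always returns a tuple, even when the format holds one item.
value = struct.unpack('<I', b'\x02\x01\x00\x00')

print(size)   # 4
print(value)  # (258,)
```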
In Python 2, struct.pack() always returned a str. In Python 3 the function instead returns a bytes object. Python 2 did not have a distinct bytes type (bytes was simply an alias for str), so bytes and str were treated as the same type for the return value.
You can do it more efficiently in both directions:
>>> import struct
>>> int_list = [0, 1, 258, 32768]
>>> fmt = "<%dI" % len(int_list)
>>> data = struct.pack(fmt, *int_list)
>>> data
'\x00\x00\x00\x00\x01\x00\x00\x00\x02\x01\x00\x00\x00\x80\x00\x00'
>>> # f.write(data)
... # data = f.read()
...
>>> fmt = "<%dI" % (len(data) // 4)
>>> new_list = list(struct.unpack(fmt, data))
>>> new_list
[0, 1, 258, 32768]
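As a side note not in the original answer: on Python 3.4+, struct.iter_unpack does the reading half without computing the element count first; a minimal sketch:

```python
import struct

data = struct.pack('<4I', 0, 1, 258, 32768)

# iter_unpack() walks the buffer in calcsize('<I')-sized steps,
# yielding one tuple per record, so no element count is needed up front.
new_list = [t[0] for t in struct.iter_unpack('<I', data)]
print(new_list)  # [0, 1, 258, 32768]
```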
array.array should be fast for this. You can specify the element type it contains (there are several integer type codes, although IIUC only in machine endianness), then use its fromfile method to read directly from a file.
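A minimal sketch of that approach, assuming a little-endian file format and a 4-byte 'I' type code (check arr.itemsize on your platform); tobytes()/frombytes() stand in for the tofile()/fromfile() calls so the example is self-contained:

```python
import array
import sys

int_list = [0, 1, 258, 32768]

# 'I' maps to the platform's unsigned int; values are stored raw,
# without per-element Python object overhead.
arr = array.array('I', int_list)

# array.array works in machine byte order, so a little-endian file
# needs a byteswap() on big-endian hosts.
if sys.byteorder == 'big':
    arr.byteswap()

data = arr.tobytes()          # what arr.tofile(fp) would write

back = array.array('I')
back.frombytes(data)          # or back.fromfile(fp, len(int_list))
if sys.byteorder == 'big':
    back.byteswap()
print(list(back))  # [0, 1, 258, 32768]
```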