The size in memory of a NumPy array is easy to calculate: it's simply the number of elements times the data size, plus a small constant overhead. For example, if your cube.dtype is int64, and it has 1,000,000 elements, it will require 1000000 * 64 / 8 = 8,000,000 bytes (8 MB).
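As a quick sketch, you can compute the same figure from the array's own attributes (cube here is a hypothetical stand-in for your array):

import numpy as np

cube = np.zeros(1_000_000, dtype=np.int64)  # stand-in example array
# number of elements times bytes per element:
print(cube.size * cube.itemsize)  # 8000000
print(cube.nbytes)                # 8000000, the same product precomputed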
With a NumPy array we need roughly 8 bytes per float. A plain Python list, by contrast, requires roughly 32 bytes per float: an 8-byte pointer per slot plus a roughly 24-byte float object.
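A rough way to see where those ~32 bytes come from (a sketch; exact numbers vary by CPython build, but on a typical 64-bit build each list slot is an 8-byte pointer and each float object is about 24 bytes):

import sys
import numpy as np

xs = [float(i) for i in range(1_000_000)]
arr = np.array(xs)

pointer_bytes = sys.getsizeof(xs)           # the list's pointer array (+ header)
object_bytes = sum(map(sys.getsizeof, xs))  # the float objects themselves
print((pointer_bytes + object_bytes) / len(xs))  # roughly 32 bytes per float
print(arr.nbytes / arr.size)                     # 8.0 bytes per float64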
1. NumPy uses much less memory to store data. A NumPy array takes significantly less memory than a Python list. NumPy also provides a mechanism for specifying the data type of the contents, which allows further optimisation.
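As a minimal illustration of the dtype point, choosing a narrower dtype shrinks the footprint proportionally:

import numpy as np

n = 1_000_000
print(np.zeros(n, dtype=np.float64).nbytes)  # 8000000
print(np.zeros(n, dtype=np.float32).nbytes)  # 4000000
print(np.zeros(n, dtype=np.int8).nbytes)     # 1000000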
There are two main reasons to use a NumPy array instead of a list in Python. The first is lower memory usage: a NumPy array consumes less memory than an equivalent list.
You can use array.nbytes for NumPy arrays, for example:
>>> import numpy as np
>>> from sys import getsizeof
>>> a = [0] * 1024
>>> b = np.array(a)
>>> getsizeof(a)
8264
>>> b.nbytes
8192
Note that getsizeof(a) counts only the list object and its pointer slots, not the int objects those pointers reference, so for lists of distinct objects the list's real footprint is considerably larger than this.
The nbytes attribute will give you the size in bytes of all the elements of a numpy.array:

size_in_bytes = my_numpy_array.nbytes

Note that this does not measure the non-element attributes of the array object, so the actual size in bytes can be a few bytes larger than this.
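If you do want those few extra bytes included, sys.getsizeof reports the whole array object for arrays that own their data (a small sketch; the exact overhead varies across NumPy versions):

import sys
import numpy as np

arr = np.arange(1024, dtype=np.int64)
print(arr.nbytes)          # 8192: the elements only
print(sys.getsizeof(arr))  # slightly larger: elements plus the object's own overhead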
In Python notebooks I often want to filter out 'dangling' numpy.ndarrays, in particular the ones stored in _1, _2, etc. that were never really meant to stay alive. I use this code to get a listing of all of them and their sizes. Not sure if locals() or globals() is better here.
import numpy
from humanize import naturalsize

# List every ndarray in the current namespace, smallest first, with a
# human-readable size next to its name.
for size, name in sorted(
        (value.nbytes, name)
        for name, value in locals().items()
        if isinstance(value, numpy.ndarray)):
    print("{:>30}: {:>8}".format(name, naturalsize(size)))
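If you wrap this in a helper function, locals() would only see the function's own variables, so a reusable version (a sketch; the function name is my own) would take the namespace as an argument and pass in globals() from the notebook:

import numpy
from humanize import naturalsize

def list_arrays(namespace):
    # `namespace` is any name -> value mapping, e.g. globals() in a notebook.
    for size, name in sorted(
            (value.nbytes, name)
            for name, value in namespace.items()
            if isinstance(value, numpy.ndarray)):
        print("{:>30}: {:>8}".format(name, naturalsize(size)))

list_arrays(globals())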