I am trying to get the hard drive size and free space using Python (Python 2.7 on macOS).
I am trying os.statvfs('/') with the following code. Is what I am doing correct? Which definition of the variable giga should I use?
import os

def get_machine_storage():
    result = os.statvfs('/')
    block_size = result.f_frsize
    total_blocks = result.f_blocks
    free_blocks = result.f_bfree
    # giga = 1024 * 1024 * 1024
    giga = 1000 * 1000 * 1000
    total_size = total_blocks * block_size / giga
    free_size = free_blocks * block_size / giga
    print('total_size = %s' % total_size)
    print('free_size = %s' % free_size)

get_machine_storage()
EDIT: statvfs
is deprecated in Python 3, do you know any alternative?
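On the giga question: 1000**3 gives SI gigabytes (GB), which is what macOS's Finder and df -H report, while 1024**3 gives binary gibibytes (GiB). Neither is wrong; they are just different units. A minimal sketch of the difference, using a hypothetical 500 GB drive:

```python
# Minimal sketch: the same byte count in SI gigabytes (GB) vs binary gibibytes (GiB).
n_bytes = 500 * 10**9          # hypothetical 500 GB drive

gb = n_bytes / float(10**9)    # SI definition: 1 GB = 1000**3 bytes
gib = n_bytes / float(2**30)   # binary definition: 1 GiB = 1024**3 bytes

print('%.1f GB' % gb)    # prints 500.0 GB
print('%.2f GiB' % gib)  # prints 465.66 GiB
```

So a "500 GB" drive as advertised shows up as roughly 465 GiB when divided by 1024**3, which is why the two definitions give noticeably different numbers.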
The disk_usage() function in Python returns disk usage statistics for a given path as a named tuple with attributes total, used, and free.
The shell equivalent is df -H; the -H switch produces human-readable output. df -H reports the space used, space available, percentage used, and mount point of every disk attached to your system.
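If you want that same report from inside Python, you can shell out to df with the standard library (a sketch assuming a POSIX system with df on the PATH):

```python
import subprocess

# Run `df -H /` and capture its human-readable report for the root filesystem.
# check_output raises CalledProcessError if df exits with a non-zero status.
output = subprocess.check_output(['df', '-H', '/'])
print(output.decode())
```

This only gives you text to parse, though; the psutil and shutil approaches below return numbers directly.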
Notice: As a few people mentioned in the comment section, this solution works on Python 3.3 and above. For Python 2.7 it is best to use the psutil library, which has a disk_usage function that reports the total, used, and free disk space:
import psutil

hdd = psutil.disk_usage('/')
print("Total: %d GiB" % (hdd.total / (2**30)))
print("Used: %d GiB" % (hdd.used / (2**30)))
print("Free: %d GiB" % (hdd.free / (2**30)))
For Python 3.3 and above, you can use the shutil module, which has a disk_usage function that returns a named tuple with the total, used, and free space on your drive.
You can call the function as below to get all the information about your disk's space:
import shutil

total, used, free = shutil.disk_usage("/")
print("Total: %d GiB" % (total // (2**30)))
print("Used: %d GiB" % (used // (2**30)))
print("Free: %d GiB" % (free // (2**30)))
Output:
Total: 931 GiB
Used: 29 GiB
Free: 902 GiB
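If installing psutil is not an option on Python 2.7, the os.statvfs approach from the question also works. A minimal sketch (note that f_bavail counts blocks available to unprivileged users, which matches what df reports as available, while f_bfree includes blocks reserved for root):

```python
import os

def get_disk_usage(path='/'):
    """Return (total, free) bytes for the filesystem containing path."""
    st = os.statvfs(path)
    block_size = st.f_frsize          # fundamental filesystem block size
    total = st.f_blocks * block_size  # total size of the filesystem
    free = st.f_bavail * block_size   # space available to non-root users
    return total, free

total, free = get_disk_usage('/')
print('Total: %.2f GiB' % (total / float(2**30)))
print('Free:  %.2f GiB' % (free / float(2**30)))
```

The float division keeps the result correct under Python 2.7, where / on two integers truncates.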