I have an Excel file with 7 GB of data and I am not able to open it directly in MS Excel. My intention is to split that Excel file into smaller files. I have tried online tools as well as offline tools. Any suggestions?
Thank you.
This is precisely the kind of job for sxl. It can iterate over a large Excel file without loading it all into memory.
From the project's readme:
Once installed, you can iterate through the entire file without using much memory by doing the following:
from sxl import Workbook

wb = Workbook("filepath")        # path to the large .xlsx file
ws = wb.sheets['sheet name']     # or, for example, wb.sheets[1]
for row in ws.rows:              # rows are streamed, not all loaded at once
    print(row)
The example simply prints each row, but you would replace that with whatever processing you need. If you want to store the data for later, you have several options: write it out as several smaller Excel workbooks, load it into a database, write it to CSV or another text format, etc.
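For instance, here is a minimal sketch of the CSV route using only the standard library. The helper name `split_rows_to_csv` and the chunk size are my own choices, not part of sxl; in practice you would pass it the `ws.rows` iterator from the snippet above instead of an in-memory list.

```python
import csv
import itertools

def split_rows_to_csv(rows, rows_per_file, prefix="chunk"):
    """Write an iterable of rows into numbered CSV files holding at most
    rows_per_file rows each; returns the list of file names written.

    Works on any iterator of row lists, so it never holds more than one
    chunk in memory -- suitable for streaming a huge sheet.
    """
    names = []
    rows = iter(rows)
    for i in itertools.count(1):
        # Pull the next chunk lazily from the row iterator.
        batch = list(itertools.islice(rows, rows_per_file))
        if not batch:
            break
        name = f"{prefix}_{i}.csv"
        with open(name, "w", newline="") as f:
            csv.writer(f).writerows(batch)
        names.append(name)
    return names
```

With sxl, the call would look something like `split_rows_to_csv(ws.rows, 100_000)`, producing files small enough for Excel to open.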