 

How to write a big set of data to an xls file?

Tags:

python

excel

hdf5

I have a really big database which I want to write to an xlsx/xls file. I already tried xlwt, but it can only write 65536 rows (some of my tables have more than 72k rows). I also found openpyxl, but it works too slowly and uses a huge amount of memory for big spreadsheets. Are there any other ways to write Excel files?

Edit: Following kennym's advice I used the Optimized Reader and Writer. It consumes less memory now, but it is still time consuming. Exporting now takes more than an hour (for really big tables of up to 10^6 rows). Are there any other possibilities? Maybe it is possible to export a whole table from the HDF5 database file to Excel at once, instead of doing it row after row as my code does now?
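For reference, a minimal sketch of that bulk approach using pandas, assuming the table sits in an HDF5 file under a key named "table" and that the xlsxwriter engine is installed; the file names and the key are placeholders, not taken from the question:

    import pandas as pd

    # Pull the whole HDF5 table into a DataFrame in one call instead of
    # iterating over rows ("data.h5" and the key "table" are placeholders).
    df = pd.read_hdf("data.h5", key="table")

    # Write the DataFrame out as xlsx; the xlsxwriter engine is generally
    # faster and lighter on memory than the default for large frames.
    df.to_excel("export.xlsx", index=False, engine="xlsxwriter")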

Paweł Chomicki asked Feb 07 '13




2 Answers

Try XlsxWriter in Constant Memory mode.

  • It only writes Excel 2007+ xlsx/xlsm files
  • It works much faster than openpyxl
  • It provides a constant memory mode (see the sketch below): http://xlsxwriter.readthedocs.org/working_with_memory.html

For .xls files I fear there is no memory-optimized way. Did you find any?
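A rough sketch of constant memory mode (not part of the original answer; the sample data and file name are placeholders):

    import xlsxwriter

    # Stand-in data: any iterable of row sequences will do.
    rows = ((i, i * 2, "row %d" % i) for i in range(1000000))

    # constant_memory flushes each finished row to disk, so memory use
    # stays flat no matter how many rows are written.
    workbook = xlsxwriter.Workbook("export.xlsx", {"constant_memory": True})
    worksheet = workbook.add_worksheet()

    for row_index, row in enumerate(rows):
        # In this mode rows must be written in order, top to bottom.
        for col_index, value in enumerate(row):
            worksheet.write(row_index, col_index, value)

    workbook.close()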

Sahil kalra answered Oct 07 '22


Use the Optimized Reader and Writer of the openpyxl package. The optimized reader and writer run much faster and use far less memory than the standard openpyxl methods.
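A minimal sketch of that streaming writer (not part of the original answer): recent openpyxl releases enable it with write_only=True, while the version current when this was asked used optimized_write=True; the data and file name below are placeholders.

    from openpyxl import Workbook

    # A write-only workbook streams rows out as they are appended instead
    # of keeping the whole sheet in memory.
    wb = Workbook(write_only=True)
    ws = wb.create_sheet()

    for i in range(1000000):
        # Rows must be appended in order; random access is not available.
        ws.append([i, i * 2, "row %d" % i])

    wb.save("export.xlsx")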

ben_frankly answered Oct 06 '22