Using PHP to write a large amount of data to Excel without a memory limit error

I have an Oracle database that stores a large amount of biometric data such as HRV and ECG readings. I need to show this data for each user in an Excel sheet. But the data is very large: even for a single user we get more than 100,000 records, and there are currently ~100 users.

What I am doing is:

  1. I execute a cron job from the command line, which I developed in the Zend Framework.
  2. I make sure that this cron job does not overlap with a previous run.
  3. I fetch all the data from the Oracle database for each user, one by one, and store it in an array.
  4. Once I have the data for all users, I use the PHPExcel library to generate the Excel sheet (a minimal sketch follows below).

    Structure of the Excel sheet (one column per user, one reading per row):

        uid 1 | uid 2 | uid 3 | ... | uid n
        data  | data  | data  | ... | data
        data  | data  | data  | ... | data
        ...   | ...   | ...   | ... | ...
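
For reference, the cron job's loop described in the steps above comes down to something like this minimal sketch. The table and column names are hypothetical, plain PDO stands in for the Zend_Db layer, and $userIds is assumed to already hold the IDs of all users. The point is that every record for every user sits in $allData, and PHPExcel then creates a cell object for each value on top of that, which is where the 1.5 GB goes:

    <?php
    // Minimal sketch of the approach described above. Table and column
    // names are hypothetical, and plain PDO stands in for Zend_Db.
    $pdo = new PDO('oci:dbname=BIODB', 'username', 'password');

    $allData = array();
    foreach ($userIds as $uid) {
        $stmt = $pdo->prepare('SELECT reading FROM biometric_data WHERE uid = :uid');
        $stmt->execute(array('uid' => $uid));
        // 100,000+ rows per user accumulate here before anything is written.
        $allData[$uid] = $stmt->fetchAll(PDO::FETCH_COLUMN);
    }

    $excel = new PHPExcel();
    $sheet = $excel->getActiveSheet();
    $col = 0;
    foreach ($allData as $uid => $readings) {
        $sheet->setCellValueByColumnAndRow($col, 1, 'uid ' . $uid);
        foreach ($readings as $i => $value) {
            // PHPExcel also keeps an object in memory for every cell it sets.
            $sheet->setCellValueByColumnAndRow($col, $i + 2, $value);
        }
        $col++;
    }
    PHPExcel_IOFactory::createWriter($excel, 'Excel2007')->save('biometrics.xlsx');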

Problem:

PHP uses about 1.5 GB of RAM to hold the data in the array and pass it to the functions that interact with PHPExcel, but the library then runs for 3-4 hours before failing with a "Fatal: memory limit" error. My system has only 2 GB of RAM.

What steps should I take to optimize my code so it can handle data of this size and produce the same information in Excel format, or do I need to increase the RAM?

asked Dec 22 '12 by techgyani

1 Answer

Consider using PHP only to kick out some kind of Excel-readable delimited file (CSV) containing the result data. (The idea is that you wouldn't be writing to Excel via PHP, because of the limits you are experiencing.)
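
A minimal sketch of that idea, again with a hypothetical query and file path: fputcsv() writes each row to disk as soon as it is fetched, so memory use stays flat no matter how many records there are. Note that one record per row (uid, reading) is much easier to stream than the one-column-per-user layout in the question:

    <?php
    // Stream query results straight to a CSV file instead of building
    // the whole workbook in PHPExcel. Hypothetical query and file path.
    $pdo = new PDO('oci:dbname=BIODB', 'username', 'password');
    $stmt = $pdo->query('SELECT uid, reading FROM biometric_data ORDER BY uid');

    $fh = fopen('/tmp/biometrics.csv', 'w');
    fputcsv($fh, array('uid', 'reading'));          // header row
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        fputcsv($fh, $row);                         // one row at a time
    }
    fclose($fh);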

You could then pick up that PHP-generated file with something like PowerShell, and have PowerShell (or whatever scripting tool you prefer) open the CSV file straight into Excel and save it as an .xlsx.

Or you could have your command-line PHP make a system() call that executes the scripting step which opens the CSV straight into Excel and then saves it as an .xlsx.
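
As a hedged sketch of that last variant, where convert.ps1 is a hypothetical PowerShell script that would do the Excel open-and-save-as-xlsx step:

    <?php
    // Hand the finished CSV to a (hypothetical) PowerShell script that
    // opens it in Excel and re-saves it as .xlsx.
    $csv = 'biometrics.csv';
    system('powershell -ExecutionPolicy Bypass -File convert.ps1 '
        . escapeshellarg($csv), $exitCode);
    if ($exitCode !== 0) {
        // Surface the failure in the cron job's log.
        fwrite(STDERR, "CSV to XLSX conversion failed with code $exitCode\n");
    }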

answered Nov 03 '22 by DWright