
Iterating over doctrine collections takes too much memory

I have a Symfony2 command that skims over my big database and exports the data into an XML file.

This operation takes too much memory: I can watch the PHP process grow to 50 MB, then 100 MB, and after 5 minutes it reaches 700 MB, peaking around ~800 MB before it finishes, which is obviously huge.

How do I optimize the amount of memory used by Doctrine?

Here is what my code looks like:

    // Gets 4000 entities
    $entities1 = $this->doctrine->getRepository('MyBundle:Entity1')->findAll();

    foreach ($entities1 as $entity1) {
        // 200 entities under every entity1
        foreach ($entity1->getCollection1() as $c) {
            // write into an xml
        }
    }

Is there a way to optimize this, or to do it better?

smarber asked Feb 16 '15


1 Answer

I would suggest using Doctrine's "batch processing" (http://doctrine-orm.readthedocs.org/en/latest/reference/batch-processing.html).

It lets you process large amounts of data while limiting the PHP memory used. In your case it would look something like this:

    $em = $this->doctrine->getManager();

    // Gets 4000 entities, hydrated one at a time instead of all at once
    $queryEntities1 = $em->createQuery('SELECT e FROM MyBundle:Entity1 e');
    $iterableEntities1 = $queryEntities1->iterate();

    foreach ($iterableEntities1 as $row) {
        $entity1 = $row[0]; // iterate() yields each result wrapped in an array
        // 200 entities under every entity1
        foreach ($entity1->getCollection1() as $c) {
            // write into an xml
        }
        // Detach the entity so Doctrine's identity map stops holding it in memory
        $em->detach($entity1);
    }

I haven't tested this, and you may need a separate query for your collection. Either way, the memory is cleaned up after every entity. I'm not sure that alone will be enough in your case, but you can also make the collection's foreach use its own iterable query, as sketched below.
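Here is a minimal sketch of what that separate iterable query could look like. The Entity2 class and the entity1 association field are assumptions; adjust them to your actual mapping:

    // Hypothetical: assumes the collection holds MyBundle:Entity2 rows that
    // reference their parent through an "entity1" association (adjust to your mapping)
    $childQuery = $em->createQuery(
        'SELECT c FROM MyBundle:Entity2 c WHERE c.entity1 = :parent'
    )->setParameter('parent', $entity1);

    foreach ($childQuery->iterate() as $childRow) {
        $c = $childRow[0];
        // write into an xml
        // Detach each child right away so it doesn't pile up in the identity map
        $em->detach($c);
    }

Per-entity detach() is used here rather than the periodic $em->clear() the batch-processing chapter shows, because clear() with no argument detaches everything, including the parent $entity1 the outer loop is still iterating over.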

Snroki answered Sep 29 '22