
Retrieve all rows from a table in Doctrine

I have a table with 100,000+ rows, and I want to select all of them in Doctrine and perform some action on each row. In Symfony2 with Doctrine I tried this query:

    $query = $this->getDefaultEntityManager()
        ->getRepository('AppBundle:Contractor')
        ->createQueryBuilder('c')
        ->getQuery()->iterate();

    foreach ($query as $contractor) {
        // doing something
    }

but then I get a memory leak, because I think it loads all the data into memory.

I have more experience with ADOdb; in that library, when I do this:

$result = $ADOdbObject->Execute('SELECT * FROM contractors');
while ($arrRow = $result->fetchRow()) {
    // do some action
}

I do not get any memory leak.

So how can I select all data from the table without getting a memory leak with Doctrine in Symfony2?

EDIT:

When I remove the foreach and just call iterate(), I still get a memory leak:

$query = $this->getDefaultEntityManager()
    ->getRepository('AppBundle:Contractor')
    ->createQueryBuilder('c')
    ->getQuery()->iterate();
asked Oct 21 '15 by Donatas Veikutis

1 Answer

The normal approach is to use iterate().

$em = $this->getDefaultEntityManager();
$q = $em->createQuery('SELECT c FROM AppBundle:Contractor c');
$iterableResult = $q->iterate();
foreach ($iterableResult as $row) {
    // $row[0] holds the hydrated entity; do something with it
    $em->detach($row[0]); // detach it so it can be garbage collected
}
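On newer Doctrine versions, iterate() is deprecated in favour of toIterable(), which yields the entities directly instead of wrapping each one in an array. A minimal sketch of the same loop, assuming Doctrine ORM 2.8+:

$em = $this->getDefaultEntityManager();
$query = $em->createQuery('SELECT c FROM AppBundle:Contractor c');

foreach ($query->toIterable() as $contractor) {
    // process one hydrated Contractor entity at a time
    $em->detach($contractor); // release it before the next iteration
}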

However, as the Doctrine documentation says, this can still run into problems:

Results may be fully buffered by the database client/connection allocating additional memory not visible to the PHP process. For large sets this may easily kill the process for no apparent reason.
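One thing you can try in that case is to disable client-side buffering on the underlying connection, so rows are streamed from the server. A minimal sketch, assuming the pdo_mysql driver on DBAL 2.x, where getWrappedConnection() hands back the underlying \PDO instance:

// Assumption: pdo_mysql driver on DBAL 2.x, so the wrapped connection is a \PDO object
$pdo = $this->getDefaultEntityManager()
    ->getConnection()
    ->getWrappedConnection();

// stream rows from the server instead of buffering the whole result set in the client
$pdo->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

Keep in mind that an unbuffered connection cannot run another query until the current result set has been fully consumed.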

The easiest approach to this would be to simply create smaller queries with offsets and limits.

//get the count of the whole query first
$em = $this->getDefaultEntityManager();
$qb = $em->createQueryBuilder();
$qb->select('COUNT(c)')->from('AppBundle:Contractor', 'c');
$count = $qb->getQuery()->getSingleScalarResult();

//let's say we go in steps of 1000 to keep memory usage low
$limit = 1000;
$offset = 0;

//loop every 1000 > create a query > loop the result > repeat
while ($offset < $count) {
    $qb = $em->createQueryBuilder();
    $qb->select('c')
        ->from('AppBundle:Contractor', 'c')
        ->orderBy('c.id') // a stable order keeps the pages consistent
        ->setMaxResults($limit)
        ->setFirstResult($offset);
    $result = $qb->getQuery()->getResult();
    foreach ($result as $contractor) {
        // do something
    }
    $em->clear(); // detach the processed batch so its memory can be reclaimed
    $offset += $limit;
}

With heavy datasets like this, the script will most likely exceed the maximum execution time, which is 30 seconds by default, so make sure to raise max_execution_time in your php.ini (or call set_time_limit() in the script). If you just want to update all rows following a known pattern, you should consider writing one big UPDATE query instead of looping over and editing the result in PHP.
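For illustration, a minimal sketch of such a bulk update via DQL; the status field and the 'processed' value are hypothetical placeholders for whatever your actual pattern is:

// Hypothetical example: update every contractor in one statement instead of looping in PHP
$this->getDefaultEntityManager()
    ->createQuery('UPDATE AppBundle:Contractor c SET c.status = :status')
    ->setParameter('status', 'processed')
    ->execute();

Note that DQL UPDATE statements go straight to the database and bypass the unit of work, so entities already loaded in memory will not reflect the change until they are refreshed.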

answered by oshell