 

Mysql client ran out of memory

Tags:

mysql

When I try to join 3 tables, each having about 50K records, with a MySQL select query:

select t1.c1,t2.c2 from table1 t1,table2 t2,table3 t3 
where t3.column3='<value>' and    t1.column1=t2.column1 
      and t2.column2=t3.column2 
      and t2.column2='<value1>' or t2.column2='<value2>' 

This is the kind of query I am trying to run.

I get the error "mysql client ran out of memory".

Any help on how to overcome this will be highly appreciated.

Thanks

asked Jul 26 '10 by Sharpeye500


3 Answers

Which client are you using? For the mysql command-line client, you can try running it with the --quick option, which makes it fetch and print rows one at a time instead of buffering the entire result set in memory.
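For example (the host, user, and database names here are placeholders, not from the original post):

```shell
# --quick streams rows one at a time rather than buffering the
# whole result set client-side, which avoids the out-of-memory error
# for large results (at the cost of holding locks on the server longer).
mysql --quick -h localhost -u myuser -p mydatabase
```

You can also set quick under the [mysql] group of your .my.cnf to make it the default for the command-line client.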

answered Oct 19 '22 by a1ex07


Here's what I wrote to get around the memory limit. You'll have to adjust the limits to match your environment. I break the results into batches of 500 records, process each batch, and then loop on to the next $records_per_batch.

<?php
// variables
$counter = 0;
$records_per_batch = 500;
$total_records = 0;
$app_root = "/var/run/consumers";

// mark the start time
system("/bin/touch $app_root/clean.last_start");
$link = mysqli_connect("localhost", "username", "password", "database") or die(mysqli_connect_error());

// get the total amount of records to process
$sql = "SELECT COUNT(*) from table";
$result = mysqli_query($link, $sql) or die("couldn't execute sql: $sql: " . mysqli_error($link));

if (mysqli_num_rows($result) > 0) {
    $row = mysqli_fetch_array($result);
    $total_records = $row[0];
}

// iterate through the records at $records_per_batch at a time
$sql_template = "SELECT * FROM table order by table_id limit %s, %s";

while ($counter < $total_records) {
    $sql = sprintf($sql_template,$counter,$records_per_batch);
    $result = mysqli_query($link, $sql) or die("couldn't execute sql: $sql: " . mysqli_error($link));

    if ($result) {
        if (mysqli_num_rows($result) > 0) {
            while ($row = mysqli_fetch_array($result)) {
                // do your work here.                
            }
        }
    } else {
        print "hmm, no result\n";
        exit;
    }
    $counter += $records_per_batch;
}

mysqli_close($link);
system("/bin/touch $app_root/clean.last_end");
?>
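One caveat worth noting: LIMIT offset, count paging makes the server scan and discard offset rows on every iteration, so each batch gets slower as the offset grows. A keyset variant (a sketch, assuming table_id is the unique ordering key and you remember the last id from the previous batch) avoids that:

```sql
-- Keyset pagination sketch: instead of LIMIT offset, count,
-- seek past the last table_id processed in the previous batch.
SELECT * FROM table
WHERE table_id > ?   -- bind the last table_id seen so far (start at 0)
ORDER BY table_id
LIMIT 500;
```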
answered Oct 19 '22 by jbrahy


I know this post is old, but I think it's worth mentioning that the or is probably causing an issue as well: AND binds more tightly than OR, so without parentheses the t2.column2='<value2>' condition stands on its own, unrestricted by the join conditions, which can explode the result set.

Here it is with some joins, and specifying the or for the t2 value only, which is what I think you want. This should limit your result set.

select t1.c1,t2.c2 
from table1 t1
inner join table2 t2 on t2.column1=t1.column1
inner join table3 t3 on t2.column2=t3.column2 
where t3.column3='<value>' 
and t2.column2 IN('<value1>','<value2>');

50K records is not so large that it should cause the type of problem described. You can also try optimizing your tables by adding indexes on the columns you join or filter on, such as t2.column1, t2.column2, etc.
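As a sketch, the indexing could look like this (using the placeholder table and column names above; the index names are made up, so adjust them to your real schema):

```sql
-- Hypothetical index names; pick ones that fit your naming convention.
CREATE INDEX idx_t2_column1 ON table2 (column1);
CREATE INDEX idx_t2_column2 ON table2 (column2);
CREATE INDEX idx_t3_column2 ON table3 (column2, column3);
```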

answered Oct 19 '22 by will