
Understanding Mongo Timeouts on Save through Lock Percentage

Tags:

php

mongodb

I'm trying to understand how to better identify the cause of what I'm currently seeing.

Presently I update a collection via a cron job, downloading information from a 3rd-party vendor every 15 minutes (without issues). There are times when I need to do a 2-year refresh, and that is when I see this issue.

The refresh brings in about 300-600k results, each of which I write with mongo->collection->save($item);. I supply the _id for every result, so (I thought) each call would be a quick upsert.

The document sizes aren't changing much and are rather small to begin with (~12 KB).

I batch the downloads at about 200 per request to the 3rd-party server, format them, then insert them one at a time into Mongo using save() with safe writes enabled.
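If the per-document round trips turn out to be the bottleneck, a batched alternative might help. This is only a sketch against the legacy PHP Mongo driver (1.2.x); the connection string, database name (`mydb`), collection name (`items`), and the `formatItem()` helper are all placeholders, not anything from the question:

```php
<?php
// Sketch: insert one formatted batch in a single round trip with the
// legacy PHP Mongo driver (1.2.x). 'mydb'/'items' are placeholder names.
// Caveat: unlike save(), batchInsert() does NOT upsert -- a document
// whose _id already exists raises a duplicate-key error, so
// 'continueOnError' is set to skip those and keep inserting.
$mongo      = new Mongo('mongodb://localhost:27017');
$collection = $mongo->selectDB('mydb')->selectCollection('items');

$batch = array();  // the ~200 formatted documents from one vendor request
foreach ($downloadedItems as $item) {
    $batch[] = formatItem($item);  // hypothetical formatting helper
}

// One acknowledgement for the whole batch instead of one per document.
$collection->batchInsert($batch, array(
    'safe'            => true,
    'continueOnError' => true,
));
```

Because `batchInsert()` skips rather than replaces existing `_id`s, it only fits the initial load of a refresh; documents that must be overwritten still need `save()` or an update.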

Right now, while saves are happening, my lock percentage climbs to between 20-30%. I'm wondering how to track down why this is happening, as I believe it's the reason I end up hitting a timeout (which is set to 100 seconds).

  • Timeout Error: MongoCursorTimeoutException Object->cursor timed out (timeout: 100000, time left: 0:0, status: 0)

  • Mongo Driver: Mongo Native Driver 1.2.6 (from PHP.net)

I'm currently on Mongo 2.2.1 with SSD drives and 16gb of ram.

Here is an example of the mongostat output that I follow while inserts are happening:

  insert  query  update  delete  getmore  command  flushes  mapped  vsize  res    faults  locked db    idx miss %  qr|qw  ar|aw  netIn  netOut  conn  set        repl  time
       0      0     201       0      215      203        0    156g   313g  1.57g       7  mydb:36.3%            0    0|0    0|0   892k   918k    52  a-cluster  PRI   10:04:36

I have a primary with a secondary and an arbiter fronting them (per the documentation's suggestions), using PHP to do my inserts.

Any help would be greatly appreciated.

Thank you so much for your time

Update

I store all items in a "MongoDoc", as formatting of the individual elements is sometimes needed. After batching the items into it, I get the data out and insert like this:

$mongoData = $mongoSpec->getData();
try {
    // Save each formatted document individually; since _id is set,
    // each save() is an upsert.
    foreach ($mongoData as $insert) {
        $this->collection_instance->save($insert);
        $count++;
    }
} catch (Exception $e) {
    print_r($e->getTrace());
    exit;
}

I will say that I have removed safe writes and have seen a drastic reduction in timeouts occurring, so for now I'm chalking it up to that (unless there is something wrong with the insert).
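For context, the difference between the two modes comes down to a per-call option on the legacy driver. A minimal sketch of both, assuming the same `collection_instance` as in the loop above:

```php
<?php
// With 'safe' => false (the default in the 1.2.x driver), save() returns
// as soon as the write is handed to the socket: no server round trip per
// document, so no acknowledgement wait on the write path -- but write
// errors (duplicate keys, disk problems) go unnoticed.
$this->collection_instance->save($insert, array('safe' => false));

// With 'safe' => true, the driver issues getLastError after the write and
// blocks until the server acknowledges it (or the client timeout fires).
$this->collection_instance->save($insert, array('safe' => true));
```

So dropping safe writes trades error visibility for throughput, which is consistent with the timeouts disappearing.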

Thank you for your time and thoughts.

Asked by Petrogad, Mar 01 '13


1 Answer

Are you hitting PHP's max execution time limit? Which Mongo library are you using? I was using FuelPHP's MongoDB library, and it would take nearly 1 second for only ~50 inserts (because each write was a confirmed, fsync'd operation), so this doesn't surprise me. My solution was to fsync and confirm writes only at certain intervals, which gives much better performance with reasonable assurance that nothing went wrong.

More info:
http://docs.mongodb.org/manual/reference/command/fsync/
http://docs.mongodb.org/manual/core/write-operations/#write-concern
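The interval approach described above can be sketched as follows. This is an illustration, not the answerer's actual code: the `$CONFIRM_EVERY` value, the variable names, and the use of the `getlasterror` command with the legacy driver's `command()` call are assumptions layered on the question's loop:

```php
<?php
// Sketch: issue unacknowledged writes, then periodically confirm the
// preceding batch with getLastError instead of blocking on every save.
$CONFIRM_EVERY = 500;  // arbitrary interval -- tune to taste
$count = 0;

foreach ($mongoData as $insert) {
    // Fire-and-forget write: no per-document acknowledgement.
    $collection->save($insert, array('safe' => false));

    if (++$count % $CONFIRM_EVERY === 0) {
        // Block once per interval: getLastError waits for the writes on
        // this connection, and fsync => true also flushes to disk.
        $err = $db->command(array('getlasterror' => 1, 'fsync' => true));
        if (!empty($err['err'])) {
            throw new Exception(
                'write failed near item ' . $count . ': ' . $err['err']
            );
        }
    }
}
```

An error surfaced this way only tells you something failed somewhere in the last interval, so the interval size is a trade-off between throughput and how precisely you can locate a failure.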

Answered by landons