 

Using redis as intermediary before MySQL insert [closed]

We're writing a chat application, partially in node.js. We're expecting thousands of inserts (messages) per second, so our approach has been to write to Redis (keeping a list per room, read with LRANGE), and also maintain a list in Redis called not_saved_messages that another process loops through and writes to MySQL. Is this an anti-pattern? Should we just write to MySQL and hope it holds up?

asked Oct 16 '25 by StackOverflowed

2 Answers

I don't think it is necessarily an anti-pattern if it is done well. The devil, of course, is in the details.

What you are trying to do is use MySQL for logging and Redis for live information, right? It seems to me that could work. The question is how to make the pipeline function as a whole.
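The asker's dual-write idea can be sketched roughly like this, using plain JavaScript structures as stand-ins for the Redis ones (a real implementation would RPUSH to Redis keys via a client; the key layout and names here are illustrative assumptions, not from the original):

```javascript
// Sketch: each message goes to a per-room list (live reads for the
// chat UI) and to a pending queue that a separate process drains
// into MySQL. Plain arrays stand in for Redis lists.
const rooms = new Map();       // stand-in for per-room Redis lists
const notSavedMessages = [];   // stand-in for the "not_saved_messages" list

function publishMessage(roomId, message) {
  if (!rooms.has(roomId)) rooms.set(roomId, []);
  rooms.get(roomId).push(message);            // live data (read via LRANGE)
  notSavedMessages.push({ roomId, message }); // backlog for the MySQL writer
}

publishMessage("lobby", "hi");
publishMessage("lobby", "hello");
```

The key property is that the chat UI never waits on MySQL: it reads the per-room list, while the backlog is drained asynchronously.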

BTW, a number of big-data users do something very close to this. CERN, for example, uses several NoSQL solutions during data processing before the data goes into an RDBMS (Oracle, PostgreSQL, or MySQL). But it is definitely advanced design territory.

answered Oct 18 '25 by Chris Travers

Instead of inserting the data directly into MySQL, you can first store it in Redis. A second process can then pick it up from Redis and insert it into the database, one "data portion" at a time. The drawback is that you will need enough memory for Redis to hold the backlog.

// PHP producer code
// ...
// Assuming $r is an already-connected Redis instance.
$r->sadd("data_pipe", serialize($data));

Then,

// Open the database connection
// (open_data_connection() is a placeholder helper in this example)
open_data_connection();

$r = new Redis();
$r->connect("127.0.0.1", "6379");

while (true) {
    // Pop queued items until the set is empty
    while ($xdata = $r->spop("data_pipe")) {
        $data = unserialize($xdata);

        // store_data() is a placeholder for the actual MySQL insert
        store_data($data);
    }

    // No more data; wait before polling again
    sleep(30);
}
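One thing worth flagging in the snippet above: SADD/SPOP operate on a Redis set, which deduplicates members and pops them in arbitrary order. For chat messages, where duplicates are legitimate and order matters, a Redis list (RPUSH on the producer, LPOP in the consumer) is usually the better fit. A minimal sketch of the consumer's drain logic, with a plain array standing in for the Redis list:

```javascript
// Drain up to batchSize items from the head of the queue in FIFO
// order (what repeated LPOP would give you), so each batch can be
// written to MySQL as one multi-row INSERT. The array is a stand-in
// for a Redis list.
function drainBatch(queue, batchSize) {
  return queue.splice(0, batchSize);
}

const queue = ["m1", "m2", "m3", "m4", "m5"];
const batch = drainBatch(queue, 3); // ["m1", "m2", "m3"]; queue keeps ["m4", "m5"]
```

Batching the inserts this way also cuts per-row round-trip overhead on the MySQL side.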

If the database cannot keep up (i.e., cannot insert data as fast as new data comes in), you can always run the pick-up process two or more times in parallel, or do some kind of sharding by importing the data into two or three databases.

http://redis4you.com/code.php?id=016

answered Oct 18 '25 by Won Jun Bae


