
Amazon SQS to funnel database writes

Assume I am building Netflix and I want to log each view by the user ID and the movie ID.

The format would be viewID, userID, timestamp.

However, in order to scale this, assume we're getting 1,000 views a second. Would it make sense to queue these views to SQS, and then have our queue readers dequeue them one by one and write them to the MySQL database? This way the database is not overloaded with write requests. Something like the sketch below.
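For illustration, here is a minimal producer-side sketch in Python (boto3); the queue URL, field names, and IDs are placeholders I made up, not anything final:

    import json
    import time
    import uuid

    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/view-events"  # placeholder

    def log_view(user_id, movie_id):
        # Instead of an INSERT per view, enqueue the event and let a worker
        # drain the queue into MySQL at its own pace.
        event = {
            "viewID": str(uuid.uuid4()),
            "userID": user_id,
            "movieID": movie_id,
            "timestamp": int(time.time()),
        }
        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(event))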

Does this look like something that would work?

Faisal Abid asked Nov 25 '10


1 Answer

Faisal,

This is a reasonable architecture; however, you should know that writing to SQS is going to be many times slower than writing to something like RabbitMQ (or any local message queue).

By default, SQS FIFO queues support up to 3,000 messages per second with batching, or up to 300 messages per second (300 send, receive, or delete operations per second) without batching. To request a limit increase, you need to file a support request.
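Batching also amortizes the per-request latency mentioned above: SendMessageBatch accepts up to 10 messages per call. A minimal sketch, again assuming boto3 and a placeholder queue URL:

    import json

    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/view-events"  # placeholder

    def send_views(events):
        # SQS allows at most 10 messages per SendMessageBatch request,
        # so send the buffered events in chunks of 10.
        for start in range(0, len(events), 10):
            chunk = events[start:start + 10]
            entries = [{"Id": str(i), "MessageBody": json.dumps(e)}
                       for i, e in enumerate(chunk)]
            sqs.send_message_batch(QueueUrl=QUEUE_URL, Entries=entries)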

That being said, starting with SQS wouldn't be a bad idea since it is easy to use and debug.
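For the consumer side you described, a worker can long-poll the queue, insert each batch into MySQL with a single multi-row INSERT, and delete the messages only after the commit. A minimal sketch, assuming boto3 and PyMySQL; the credentials, queue URL, table name, and columns are placeholders:

    import json

    import boto3
    import pymysql

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/view-events"  # placeholder

    conn = pymysql.connect(host="localhost", user="app", password="secret",
                           database="analytics")  # placeholder credentials

    def drain_once():
        # Long-poll for up to 10 messages (the SQS per-call maximum).
        resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=10,
                                   WaitTimeSeconds=20)
        messages = resp.get("Messages", [])
        if not messages:
            return

        rows = [(e["viewID"], e["userID"], e["movieID"], e["timestamp"])
                for e in (json.loads(m["Body"]) for m in messages)]

        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO views (view_id, user_id, movie_id, viewed_at) "
                "VALUES (%s, %s, %s, %s)",
                rows,
            )
        conn.commit()

        # Delete only after the commit, so a crash just re-delivers the batch.
        sqs.delete_message_batch(
            QueueUrl=QUEUE_URL,
            Entries=[{"Id": str(i), "ReceiptHandle": m["ReceiptHandle"]}
                     for i, m in enumerate(messages)],
        )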

Additionally, you may want to investigate MongoDB for logging. Check out the following references:

MongoDB is Fantastic for Logging: http://blog.mongodb.org/post/172254834/mongodb-is-fantastic-for-logging

Capped Collections: http://blog.mongodb.org/post/116405435/capped-collections

Using MongoDB for Real-time Analytics: http://blog.mongodb.org/post/171353301/using-mongodb-for-real-time-analytics
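If you do go the MongoDB route, a capped collection preallocates space and overwrites the oldest documents, which keeps high-volume log writes cheap. A minimal sketch, assuming PyMongo; the database name, collection name, size cap, and example values are placeholders:

    from datetime import datetime, timezone

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client["analytics"]  # placeholder database name

    # Create a 512 MB capped collection once; the oldest view documents are
    # overwritten automatically when the cap is reached.
    if "view_log" not in db.list_collection_names():
        db.create_collection("view_log", capped=True, size=512 * 1024 * 1024)

    db.view_log.insert_one({
        "userID": 42,
        "movieID": 7,
        "timestamp": datetime.now(timezone.utc),
    })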

Wil Moore III answered Sep 30 '22