
Long db query result with concurrent write at the same time

I'm curious about the following flow: we have a big collection/table with a lot of data, and a select-oriented query against it that is long-running and takes 3 seconds.

However, we are in a highly concurrent environment, and every second 100 new records are written to our database.

So suppose that before the query starts there are 1000 rows that satisfy it. The query takes 3 seconds, and every second 50 new rows that also match it are added to the database. My question is: what result does the query return to me (still 1000, or 1150, or something in between), and how does this depend on the database engine (SQL, NoSQL)?

It's not a question about the exact number, but rather about why it will be that number.


It looks like the question is a bit broad, so let's limit the databases to MySQL, PostgreSQL, MongoDB and Cassandra.

Ph0en1x asked Oct 13 '15

1 Answer

Speaking generally (mainly because you didn't name a specific database), the concurrency level of a database is configurable, and falls under the category of performance tuning.

Some common lock granularities are:

  • ROW - only a single row of data is locked at a time
  • PAGE - some group of rows is locked at a time
  • TABLE - the entire table is locked

So, if you used ROW-level locking, you would likely get all 1150 results, at the expense of higher locking overhead. If you used TABLE-level locking instead, you would get exactly the original 1000 results, but at the expense of your data stream being blocked from writing to the table for the full 3 seconds the query runs.

Andy Guibert answered Oct 18 '22