Keeping Data in memory

I have an HTTP handler, and I'm storing every request in a concurrent queue collection in memory. After a certain interval, I bulk insert the collection's contents into a database.

Is this a bad idea? Because of the high request volume, this seems like the better approach to me.

I do see some discrepancies (number of hits vs. number of rows stored in the database), presumably due to threading. When I flush the concurrent collection, I lock it, bulk insert its contents, empty it, and then release the lock.
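A rough sketch of that approach (simplified; it assumes .NET with System.Data.SqlClient, and HitRecord, dbo.Hits, and the 10-second flush interval are illustrative names and values, not the real ones):

    using System;
    using System.Collections.Concurrent;
    using System.Data;
    using System.Data.SqlClient;
    using System.Threading;

    // Hypothetical row type for one logged request.
    public record HitRecord(DateTime Utc, string Path);

    public sealed class HitBuffer : IDisposable
    {
        private readonly ConcurrentQueue<HitRecord> _queue = new();
        private readonly object _flushLock = new();
        private readonly Timer _flushTimer;
        private readonly string _connectionString;

        public HitBuffer(string connectionString)
        {
            _connectionString = connectionString;
            // Flush every 10 seconds (interval is an arbitrary example value).
            _flushTimer = new Timer(_ => Flush(), null,
                TimeSpan.FromSeconds(10), TimeSpan.FromSeconds(10));
        }

        // Called from the HTTP handler for every request; lock-free enqueue.
        public void Add(HitRecord hit) => _queue.Enqueue(hit);

        private void Flush()
        {
            lock (_flushLock)
            {
                // Take a snapshot of the queue and copy it into a DataTable.
                var batch = new DataTable();
                batch.Columns.Add("Utc", typeof(DateTime));
                batch.Columns.Add("Path", typeof(string));
                foreach (var hit in _queue)
                    batch.Rows.Add(hit.Utc, hit.Path);

                if (batch.Rows.Count > 0)
                {
                    using var connection = new SqlConnection(_connectionString);
                    connection.Open();
                    using var bulk = new SqlBulkCopy(connection)
                        { DestinationTableName = "dbo.Hits" };
                    bulk.WriteToServer(batch);
                }

                // Requests enqueued after the snapshot above but before this
                // Clear() are thrown away, because Add() never takes the lock.
                // That race is one likely source of the hits-vs-rows gap.
                _queue.Clear();
            }
        }

        public void Dispose() => _flushTimer.Dispose();
    }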

Is there a better practice? Or have you done something similar?

Asked Oct 10 '22 by DarthVader


1 Answer

Sorry, but I would say that it is a bad idea. There are the following problems:

  • If the application pool recycles before the data is written to the database, you will lose data.
  • Keeping all of the data in one collection means that collection has to be locked both when data is added and when it is written to the database and cleared. This could cause the whole site to pause during the bulk insert.
  • Your code becomes more complicated with the extra step, and threading problems are hard to fix.

We have written web applications that write 1000 rows per second to an SQL Server database at peak load.

Try writing your application as simply as possible first, and then performance test it.
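For illustration, a minimal sketch of that simple version (assuming System.Data.SqlClient; the dbo.Hits table and column names are illustrative), writing each request straight to the database with a parameterized INSERT:

    using System;
    using System.Data.SqlClient;

    public static class HitWriter
    {
        // Writes one request directly to the database, with no in-memory buffer.
        public static void WriteHit(string connectionString, DateTime utc, string path)
        {
            // Open as late as possible and dispose as early as possible so the
            // underlying connection goes straight back to the pool.
            using var connection = new SqlConnection(connectionString);
            using var command = new SqlCommand(
                "INSERT INTO dbo.Hits (Utc, Path) VALUES (@utc, @path)", connection);
            command.Parameters.AddWithValue("@utc", utc);
            command.Parameters.AddWithValue("@path", path);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }

Only if load testing shows that this cannot keep up does the extra complexity of batching start to pay for itself.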

The speed at which you can insert into the database depends a lot on your hardware, but there are also things you can do in your program:

  • Only have one index on the table: a clustered index on an autonumber (identity) key (see the sketch after this list).
  • Make sure that you release the connection to the database as soon as possible.
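A sketch of that table layout (table, column, and constraint names are illustrative): a single clustered index on an IDENTITY key and no other indexes, created over a connection that is scoped as tightly as possible so it is returned to the pool right away.

    using System.Data.SqlClient;

    public static class HitSchema
    {
        // Single clustered index on an autonumber (IDENTITY) key and no other
        // indexes, so every insert is an append at the end of the table.
        private const string CreateTableSql = @"
            CREATE TABLE dbo.Hits
            (
                Id   BIGINT IDENTITY(1,1) NOT NULL,
                Utc  DATETIME2      NOT NULL,
                Path NVARCHAR(400)  NOT NULL,
                CONSTRAINT PK_Hits PRIMARY KEY CLUSTERED (Id)
            );";

        public static void CreateHitsTable(string connectionString)
        {
            // The connection is opened just before use and disposed immediately
            // after, so it is released back to the pool as soon as possible.
            using var connection = new SqlConnection(connectionString);
            using var command = new SqlCommand(CreateTableSql, connection);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }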
Answered Oct 14 '22 by Shiraz Bhaiji