I'm having trouble with SQLite insert performance, even with transactions. My Android app receives approximately 23,000 rows from a web service and I have to insert them into a single table.
The web service is partitioned so that I receive about 2,000 rows per request, and I wrap those 2,000 inserts in a transaction. Once those inserts are done I send the next request to the web service and again use a new transaction for the next 2,000 rows.
At the beginning it works fine and does a lot of inserts per second, but over time it gets slower and eventually drops to 3 or 4 inserts per second until all 23,000 rows are inserted.
Is there a problem with the size of the table? Do the inserts get slower as it grows? Is there any way to improve the performance for this amount of data?
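In case it helps, each batch is inserted roughly like this (simplified; the real table, column names and Row class are different):

```java
import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import java.util.List;

// Simplified insert loop; "my_table", the columns and the Row class
// are placeholders for the real ones.
void insertBatch(SQLiteDatabase db, List<Row> batch) {
    db.beginTransaction();
    try {
        for (Row row : batch) {            // ~2,000 rows per web service response
            ContentValues values = new ContentValues();
            values.put("col_a", row.getA());
            values.put("col_b", row.getB());
            db.insert("my_table", null, values);
        }
        db.setTransactionSuccessful();     // commit only if every insert succeeded
    } finally {
        db.endTransaction();
    }
}
```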
Thanks for your help.
There is an excellent thread covering SQLite performance in the question How Do I Improve The Performance of SQLite?. I would at least use prepared statements if you are not doing so already.
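If you are building a ContentValues object and calling insert() for every row, reusing a single compiled statement inside the transaction usually helps. A rough sketch, with placeholder table, column and class names:

```java
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import java.util.List;

// One compiled statement reused for the whole batch, wrapped in a single
// transaction. "my_table", the columns and the Row class are placeholders.
void insertBatch(SQLiteDatabase db, List<Row> batch) {
    SQLiteStatement stmt = db.compileStatement(
            "INSERT INTO my_table (col_a, col_b) VALUES (?, ?)");
    db.beginTransaction();
    try {
        for (Row row : batch) {
            stmt.clearBindings();
            stmt.bindString(1, row.getA());   // bind indexes are 1-based
            stmt.bindLong(2, row.getB());
            stmt.executeInsert();
        }
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction();
    }
}
```

Binding into one precompiled INSERT avoids re-parsing the SQL and rebuilding a ContentValues object for every row.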
However, since you are on Android, I am guessing that you might be running into I/O bottlenecks when writing to flash memory. Try running the code on a couple of devices from different manufacturers to see if there is any extreme variation.