I need to import a very large number of records, around 5 million. During the import I also have to create entries in several related tables at the same time.
I am already batching the new entries into bulk insert queries and processing the data in chunks.
What other ways are there to speed up the process?
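For reference, the chunked bulk-insert approach described above can be sketched roughly like this. This is a minimal sketch, not the asker's actual code: the file path, table names (`records`), column names, and chunk size of 1,000 are all assumptions for illustration.

```php
<?php

use Illuminate\Support\Facades\DB;
use Illuminate\Support\LazyCollection;

// Stream the source file lazily so 5M rows are never held in memory at once.
LazyCollection::make(function () {
    $handle = fopen(storage_path('app/import.csv'), 'r'); // hypothetical path
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
})
->chunk(1000) // assumed chunk size; tune against max_allowed_packet
->each(function ($chunk) {
    DB::transaction(function () use ($chunk) {
        $parents = [];
        foreach ($chunk as $row) {
            $parents[] = ['code' => $row[0], 'name' => $row[1]]; // assumed columns
        }
        // One multi-row INSERT per chunk instead of one query per row.
        DB::table('records')->insert($parents);
        // ...build and bulk-insert the related-table rows the same way...
    });
});
```

Wrapping each chunk in a transaction also helps, since committing once per chunk is far cheaper than committing per row.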
(copied from laracasts) This will probably help too:
DB::connection()->disableQueryLog();
"By default, Laravel keeps a log in memory of all queries that have been run for the current request. However, in some cases, such as when inserting a large number of rows, this can cause the application to use excess memory."
To summarize for people who don't want to read through all the comments separately, besides the points already made you could consider: