I am using Laravel 5.6.
My script for inserting a large amount of data looks like this:
...
$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'      => $value['Item_No'],
        'entry_no'     => $value['Entry_No'],
        'document_no'  => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ];

    $insert_data[] = $data;
}

\DB::table('items_details')->insert($insert_data);
I have tried inserting 100 records with this script and it works: the data is inserted successfully.
But if I try to insert 50,000 records, it becomes very slow. I waited about 10 minutes and it did not finish; instead I get an error like this:
504 Gateway Time-out
How can I solve this problem?
Laravel ships with Eloquent, a robust ORM for working with your data, but for plain bulk operations the query builder (via the DB facade) is usually the faster route. You can insert records with the DB facade's insert method, and once the database is configured you can retrieve records with the DB facade's select method, which runs a raw select statement against the database.
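A minimal sketch of both calls, assuming the items_details table from the question; the column values are placeholders for illustration only:
// Insert a single record with the query builder (values are placeholders).
\DB::table('items_details')->insert([
    'item_no'      => 'A-100',
    'entry_no'     => 1,
    'document_no'  => 'DOC-001',
    'posting_date' => '2018-01-01',
]);

// Run a raw select statement against the database.
$rows = \DB::select('select * from items_details where item_no = ?', ['A-100']);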
As was already stated, chunking alone won't really help you if the problem is execution time. I think the bulk insert you are attempting cannot handle that amount of data in a single statement, so I see two options:
1 - Reorganise your code to use chunks properly. It will look something like this:
$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'      => $value['Item_No'],
        'entry_no'     => $value['Entry_No'],
        'document_no'  => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ];

    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// It will chunk the dataset into smaller collections containing 500 values each.
// Play with the value to get the best result.
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}
This way each bulk insert contains less data, and the database can process it fairly quickly.
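If the chunked inserts are still slow, one further optimisation (not part of the original approach, just a sketch reusing the $insert_data collection built above) is to wrap the loop in a single transaction so the database commits once instead of once per chunk:
// Optional: run all chunked inserts inside one transaction.
\DB::transaction(function () use ($insert_data) {
    foreach ($insert_data->chunk(500) as $chunk) {
        \DB::table('items_details')->insert($chunk->toArray());
    }
});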
2 - If your host allows runtime configuration overrides, you can add a directive right before the code starts executing:
ini_set('max_execution_time', 120); // time in seconds

$insert_data = [];

foreach ($json['value'] as $value) {
    ...
}
To read more, see the official docs.
It makes no sense to build an array and then convert it to a collection.
We can get rid of the array entirely:
$insert_data = collect();

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');

    $insert_data->push([
        'item_no'      => $value['Item_No'],
        'entry_no'     => $value['Entry_No'],
        'document_no'  => $value['Document_No'],
        'posting_date' => $posting_date,
        ....
    ]);
}

foreach ($insert_data->chunk(500) as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}
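Taken one step further, the loop can be replaced with map(); this is just a sketch of the same idea, assuming the columns shown above are the only ones needed:
// Build the rows straight from the source data with map(), then insert in chunks.
$chunks = collect($json['value'])
    ->map(function ($value) {
        return [
            'item_no'      => $value['Item_No'],
            'entry_no'     => $value['Entry_No'],
            'document_no'  => $value['Document_No'],
            'posting_date' => Carbon::parse($value['Posting_Date'])->format('Y-m-d'),
        ];
    })
    ->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}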