I need to insert a large set of records into a database table. I am getting the data from a CSV file. I thought a multi-insert statement would give good performance, so I did something similar to the following:
foreach ($data as $key => $value) {
    $insert[] = [
        'id'           => $value->id,
        'is_published' => $value->is_published,
        'first_name'   => $value->first_name,
        'middle_name'  => $value->middle_name,
        'last_name'    => $value->last_name,
        'description'  => $value->description,
        'created_at'   => date('Y-m-d H:i:s', strtotime($value->created_at)),
        'updated_at'   => date('Y-m-d H:i:s', strtotime($value->changed_at)),
    ];
}

if (!empty($insert)) {
    Model::insert($insert);
}
But in doing so I always get the error:

MySQL server has gone away

I debugged it by printing the query, and I noticed that Laravel does not take care of single or double quotes in the description value the way it automatically does with $model->save(). The printed query breaks whenever the description value contains a single quote ' or a double quote " in it.
But if I do something like

foreach ($data as $key => $value) {
    $model = new Model();
    $model->id = $value->id;
    $model->description = $value->description;
    // ... set the remaining fields ...
    $model->save();
}

it runs successfully without any error. Can anybody tell me how to fix this problem?
It's due to the large data set; nothing is wrong with the quotes. When you try to insert all rows at once with a multi-insert, the resulting packet becomes too large for the server. Either you need to increase max_allowed_packet on the MySQL server, or you need to insert in chunks. You can use chunk() for that. Suppose you want to insert 100 rows at a time; you can do something like this:
if (!empty($insert)) {
    $chunks = collect($insert)->chunk(100)->toArray();
    foreach ($chunks as $chunk) {
        // Insert 100 rows at a time
        Model::insert($chunk);
    }
}
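If you would rather raise the server limit instead of chunking, you can inspect and change max_allowed_packet. This is only a sketch: the 64 MB value is an example, SET GLOBAL requires the appropriate privilege, affects only new connections, and does not survive a server restart unless you also set it in the server's configuration file.

```sql
-- Check the current limit
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for new connections (example value: 64 MB)
SET GLOBAL max_allowed_packet = 67108864;
```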
Hope this will help :)
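As an aside, the splitting itself does not need Laravel: plain PHP's array_chunk() does the same thing collect($insert)->chunk(100) does above. A minimal sketch, with dummy rows standing in for the $insert array:

```php
<?php
// 250 dummy rows standing in for the $insert array built from the CSV
$rows = range(1, 250);

// Split into batches of at most 100 rows, like collect($insert)->chunk(100)
$batches = array_chunk($rows, 100);

foreach ($batches as $i => $batch) {
    // In the real code each $batch would be passed to Model::insert($batch)
    printf("batch %d: %d rows\n", $i, count($batch));
}
// prints:
// batch 0: 100 rows
// batch 1: 100 rows
// batch 2: 50 rows
```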