How to insert big data in Laravel?

I am using Laravel 5.6.

My script to insert the big data set looks like this:

...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ];
    $insert_data[] = $data;
}
\DB::table('items_details')->insert($insert_data);

I have tried inserting 100 records with this script and it works: the data is inserted successfully.

But when I try to insert 50,000 records, it becomes very slow. I waited about 10 minutes and it did not finish. Instead, I get this error:

504 Gateway Time-out

How can I solve this problem?

asked Jul 23 '18 by moses toh


2 Answers

As was stated, chunks alone won't really help you if this is an execution-time problem. I think the single bulk insert you are trying to run cannot handle that amount of data in one statement, so I see two options:

1 - Reorganise your code to use chunks properly. It will look something like this:

$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ];

    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// Chunk the dataset into smaller collections containing 500 values each.
// Play with the value to get the best result.
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}

This way each bulk insert will contain less data, and the database will be able to process it reasonably quickly.
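
For reference, the same batching can be done with PHP's built-in array_chunk, skipping the collection wrapper entirely. A minimal sketch, assuming the plain $insert_data array from before the collect() call:

// Equivalent sketch using plain array_chunk instead of a collection.
foreach (array_chunk($insert_data, 500) as $chunk) {
    \DB::table('items_details')->insert($chunk);
}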

2 - If your host supports runtime configuration overrides, you can add a directive right before the code starts executing:

ini_set('max_execution_time', 120); // time in seconds

$insert_data = [];

foreach ($json['value'] as $value)
{
   ...
}

To read more, go to the official docs.
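
Note also that a 504 Gateway Time-out is typically returned by the web server or proxy sitting in front of PHP, so you may need to raise its timeout as well, not only PHP's.

A further option, not from this answer but commonly paired with chunking, is wrapping the chunked inserts in a single transaction so the engine commits once instead of once per statement. A sketch, assuming the $insert_data collection from option 1:

// Sketch: run the chunked inserts inside one database transaction.
\DB::transaction(function () use ($insert_data) {
    foreach ($insert_data->chunk(500) as $chunk) {
        \DB::table('items_details')->insert($chunk->toArray());
    }
});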

answered Sep 19 '22 by Vit Kos

It makes no sense to build an array and then convert it to a collection. We can get rid of the array entirely:

$insert_data = collect();

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $insert_data->push([
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ]);
}

foreach ($insert_data->chunk(500) as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}
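
Note that both variants still hold the full dataset in memory before inserting. If memory becomes a concern with much larger payloads, you can insert each batch as you build it, so at most 500 rows are buffered at a time. A sketch under that assumption (the field list is abbreviated like the original):

$buffer = [];

foreach ($json['value'] as $value) {
    $buffer[] = [
        'item_no'      => $value['Item_No'],
        'entry_no'     => $value['Entry_No'],
        'document_no'  => $value['Document_No'],
        'posting_date' => Carbon::parse($value['Posting_Date'])->format('Y-m-d'),
        // ...
    ];

    // Flush the buffer every 500 rows instead of accumulating everything.
    if (count($buffer) === 500) {
        \DB::table('items_details')->insert($buffer);
        $buffer = [];
    }
}

// Insert whatever is left over.
if (!empty($buffer)) {
    \DB::table('items_details')->insert($buffer);
}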

answered Sep 17 '22 by Anton Vlasenko