John Davidson

php - Laravel upload a big CSV efficiently

Message:


I am trying to import a CSV file into my database in Laravel, but the file is pretty big: it has almost 500 million rows that I want to import. (I am using Maatwebsite's Laravel Excel to do this.)


And when I try to import it I am getting:


Maximum execution time of 300 seconds exceeded

As you can see, I already changed the "max_input_time" in the php.ini file. 300 seconds should be enough, because DataGrip imports the same file in only 3 minutes. And even if it does take longer in Laravel, there has to be another way than just increasing "max_input_time".


This is the code that converts each row into a model and eventually puts it in the database:


public function model(array $row)
{
    return new DutchPostalcode([
        'postalcode' => $row['PostcodeID'],
        'street' => $row['Straat'],
        'place' => $row['Plaats'],
        'government' => $row['Gemeente'],
        'province' => $row['Provincie'],
        'latitude' => $row['Latitude'],
        'longtitude' => $row['Longitude'],
    ]);
}

This is my controller:


public function writeDutchPostalCodes()
{
    Excel::import(new DutchPostalcodes, 'C:\Users\Moeme\Documents\Projects\ahmo apps\Apps\freshness\Freshness - be\FreshnessBE\resources\postalcodes\postcodetabel_1.csv');
}


My Answer:
You have already raised `max_execution_time` in your php.ini file, but the import still times out, so the real question is how to import a big CSV file more efficiently rather than how to raise the limit further.

One way to import a big CSV file efficiently in Laravel is to use Laravel Excel's chunk reading feature. This lets you process the CSV file in smaller chunks rather than trying to read all 500 million rows into memory at once.

Here's an example of how you can use chunk reading with Maatwebsite's Laravel Excel 3.x by adding the WithChunkReading concern to your existing import class:

use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

// Keep any other concerns you already use (e.g. WithHeadingRow) on this class
class DutchPostalcodes implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        // your existing mapping to DutchPostalcode stays unchanged
    }

    // Read the file 1,000 rows at a time instead of loading everything at once
    public function chunkSize(): int
    {
        return 1000;
    }
}


In this example, Laravel Excel reads the CSV 1,000 rows at a time, so only one chunk is held in memory while its rows are converted into models and inserted. Keep in mind that chunk reading mainly reduces memory usage; the work still runs inside the same request, so for a file this size you will usually also want to queue the import, as sketched below.
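
If the import still exceeds the execution time limit when run from a web request, Laravel Excel 3.x can queue it: an import that uses WithChunkReading and implements Laravel's ShouldQueue contract is processed as queued jobs (one job per chunk) instead of inside the HTTP request. A minimal sketch, assuming you have a queue worker running:

use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Facades\Excel;

class DutchPostalcodes implements ToModel, WithChunkReading, ShouldQueue
{
    // model() and chunkSize() exactly as shown above
}

public function writeDutchPostalCodes()
{
    // Same file path you already pass to Excel::import() in your controller
    $csvPath = 'C:\Users\Moeme\Documents\Projects\ahmo apps\Apps\freshness\Freshness - be\FreshnessBE\resources\postalcodes\postcodetabel_1.csv';

    // Queues one job per chunk; the HTTP request returns right away
    Excel::queueImport(new DutchPostalcodes, $csvPath);
}

Run a worker (php artisan queue:work) to process the queued chunks in the background.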

You can adjust the chunk size to fit your needs, depending on the size of your CSV file and the resources available on your server.

Additionally, you can optimize the database side of the import: reducing the number of individual INSERT statements and adding heavy indexes only after the bulk load is finished can both speed up the process and reduce the chance of hitting the execution time limit.
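
One concrete way to cut down the number of queries with Laravel Excel is the WithBatchInserts concern, which groups the models returned by model() into batched inserts instead of issuing one INSERT per row. A short sketch (the batch size of 1,000 is just a starting point, not a tuned value):

use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class DutchPostalcodes implements ToModel, WithChunkReading, WithBatchInserts
{
    // model() and chunkSize() as shown above

    // Insert 1,000 rows per query instead of one query per row
    public function batchSize(): int
    {
        return 1000;
    }
}

Combining WithBatchInserts with WithChunkReading is a common pattern; matching batchSize() to chunkSize() keeps the behavior easy to reason about.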
