John Davidson

php - Inefficient loop in Laravel 8: time spent on each iteration is proportional to array size

Message:


In Laravel 8 / PHP 8, I receive a body in POST, transform the data, and bulk insert into TimescaleDB.


Here is the code:


public function store(Request $request)
{
    ini_set('max_execution_time', -1);
    $body = $request->getContent();
    $meter = json_decode($body, true);
    $chunkSize = 1000;
    $measures = $meter['Measures'];
    $chunkedRaws = collect($measures)->chunk($chunkSize);
    $ok = 0;
    $nok = 0;
    $initialTime = Carbon::now();
    foreach ($chunkedRaws as $chunkNumber => $raws) {
        $chunkTime = Carbon::now();
        $rawsToInsert = [];
        foreach ($raws as $k => $raw) {
            $rawTime = Carbon::now();
            array_push($rawsToInsert, new \App\Models\Raw([
                'operation_id' => $meter['OperationID'],
                'meter_id' => $meter['ID'],
                'conso_prod' => $meter['ConsoProd'],
                'timestep' => $meter['Timestep'],
                'unit' => $meter['Unit'],
                'source' => $source,
                'time' => Carbon::parse($raw['t'])->toIso8601ZuluString(),
                'delta' => $raw['d'],
            ]));
        }
        // Insert into TimescaleDB (cut)
    }

    return response()->json(['message' => $ok . " Raws has been inserted,\n " . $nok . " Failed"], 200);
}

The first time I call it, I have a body with 3744 elements.


Each element takes approx. 5-6 ms to push into the array, so 6 × 3744 / 1000 ≈ 22 seconds.


But my second POST has 26930 elements. Now each array_push takes approx. 52 ms, so 52 × 26930 / 1000 ≈ 23 minutes. Way too long.


Shouldn't array_push always take the same amount of time, independently of how many rows there are in the body?


Why is the time spent on array_push proportional to the size of $chunkedRaws?



My Answer:
> First, a note on the premise: array_push itself is amortized O(1) in PHP, so appending to the array should not get slower as it grows. The per-iteration time you are measuring more likely comes from the surrounding work — constructing a full Eloquent model and parsing a Carbon date for every row — plus the memory pressure of accumulating tens of thousands of model objects. If inserting into TimescaleDB is slow, it is most likely due to the way the data is being processed and inserted.
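One way to cut the per-row cost is to build plain arrays instead of Eloquent model objects and let the query builder insert a whole chunk at once. A minimal sketch, assuming the same payload shape as in the question (the 'source' column is left out because $source is not defined in the snippet):

```php
use Carbon\Carbon;

$rawsToInsert = [];
foreach ($raws as $raw) {
    // Plain array instead of `new \App\Models\Raw([...])` —
    // no model instantiation or attribute casting per row.
    $rawsToInsert[] = [
        'operation_id' => $meter['OperationID'],
        'meter_id'     => $meter['ID'],
        'conso_prod'   => $meter['ConsoProd'],
        'timestep'     => $meter['Timestep'],
        'unit'         => $meter['Unit'],
        'time'         => Carbon::parse($raw['t'])->toIso8601ZuluString(),
        'delta'        => $raw['d'],
    ];
}

// One multi-row INSERT per chunk instead of one query per row.
\App\Models\Raw::insert($rawsToInsert);
```

Note that Model::insert() bypasses Eloquent events and timestamps; for raw time-series data that is usually acceptable.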

Here are a few suggestions to improve the performance of your code:

1. Use bulk inserts instead of per-row models: Rather than constructing an Eloquent model for every record, build plain arrays and insert each chunk with a single query (e.g. the query builder's insert()). This can significantly improve the performance of your code.

2. Use batch processing: If you have a large amount of data to insert, consider breaking it up into smaller batches and inserting them in chunks. This can help reduce the time spent on each iteration.

3. Optimize your database schema: Make sure that your database schema is optimized for the type of data you are inserting. Indexes, constraints, and proper data types can all impact the performance of your database operations.

4. Use database transactions: Wrap the insert operations in a single database transaction to ensure data consistency and avoid the overhead of committing every statement individually.

5. Consider using asynchronous processing: If the data processing and insertion are taking too long, consider offloading the work to a background job or queue to free up resources and improve performance.
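Combining suggestions 2 and 4, a hedged sketch of the chunked insert wrapped in a transaction (the raws table name is an assumption — use whatever table the Raw model maps to):

```php
use Carbon\Carbon;
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($meter) {
    foreach (collect($meter['Measures'])->chunk(1000) as $chunk) {
        $rows = $chunk->map(fn ($raw) => [
            'operation_id' => $meter['OperationID'],
            'meter_id'     => $meter['ID'],
            'time'         => Carbon::parse($raw['t'])->toIso8601ZuluString(),
            'delta'        => $raw['d'],
            // ...remaining columns as in the original snippet
        ])->values()->all();

        // One multi-row INSERT per 1000-element chunk.
        DB::table('raws')->insert($rows);
    }
});
```

A single transaction also avoids an implicit commit after every statement, which adds up quickly at high row counts.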

By implementing these suggestions, you should be able to improve the performance of your code and reduce the time spent on each iteration.
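For suggestion 5 specifically, the controller can hand the payload to a queued job and return immediately. A sketch, assuming a hypothetical \App\Jobs\InsertRawsJob class that performs the actual insert:

```php
public function store(Request $request)
{
    $meter = json_decode($request->getContent(), true);

    // Hypothetical job class — the heavy insert runs on a queue worker.
    \App\Jobs\InsertRawsJob::dispatch($meter);

    // 202 Accepted: the data is queued, not yet inserted.
    return response()->json(['message' => 'Accepted'], 202);
}
```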
