John Davidson

curl - What type of configuration is required to run multi-request calls in PHP?

Message:


I want to run 2500+ calls at the same time, so I have created batches of 100 (2500/100 = 25 batches in total).


    define('REQUEST_BATCH_LIMIT', 100);

    $insert_chunks = array_chunk(['array', 'i want', 'to', 'insert'], REQUEST_BATCH_LIMIT);
    $mh = $running = $ch = [];

    foreach ($insert_chunks as $chunk_key => $insert_chunk) {

        $mh[$chunk_key] = curl_multi_init();
        $ch[$chunk_key] = [];

        // One handle per record, attached to this batch's multi handle.
        foreach ($insert_chunk as $ch_key => $post_data) {
            $ch[$chunk_key][$ch_key] = curl_init('[Dynamic path of API]');
            curl_setopt($ch[$chunk_key][$ch_key], CURLOPT_RETURNTRANSFER, true);
            curl_multi_add_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
        }

        // Drive the batch; curl_multi_select() waits for socket activity
        // instead of spinning the CPU.
        do {
            $status = curl_multi_exec($mh[$chunk_key], $running[$chunk_key]);
            if ($running[$chunk_key] > 0) {
                curl_multi_select($mh[$chunk_key]);
            }
        } while ($running[$chunk_key] > 0 && $status === CURLM_OK);

        // Collect results, then detach and close the multi handle.
        foreach (array_keys($ch[$chunk_key]) as $ch_key) {
            $response = curl_getinfo($ch[$chunk_key][$ch_key]);
            $returned_data = curl_multi_getcontent($ch[$chunk_key][$ch_key]);
            curl_multi_remove_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
        }
        curl_multi_close($mh[$chunk_key]);
    }

When I run this locally, the system hangs completely.


But the batch limit that works (100, 500, and so on) is not the same on every device and server. What is the reason for that, and what changes should I make to increase it?


If I insert 1000 records with a batch size of 50, every batch should insert 50 records, but each batch actually inserts a random number, such as 40, 42, or 48. Why are these calls skipped? (If I send one record at a time with plain cURL in a loop, it works fine.)


P.S. I am using this code for the BigCommerce API.



My Answer:
The reason your system hangs when running this code is most likely the sheer number of simultaneous connections opened through curl_multi_exec. Too many concurrent handles can exhaust sockets, memory, or CPU and lock up the machine.

To increase the batch size and optimize the code, you can try the following:

1. Increase the batch size gradually and monitor your system's performance. Start with a small batch size and raise it until you find the largest size your system can handle without hanging.

2. Add a short pause between batches (for example, with sleep() or usleep()) to give your system time to drain open connections before the next batch starts. This helps prevent overwhelming the machine with too many simultaneous requests.

3. You can also switch to a library such as Guzzle, whose request pool provides better support for running many requests concurrently under a fixed concurrency cap; see the sketch after this list.
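
For item 3, here is a minimal sketch using Guzzle's request pool. It assumes guzzlehttp/guzzle is installed via Composer; the endpoint URL and the $records payload array are placeholders, not your actual BigCommerce path:

    <?php
    require 'vendor/autoload.php';

    use GuzzleHttp\Client;
    use GuzzleHttp\Pool;
    use GuzzleHttp\Psr7\Request;

    $records = [/* ... the payloads you want to insert ... */];
    $failed  = [];
    $client  = new Client();

    // Build one request per record lazily, so handles are created on demand.
    $requests = function (array $records) {
        foreach ($records as $record) {
            yield new Request(
                'POST',
                'https://api.example.com/v3/endpoint', // placeholder URL
                ['Content-Type' => 'application/json'],
                json_encode($record)
            );
        }
    };

    $pool = new Pool($client, $requests($records), [
        'concurrency' => 50, // cap on simultaneous connections; tune per machine
        'fulfilled'   => function ($response, $index) {
            // $records[$index] was inserted successfully.
        },
        'rejected'    => function ($reason, $index) use (&$failed, $records) {
            // Keep the payload so it can be retried in a follow-up pass.
            $failed[$index] = $records[$index];
        },
    ]);

    $pool->promise()->wait(); // blocks until every request has settled

Unlike fixed batches, the pool keeps at most 50 requests in flight and starts the next one as soon as a slot frees up, so you get steady throughput without ever opening a whole batch of connections at once.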

Regarding the skipped calls when inserting 1000 records with a batch size of 50: because the requests run concurrently, some of them can fail quietly, through timeouts, dropped connections, or the API's rate limiting, and your code never checks curl_errno() or the HTTP status of each handle, so those failures look like skipped inserts. BigCommerce, like most APIs, throttles bursts of concurrent requests, which is also why a sequential loop of single cURL calls works fine while the parallel version loses records.

To ensure that no insert is lost, implement a mechanism to track the outcome of every request: after each batch finishes, inspect each handle, collect the payloads that failed, and retry them before moving on to the next batch.
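Here is a minimal sketch of that idea with plain curl_multi. run_batch() is a hypothetical helper, and the URL and payload shape are placeholders:

    // Hypothetical helper: run one batch and return the payloads whose call failed.
    function run_batch(array $payloads, string $url): array
    {
        $mh = curl_multi_init();
        $handles = $failed = [];

        foreach ($payloads as $key => $payload) {
            $h = curl_init($url);
            curl_setopt_array($h, [
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_POST           => true,
                CURLOPT_POSTFIELDS     => json_encode($payload),
                CURLOPT_TIMEOUT        => 30, // keep a stuck request from hanging the batch
            ]);
            curl_multi_add_handle($mh, $h);
            $handles[$key] = $h;
        }

        do {
            $status = curl_multi_exec($mh, $running);
            if ($running > 0) {
                curl_multi_select($mh); // wait for activity instead of busy-looping
            }
        } while ($running > 0 && $status === CURLM_OK);

        foreach ($handles as $key => $h) {
            $httpCode = curl_getinfo($h, CURLINFO_HTTP_CODE);
            // Treat transport errors and non-2xx responses as failures to retry.
            if (curl_errno($h) !== 0 || $httpCode < 200 || $httpCode >= 300) {
                $failed[$key] = $payloads[$key];
            }
            curl_multi_remove_handle($mh, $h);
            curl_close($h);
        }
        curl_multi_close($mh);

        return $failed;
    }

The caller can then re-queue whatever comes back, for example: $retry = run_batch($chunk, $url); followed by a second run_batch($retry, $url) pass, so a failed insert is retried instead of silently dropped.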

Overall, tuning the batch size and checking every response will improve both the performance and the reliability of your code when making multiple requests with curl_multi_exec.
