John Davidson

php - I need to parse a huge XML array with at least 300 elements, but it slows down - I need advice

Message:


I am trying to parse the XML elements of an array with this code. With a small number of elements it works fine,
but when there are more than 50 elements a problem appears: it slows down very quickly and the process even gets stuck.


$urls = array('{250 domain names}');
foreach ($urls as $url) {
    $xml = simplexml_load_file('http://data.alexa.com/data?cli=10&dat=snbamz&url=' . $url);
    $rank = isset($xml->SD[1]->POPULARITY) ? $xml->SD[1]->POPULARITY->attributes()->TEXT : 0;
    $web = (string) $xml->SD[0]->attributes()->HOST;
    echo "$web ---> $rank<br>";
}

I need to put at least 250 domains in the PHP array, but the output gets stuck.
Is there another way of coding this so the output appears without slowdowns?
Thanks in advance.



My Answer:
One way to improve the performance of your code is to fetch the URLs concurrently instead of sequentially. Your current loop waits for each HTTP request to finish before starting the next one, so 250 requests means 250 round trips in a row; issuing the requests in parallel lets the network waits overlap.

You can achieve this using PHP libraries like Guzzle or cURL, which allow you to make asynchronous HTTP requests. Here's an example using Guzzle:

require 'vendor/autoload.php'; // Make sure to include Guzzle library

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$urls = array('{250 domain names}');
$client = new Client();

$promises = [];
foreach ($urls as $url) {
    $promises[$url] = $client->getAsync('http://data.alexa.com/data?cli=10&dat=snbamz&url=' . $url);
}

$results = Promise\settle($promises)->wait();

foreach ($results as $url => $result) {
    if ($result['state'] !== 'fulfilled') {
        echo "$url ---> request failed<br>";
        continue;
    }
    $xml = simplexml_load_string($result['value']->getBody());
    $rank = isset($xml->SD[1]->POPULARITY) ? $xml->SD[1]->POPULARITY->attributes()->TEXT : 0;
    $web = (string) $xml->SD[0]->attributes()->HOST;
    echo "$web ---> $rank<br>";
}


This code will send asynchronous requests for each URL in the array, and then wait for all the responses to be received before processing them. This should help speed up the process and prevent it from getting stuck.
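If you would rather not add a Composer dependency, the same idea can be sketched with PHP's built-in curl_multi API. This is an illustrative sketch of the concurrent-fetch pattern, not a drop-in replacement tested against the Alexa endpoint; the `fetch_all` helper name is my own.

```php
<?php
// Fetch several URLs concurrently using curl_multi.
// Returns an array mapping each URL to its response body.
function fetch_all(array $urls): array {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't hang on a slow host
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $bodies = [];
    foreach ($handles as $url => $ch) {
        $bodies[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $bodies;
}
```

You would then loop over the returned bodies and feed each one to simplexml_load_string, exactly as in the Guzzle version above.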

Additionally, you can consider caching the results of the API calls to avoid making the same requests multiple times. You can use a caching mechanism like Redis or Memcached to store the results and retrieve them quickly when needed.
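As a minimal illustration of the caching idea, here is a filesystem-based variant that needs no extra services (Redis or Memcached would follow the same get-or-fetch pattern). The `cache_fetch` helper and the one-hour TTL are my own assumptions, not part of any library.

```php
<?php
// Hypothetical helper: return a cached response body if it is still
// fresh, otherwise fetch the URL and store the result on disk.
function cache_fetch(string $url, int $ttl = 3600): string {
    $file = sys_get_temp_dir() . '/alexa_' . md5($url) . '.cache';
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);          // cache hit
    }
    $body = (string) file_get_contents($url);     // cache miss: fetch
    file_put_contents($file, $body);
    return $body;
}
```

With this in place, repeated runs of the script only hit the remote API once per domain per hour, which also keeps you friendlier to any rate limits the API enforces.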

I hope this helps! Let me know if you have any other questions.
