Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


curl - php - Fastest way to check presence of text in many domains (above 1000)

I have a PHP script that uses cURL to retrieve the content of web pages, in which I want to check for the presence of some text.

Right now it looks like this:

for ($i = 0; $i < $num_target; $i++) {
    $ch = curl_init();
    $timeout = 10;
    curl_setopt($ch, CURLOPT_URL, $target[$i]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FORBID_REUSE, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $html = curl_exec($ch);   // the page body, not the URL
    curl_close($ch);

    // $text must be a valid PCRE pattern with delimiters, e.g. '/some text/i'
    if (preg_match($text, $html, $match)) {
        $matches[$i] = $match[0];
        echo "text " . $text . " found in URL: " . $target[$i] . ": " . $match[0];
    } else {
        $matches[$i] = null;
        echo "text " . $text . " not found in URL: " . $target[$i] . ": no match";
    }
}

I was wondering whether a special cURL setup could make this faster (I looked in the PHP manual and chose the options that seemed best to me, but I may have missed some that would improve the script's speed and performance).

I was also wondering whether CGI, Perl, or Python (or another solution) could be faster than PHP.
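One cheap change worth trying before anything else: reuse a single cURL handle across all requests (dropping CURLOPT_FORBID_REUSE so keep-alive connections and the DNS cache survive between requests) and ask for compressed bodies. A hedged sketch, not taken from the question (the function name and timeout values are illustrative):

```php
<?php
// Sketch: fetch many URLs sequentially with ONE reused cURL handle.
// Reuse lets keep-alive connections and cached DNS lookups carry over;
// CURLOPT_ENCODING '' requests and transparently decodes gzip/deflate.
// Returns the URLs whose body contains $text (case-insensitive).
function findTextSequential(array $targets, string $text): array
{
    $found = [];
    $ch = curl_init();                          // one handle, reused for every URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_ENCODING, '');     // accept any supported compression
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);
    foreach ($targets as $url) {
        curl_setopt($ch, CURLOPT_URL, $url);
        $body = curl_exec($ch);
        if ($body !== false && stripos($body, $text) !== false) {
            $found[] = $url;
        }
    }
    curl_close($ch);
    return $found;
}
```

This stays sequential, so the total time is still the sum of all round trips; it only trims per-request overhead.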

Thank you in advance for any help / advice / suggestion.


1 Answer


You can use curl_multi_init(), which allows multiple cURL handles to be processed in parallel.

Example

$url = array();
$url[] = 'http://www.huffingtonpost.com';
$url[] = 'http://www.yahoo.com';
$url[] = 'http://www.google.com';
$url[] = 'http://technet.microsoft.com/en-us/';

$start = microtime(true);
echo "<pre>";
print_r(checkLinks($url, "Azure"));
echo "<h1>", microtime(true) - $start, "</h1>";

Output

Array
(
    [0] => http://technet.microsoft.com/en-us/
)

1.2735739707947 <-- Faster (total seconds for all four URLs, fetched in parallel)

Function Used

function checkLinks($nodes, $text) {
    $mh = curl_multi_init();
    $curl_array = array();

    // Create one easy handle per URL and attach it to the multi handle.
    foreach ( $nodes as $i => $url ) {
        $curl_array[$i] = curl_init($url);
        curl_setopt($curl_array[$i], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl_array[$i], CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729)');
        curl_setopt($curl_array[$i], CURLOPT_CONNECTTIMEOUT, 5);
        curl_setopt($curl_array[$i], CURLOPT_TIMEOUT, 15);
        curl_multi_add_handle($mh, $curl_array[$i]);
    }

    // Drive all transfers in parallel until none are still running.
    $running = NULL;
    do {
        usleep(10000);
        curl_multi_exec($mh, $running);
    } while ( $running > 0 );

    // Collect the URLs whose 200 response body contains $text (case-insensitive),
    // then detach and close every handle.
    $res = array();
    foreach ( $nodes as $i => $url ) {
        $curlErrorCode = curl_errno($curl_array[$i]);
        if ($curlErrorCode === 0) {
            $info = curl_getinfo($curl_array[$i]);
            if ($info['http_code'] == 200) {
                if (stripos(curl_multi_getcontent($curl_array[$i]), $text) !== false) {
                    $res[] = $info['url'];
                }
            }
        }
        curl_multi_remove_handle($mh, $curl_array[$i]);
        curl_close($curl_array[$i]);
    }
    curl_multi_close($mh);
    return $res;
}
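With more than 1000 domains, adding every handle to one multi handle at once can exhaust sockets, file descriptors, and memory. A hedged sketch, not part of the original answer (the function name and the batch size of 50 are illustrative): feed the URLs to a checker such as checkLinks() above in fixed-size chunks and merge the hits. The checker is passed in as a callable so the chunking logic can be exercised without network access.

```php
<?php
// Sketch: split a large URL list into fixed-size batches and run each batch
// through a checker with the same contract as checkLinks($nodes, $text),
// i.e. it returns the URLs from the batch whose body contains $text.
// Bounding the batch size keeps the number of concurrent connections sane.
function checkLinksBatched(array $urls, string $text, int $batchSize, callable $checker): array
{
    $hits = [];
    foreach (array_chunk($urls, $batchSize) as $batch) {
        $hits = array_merge($hits, $checker($batch, $text));
    }
    return $hits;
}
```

With the answer's function this would be called as checkLinksBatched($url, "Azure", 50, 'checkLinks'); tuning the batch size trades memory and descriptor usage against wall-clock time.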
