$base = "https://api.facebook.com/method/fql.query?format=JSON&query=";
$urls = array('about 500 link');
foreach ($urls as $url)
{
    // Build a fresh query string for each URL; appending to the same
    // $query on every iteration would make the request URL grow endlessly.
    $query = $base . urlencode("select post_fbid, fromid, object_id, text, time from comment where object_id in (select comments_fbid from link_stat where url ='$url')");
    $query .= "&pretty=1";

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $query);
    curl_setopt($ch, CURLOPT_TIMEOUT, 900);
    $data = curl_exec($ch);
    $response = json_decode($data, true);
    echo '<pre>'; print_r($response); echo '</pre>';
    curl_close($ch);
}

When I run this code with about 500 links in the $urls array, I get ERROR 500 - Internal Server Error; but when $urls contains only 5 links, there is no error. How can I fetch the content for all 500 links without timing out?


1 Answer


It's not a curl error; it's your server's error, because there is a limit on how long a request may run. Requesting 500 pages takes a while, so your script gets terminated before it finishes. Try the following:

  1. Set set_time_limit(200); or higher to raise the execution-time limit
  2. Run the script from the command line over an SSH connection, where there is no such timeout at all
  3. Use curl_multi_init to make multiple requests in parallel
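As a rough sketch of option 3 (reusing the endpoint and FQL query from the question; the batch size of 10 is an arbitrary choice, not something the API requires), the links can be fetched in parallel batches with curl_multi:

```php
<?php
// Sketch only: fetch comment data for many URLs in parallel batches
// using curl_multi, instead of one blocking request per URL.
$base = "https://api.facebook.com/method/fql.query?format=JSON&query=";
$urls = array(/* about 500 links */);

foreach (array_chunk($urls, 10) as $batch) {
    $mh = curl_multi_init();
    $handles = array();

    foreach ($batch as $url) {
        $fql = "select post_fbid, fromid, object_id, text, time from comment "
             . "where object_id in (select comments_fbid from link_stat where url ='$url')";
        $ch = curl_init($base . urlencode($fql));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all handles in this batch until every transfer completes.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    foreach ($handles as $url => $ch) {
        $response = json_decode(curl_multi_getcontent($ch), true);
        echo '<pre>'; print_r($response); echo '</pre>';
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}
```

Batching keeps only 10 connections open at a time, so 500 links take roughly 50 rounds of parallel requests instead of 500 sequential ones, which is usually enough to stay under the server's time limit.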
Answered 2012-12-14T18:30:28.727