
I'm working with a feed that is password protected. Below is the code used to access the feed:

$url = 'http://thefeedwebsite.com/feed.php';

$data = array(
    "username" => "user",
    "password" => "password",
    "location" => "HK"
);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);

$output = curl_exec($ch);
curl_close($ch);

The problem is that, because of the feed's large size, the script keeps timing out after outputting about 100 results. I have set the time limit in my php.ini as some threads suggested, but the issue persists. I think it's because cURL loads the complete feed into memory.

Is it possible to load $output directly into XMLReader() in PHP so I can process the feed through the reader faster?

Sorry if the question is totally noob. I just started learning PHP with XML.


2 Answers


This thread can help you (streaming cURL and playing with memory):

Manipulating a string that is 30 million characters long

The first answer stores the data in a file. The second one streams the data "as it flows". If the file is really large, you should also think carefully about which XML parser to use. Some load the entire XML into memory and build an object from it, but others just expose interface methods that let you work with the XML on the fly, without ever loading the whole document into memory.
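As a rough sketch of that file-based approach (the URL and POST fields are taken from the question; the filename `feed.xml` and the `item` element name are placeholders for whatever the real feed uses), cURL can write the response straight to a file handle as it arrives, and XMLReader can then walk that file node by node instead of holding the whole feed in a string:

```php
<?php
// Stream the feed to a file instead of buffering it in a PHP string.
$url  = 'http://thefeedwebsite.com/feed.php';
$data = array('username' => 'user', 'password' => 'password', 'location' => 'HK');

$fp = fopen('feed.xml', 'w');          // placeholder filename
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_FILE, $fp);   // cURL writes each chunk here as it arrives
curl_exec($ch);
curl_close($ch);
fclose($fp);

// XMLReader is a pull parser: it reads one node at a time,
// so the document is never fully loaded into memory.
$reader = new XMLReader();
$reader->open('feed.xml');
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        // process each <item> here; 'item' is an assumed element name
    }
}
$reader->close();
```

With this pattern the peak memory use depends on the largest single node, not on the size of the whole feed.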

Answered 2013-03-20T18:41:55.820

If time limit (http://php.net/manual/en/function.set-time-limit.php) is not your issue, have you considered that you could be running out of memory?

http://www.php.net/manual/en/ini.core.php#ini.memory-limit
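A quick way to check whether memory really is the bottleneck is to log the peak usage against the configured limit (a minimal sketch; the `256M` value is just an example, and raising the limit only works if the host allows `ini_set` for it):

```php
<?php
// Compare how much memory the script actually used against the limit.
echo 'memory_limit: ' . ini_get('memory_limit') . PHP_EOL;
echo 'peak usage:   ' . memory_get_peak_usage(true) . ' bytes' . PHP_EOL;

// Optionally raise the limit for this script only, if the host permits it.
ini_set('memory_limit', '256M');
```

If the peak usage climbs close to the limit before the script dies, it is a memory problem rather than a time-limit problem.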

Answered 2013-03-20T18:34:41.717