
From RWH: http://book.realworldhaskell.org/read/extended-example-web-client-programming.html

The HTTP library used here does not read the HTTP result lazily. As a result, it can consume a large amount of RAM when downloading big files such as podcasts. Other libraries without this limitation are available. We use this one because it is stable, easy to install, and reasonably easy to use. We suggest mini-http, available from Hackage, for serious HTTP needs.

However, mini-http is now deprecated on Hackage. So the question is simple: do you know of any package that provides an API for performing an HTTP request and consuming the response body without loading it entirely into memory?

What I want is an API that gives me a stream I can transform by iterating over it. A simple example would be counting the number of bytes in the response.

Perhaps an iteratee-based API?
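To make this concrete, here is a rough sketch of the kind of consumer I mean: count the bytes of a response while reading it in constant memory. It is written against the chunk-reading interface of the http-client package purely as an illustration (the URL is a placeholder); any library exposing an equivalent stream would do.

    import           Network.HTTP.Client
    import           Network.HTTP.Client.TLS (tlsManagerSettings)
    import qualified Data.ByteString as B

    -- Count the bytes of a response body in constant memory: brRead hands
    -- back one chunk at a time, so the full body is never held in RAM.
    countBytes :: String -> IO Int
    countBytes url = do
      manager <- newManager tlsManagerSettings
      request <- parseRequest url
      withResponse request manager $ \response -> do
        let body = responseBody response  -- BodyReader: one IO ByteString per chunk
            go n = do
              chunk <- brRead body
              if B.null chunk
                then return n
                else go $! n + B.length chunk
        go 0

    main :: IO ()
    main = countBytes "http://example.com/podcast.mp3" >>= print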


2 Answers


In general, there is a common problem with parsing something lazily while also validating it. When you receive an HTTP response that contains a "Content-Length" header, you have to check that you actually read all of that data before the connection is closed. That means you cannot declare the response valid until you have read to the very end, so your mapping has to wait and then process the whole result.

To avoid that, your library may be less strict: check only that the headers are correct, and perhaps the first part of the data (in case it is chunked or compressed), and return a body whose length is less than or equal to "Content-Length". Or you may use your own chunk stream which returns Success or Fail as the last chunk (a rough sketch of this follows below).

Another approach is to sacrifice CPU by processing the response as you read it (e.g. inside a monad), and to abort all of your previous computation whenever there is no valid data for the next read.
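As an illustration of that chunk-stream idea, here is a rough sketch; the Chunk type and its consumer are invented for this example and are not taken from any existing package.

    {-# LANGUAGE BangPatterns #-}
    import qualified Data.ByteString as B

    -- A made-up chunk stream: each element is either another piece of the
    -- body, or a terminal marker saying whether the full "Content-Length"
    -- worth of data actually arrived before the connection was closed.
    data Chunk
      = Chunk B.ByteString  -- a piece of the body, delivered as it arrives
      | Success             -- the body was complete
      | Fail String         -- connection closed early or the data was invalid

    -- Example consumer: count bytes, but only trust the count if the
    -- stream ends with the Success marker.
    countValidBytes :: [Chunk] -> Either String Int
    countValidBytes = go 0
      where
        go !n (Chunk bs : rest) = go (n + B.length bs) rest
        go  n (Success  : _)    = Right n
        go  _ (Fail err : _)    = Left err
        go  _ []                = Left "stream ended without a terminal chunk"

    main :: IO ()
    main = print (countValidBytes [Chunk (B.pack [1, 2, 3]), Success])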

I'd also suggest looking at http-monad. I've never used it, but I hope that, with its monad interface, it implements that last approach (processing as you read).

Answered 2010-06-21T04:44:25.853

Do you want the client to download the file as a stream? How about download-curl's lazy interface?

It might suit your needs (or could with a small tweak).
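If I remember its interface correctly, the lazy module exposes an openLazyURI function, so the byte-counting example from the question would look roughly like this (do check the exact module and function names against the download-curl docs):

    import           Network.Curl.Download.Lazy (openLazyURI)
    import qualified Data.ByteString.Lazy as L

    -- Counting bytes via the lazy interface; ideally the body is consumed
    -- chunk by chunk rather than held fully in memory.
    main :: IO ()
    main = do
      result <- openLazyURI "http://example.com/podcast.mp3"
      case result of
        Left err   -> putStrLn ("download failed: " ++ err)
        Right body -> print (L.length body)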

Answered 2010-06-20T20:01:40.230