I am building a REST API with Spring Boot that needs to transfer large amounts of JSON data to the client. To avoid extreme memory usage and excessive array instantiation, I use StreamingResponseBody to send the data in chunks:
@Controller
class FooController {

    @Autowired
    FooService fooService;

    // ...

    private ResponseEntity<StreamingResponseBody> getPropertyVariants(@PathVariable(required = false) String propertyName,
                                                                      @RequestParam(required = false) String instruction)
            throws JsonProcessingException {
        StreamingResponseBody streamingResponseBody = out -> {
            if (propertyName == null) fooService.writeReportToOutStream(out);
            else fooService.writeReportToOutStream(propertyName, out);
        };
        return ResponseEntity.ok().contentType(MediaType.APPLICATION_JSON).body(streamingResponseBody);
    }
}
FooService holds a large amount of data, which it filters and then writes to the stream using a JsonGenerator. I cannot show the actual service here; rest assured that the generated JSON array is written to the stream entry by entry, everything is flushed properly, and the JsonGenerator is closed. With an output array of about 100,000 entries everything works fine. However, if I increase the output to 1,000,000 entries, the request fails mid-transfer.
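For reference, here is a minimal sketch of the write pattern described above. The real FooService is not shown; the class name, entry type, and field name below are made up for illustration, but the JsonGenerator calls are the standard Jackson streaming API:

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;

import java.io.IOException;
import java.io.OutputStream;
import java.util.List;

class ReportWriter {

    // Writes the entries as a JSON array, one entry at a time,
    // flushing after each entry so no large buffer builds up in memory.
    static void writeReportToOutStream(List<String> entries, OutputStream out) throws IOException {
        JsonGenerator gen = new JsonFactory().createGenerator(out);
        gen.writeStartArray();
        for (String entry : entries) {
            gen.writeStartObject();
            gen.writeStringField("value", entry); // hypothetical field
            gen.writeEndObject();
            gen.flush(); // push each entry into the (chunked) response stream
        }
        gen.writeEndArray();
        gen.close();
    }
}
```

With this pattern the memory footprint stays constant regardless of the number of entries, which is why the failure at 1,000,000 entries points at the transport rather than the serialization.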
Part of the stack trace:
org.apache.coyote.CloseNowException: Failed write
at org.apache.coyote.http11.Http11OutputBuffer$SocketOutputBuffer.doWrite(Http11OutputBuffer.java:548)
at org.apache.coyote.http11.filters.ChunkedOutputFilter.doWrite(ChunkedOutputFilter.java:110)
at org.apache.coyote.http11.Http11OutputBuffer.doWrite(Http11OutputBuffer.java:193)
at org.apache.coyote.Response.doWrite(Response.java:606)
at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:340)
at org.apache.catalina.connector.OutputBuffer.flushByteBuffer(OutputBuffer.java:783)
at org.apache.catalina.connector.OutputBuffer.doFlush(OutputBuffer.java:299)
at org.apache.catalina.connector.OutputBuffer.flush(OutputBuffer.java:273)
at org.apache.catalina.connector.CoyoteOutputStream.flush(CoyoteOutputStream.java:118)
at com.fasterxml.jackson.core.json.UTF8JsonGenerator.flush(UTF8JsonGenerator.java:1178)
at com.fasterxml.jackson.databind.ObjectMapper.writeValue(ObjectMapper.java:3060)
at com.fasterxml.jackson.core.base.GeneratorBase.writeObject(GeneratorBase.java:388)
at de.jmzb.ecomwdc.service.properties.FooService.lambda$null$2(FooService.java:37)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485)
at de.jmzb.ecomwdc.service.properties.FooService.lambda$filter$10(KeyFilterStrategy.java:34)
at java.util.ArrayList.forEach(ArrayList.java:1259)
at de.jmzb.ecomwdc.service.properties.PropertyReportService.writeReportToOutStream(FooService.java:56)
at de.jmzb.ecomwdc.controller.WDCDataSourceController.lambda$getPropertyVariants$1(FooController.java:77)
at org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBodyReturnValueHandler$StreamingResponseBodyTask.call(StreamingResponseBodyReturnValueHandler.java:111)
at org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBodyReturnValueHandler$StreamingResponseBodyTask.call(StreamingResponseBodyReturnValueHandler.java:98)
at org.springframework.web.context.request.async.WebAsyncManager.lambda$startCallableProcessing$4(WebAsyncManager.java:337)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
httpie reports the error:
HTTP/1.1 200
Connection: keep-alive
Content-Encoding: gzip
Content-Type: application/json
Date: Thu, 27 May 2021 15:21:46 GMT
Keep-Alive: timeout=60
Transfer-Encoding: chunked
Vary: origin,access-control-request-method,access-control-request-headers,accept-encoding
http: error: ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
Since my service's input no longer needs to be held in memory, I would expect to be able to handle data of any size. Apparently the OutputStream stops working at some point?

Idea 1: For some reason one of the transferred chunks has a wrong length, so the client cancels the request and the OutputStream is closed, hence the exception on the server.

Idea 2: For some reason the server stops the OutputStream, so the response ends unexpectedly on the client side.
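One thing that might be worth checking for Idea 2 (an assumption on my part, not something I have verified): StreamingResponseBody runs through Spring MVC's asynchronous request machinery, and a transfer of 1,000,000 entries could be exceeding the async request timeout, which would make the container abort the response mid-stream. In Spring Boot that limit is controlled by a single property:

```properties
# Raise the async request timeout (milliseconds); -1 disables it entirely.
# The default may be too short for streaming very large responses.
spring.mvc.async.request-timeout=300000
```

If the timeout is the cause, the server log would typically also show an AsyncRequestTimeoutException around the time of the failure.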
Any ideas on how to solve this? I am sorry that I cannot share my original code.

Thanks for your help!