
I am trying to track down a memory leak in the Java code shown below.

  • The code throws OutOfMemoryError: Java heap space when run with a 2 GB max heap size, after processing only a handful of file updates.
  • The compressed size of the files on S3 is no more than a few hundred kB.
  • The input string (the "amendment" variable) is no larger than 1 MB.
  • A heap dump shows byte[] arrays hundreds of MB in size; in one case a single array took up over 50% of the heap. (The tooling does not, however, show where this array was allocated.)

It uses an AmazonS3 client instance to read and write objects in a remote S3 bucket, and Jackson's ObjectMapper to parse strings, convert them to JSON, and so on. The Deflater and Inflater classes from the java.util.zip package are used for compression/decompression.

Is there an obvious memory leak in this code? Is there a clear bug that could produce ever-growing/huge byte arrays on the heap? I have been scratching my head over this for a while now! Thanks.

class FileManager {
    // Constructor omitted
    private static final int BYTE_SIZE = 1024;
    private final String bucketName;
    private final AmazonS3 client;
    private final ObjectMapper objectMapper;

    public void saveFile(String key, String amendment, String updateTimeStamp) throws Exception {
        S3Object object = null;
        try {
            JsonNode amendmentJson = objectMapper.readTree(amendment);
            ObjectNode targetJson;
            if (client.doesObjectExist(bucketName, key)) {
                object = client.getObject(bucketName, key);
                InputStream objectStream = object.getObjectContent();
                targetJson = (ObjectNode) decompress(objectStream);
                objectStream.close();
                object.close();
            } else {
                targetJson = objectMapper.createObjectNode();
            }
            targetJson.set(updateTimeStamp, amendmentJson);
            byte[] contentBytes = compress(objectMapper.writeValueAsBytes(targetJson));
            try (ByteArrayInputStream contentStream = new ByteArrayInputStream(contentBytes)) {
                client.putObject(bucketName, key, contentStream);
            }
        } catch (Exception ex) {
          // Logs and rethrows ...
        }
    }

    public byte[] compress(byte[] data) throws IOException {
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream(data.length);
        Deflater deflater = new Deflater();
        byte[] buffer = new byte[BYTE_SIZE];
        deflater.setInput(data);
        deflater.finish();
        while (!deflater.finished()) {
            int count = deflater.deflate(buffer);
            outputStream.write(buffer, 0, count);
        }
        outputStream.close();
        deflater.end();
        return outputStream.toByteArray();
    }

    public JsonNode decompress(InputStream objectReader) throws IOException {
        byte[] data = IOUtils.toByteArray(objectReader);
        byte[] buffer = new byte[BYTE_SIZE];
        Inflater inflater = new Inflater();
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream(data.length);
        String outputString;
        inflater.setInput(data);
        try {
            while (!inflater.finished()) {
                int count = inflater.inflate(buffer);
                outputStream.write(buffer, 0, count);
            }
            outputString = new String(outputStream.toByteArray());
        } catch (DataFormatException e) {
            log.info("Content not zipped");
            outputString = new String(data);
        }
        outputStream.close();
        inflater.end();
        StringReader reader = new StringReader(outputString);
        JsonNode jsonNode = objectMapper.readTree(reader);
        reader.close();
        return jsonNode;
    }

}
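For anyone who wants to reproduce this, here is a minimal standalone round-trip of the Deflater/Inflater logic from the two helpers above, isolated from S3 and Jackson (the class name RoundTrip and the sample JSON string are made up for illustration). In isolation the helpers appear to behave correctly:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

// Standalone round-trip of the compress/decompress loops, with no
// S3 or Jackson dependencies. Names here are illustrative only.
public class RoundTrip {
    private static final int BYTE_SIZE = 1024;

    static byte[] compress(byte[] data) {
        Deflater deflater = new Deflater();
        deflater.setInput(data);
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream(data.length);
        byte[] buffer = new byte[BYTE_SIZE];
        while (!deflater.finished()) {
            int count = deflater.deflate(buffer);
            out.write(buffer, 0, count);
        }
        deflater.end(); // releases native zlib memory
        return out.toByteArray();
    }

    static byte[] decompress(byte[] data) throws DataFormatException {
        Inflater inflater = new Inflater();
        inflater.setInput(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream(data.length);
        byte[] buffer = new byte[BYTE_SIZE];
        while (!inflater.finished()) {
            int count = inflater.inflate(buffer);
            out.write(buffer, 0, count);
        }
        inflater.end();
        return out.toByteArray();
    }

    public static void main(String[] args) throws DataFormatException {
        String input = "{\"ts\":\"2024-01-01\",\"value\":42}";
        byte[] packed = compress(input.getBytes(StandardCharsets.UTF_8));
        String unpacked = new String(decompress(packed), StandardCharsets.UTF_8);
        System.out.println(input.equals(unpacked)); // prints "true"
    }
}
```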
