
There are roughly 100,000 records in a JSON file. I am trying to write all of them into the mantle.product.Product entity.

The process starts and begins to degrade at around 35,000 records, logging the warning "Slow hit to AT_ENTITY:create:mantle.product.Product". It then eventually stops with a "java.lang.OutOfMemoryError: GC overhead limit exceeded" error. This is the behavior on my machine.

Any tips are welcome.

Here is the code:

import groovy.json.JsonParserType
import groovy.json.JsonSlurper
import org.moqui.context.TransactionFacade

void processJson2(String filePath) {
    //def json = new JsonSlurper().parseText(new BufferedReader(new InputStreamReader(this.getFileIO().openStream(), "UTF-8")))

    //will initialize class manually
    def docReadReference = this.executionContext.resource.getLocationReference(filePath)

    if (docReadReference.isFile()) {
        //inputstream
        InputStream inputFile = docReadReference.openStream()
        TransactionFacade trxFacade = this.executionContext.getTransaction()

        this.executionContext.artifactExecution.disableTarpit()
        this.executionContext.artifactExecution.disableEntityEca()
        this.executionContext.artifactExecution.disableAuthz()

        trxFacade.runRequireNew(50000, "Error loading entity JSON data", {

            try {
                logMachine.info("Opening file ${docReadReference.isFile()}")

                JsonSlurper slurper = new JsonSlurper().setType(JsonParserType.CHARACTER_SOURCE)
                def json = slurper.parse(new BufferedReader(new InputStreamReader(inputFile, "UTF-8")))

                //writer
                Long counter = 1

                json.each {
                    this.executionContext.service.sync().name("create", "mantle.product.Product").parameters([productId: it.sourceFileReference]).call()

                    //display thousands
                    if (counter % 1000 == 0) {
                        logMachine.info("JSON rows processed ${counter} > ${it.sourceFileReference}")
                    }

                    //move counter
                    counter += 1
                }

                //log
                logMachine.info("File processed.")

            } catch (Throwable t) {
                trxFacade.rollback("Error while processing JSON", t);

                //log as warning
                logMachine.warn("Incorrectly handled JSON parsing ${t.message}.")
            } finally {
                if (trxFacade.isTransactionInPlace()) trxFacade.commit();

                inputFile.close()

                this.executionContext.artifactExecution.enableTarpit()
                this.executionContext.artifactExecution.enableEntityEca()
                this.executionContext.artifactExecution.enableAuthz()
            }
        })
    }
}

1 Answer


This seems to work fine now, so it may help if anyone runs into similar trouble.

  1. Since I am using MoquiDevConf, the first step to fix the slowness was to remove the configuration entry for the artifact type AT_ENTITY (see the sketch after this list).
  2. Next, the BufferedReader is not the most efficient way to read the data; I used the InputStream directly to initialize the json ArrayList.
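
For point 1, the entry meant is, as far as I can tell, the artifact-stats element under server-stats in the Moqui conf XML. Below is only a minimal sketch of that idea; verify the exact element and attribute names against MoquiDefaultConf.xml for your Moqui version before editing anything.

<!-- In MoquiDevConf.xml (sketch only, not taken verbatim from any Moqui release). -->
<!-- Removing or disabling the AT_ENTITY entry stops per-hit stats tracking for entity operations, -->
<!-- which is what produces the "Slow hit to AT_ENTITY:create:..." warnings and their overhead. -->
<server-stats>
    <artifact-stats type="AT_ENTITY" persist-hit="false" persist-bin="false"/>
</server-stats>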

This is the result:

InputStream inputFile = docReadReference.openStream()
TransactionFacade trxFacade = this.executionContext.getTransaction()

JsonSlurper slurper = new JsonSlurper().setType(JsonParserType.INDEX_OVERLAY)
//ArrayList<Object> json = slurper.parse(new BufferedReader(new InputStreamReader(inputFile, "UTF-8")))
ArrayList<Object> json = slurper.parse(inputFile)
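
The rest of the loading loop stays the same as in the question; for completeness, a minimal sketch under that assumption (the service name, logMachine, and the 1000-record logging interval are all taken from the question code, nothing new):

// Sketch only: the INDEX_OVERLAY parse above plus the unchanged loop from the question.
Long counter = 1
json.each {
    this.executionContext.service.sync().name("create", "mantle.product.Product")
            .parameters([productId: it.sourceFileReference]).call()

    // log progress every thousand records
    if (counter % 1000 == 0) logMachine.info("JSON rows processed ${counter} > ${it.sourceFileReference}")
    counter += 1
}
inputFile.close()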
answered 2017-05-17T05:33:27.940