
I have a Maven project in which I need to parse a big RDF file.

My code is:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

import org.eclipse.rdf4j.model.Model;
import org.eclipse.rdf4j.model.Statement;
import org.eclipse.rdf4j.model.impl.LinkedHashModel;
import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFHandlerException;
import org.eclipse.rdf4j.rio.RDFParseException;
import org.eclipse.rdf4j.rio.RDFParser;
import org.eclipse.rdf4j.rio.RDFWriter;
import org.eclipse.rdf4j.rio.Rio;
import org.eclipse.rdf4j.rio.helpers.StatementCollector;

public class ConvertOntology {

    public static void main(String[] args) throws RDFParseException, RDFHandlerException, IOException {

        String file = "C:\\Users\\user\\Desktop\\fileA.rdf";
        File initialFile = new File(file);

        RDFParser parser = Rio.createParser(RDFFormat.RDFXML);
        parser.setPreserveBNodeIDs(true);

        // Collect every parsed statement into an in-memory model
        Model model = new LinkedHashModel();
        parser.setRDFHandler(new StatementCollector(model));
        try (InputStream input = new FileInputStream(initialFile)) {
            parser.parse(input, initialFile.getAbsolutePath());
        }

        // Write the model back out as RDF/XML
        try (FileOutputStream out = new FileOutputStream("C:\\Users\\user\\Desktop\\fileB.rdf")) {
            RDFWriter writer = Rio.createWriter(RDFFormat.RDFXML, out);
            writer.startRDF();
            for (Statement st : model) {
                writer.handleStatement(st);
            }
            writer.endRDF();
        }
    }

}

The code works fine for small files, but for a big file I get the following exception:

JAXP00010001: The parser has encountered more than "64000" entity expansions in this document; this is the limit imposed by the JDK

In Eclipse, I run the project via Run >> Run Configurations >> Arguments and set -DentityExpansionLimit=1000000 in the VM arguments. Then, because of the memory limit, I get a new exception:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

So the maximum heap I can set is smaller than what the file needs, and I want to run my code on a server instead. I usually compile and run my Maven project on the server with:

mvn compile
mvn exec:java

My question: how do I set -DentityExpansionLimit=5000000 in Maven? I tried

mvn -DentityExpansionLimit=5000000 exec:java

but I got a new exception:

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project rdf4j-getting-started: An exception occured while executing the Java class. null: InvocationTargetException: JAXP00010004: The accumulated size of entities is "50,000,018" that exceeded the "50,000,000" limit set by "FEATURE_SECURE_PROCESSING". [line 1, column 34] -> [Help 1]

How can I solve this problem?


2 Answers


I solved my problem by using mvn -Djdk.xml.totalEntitySizeLimit=0 -DentityExpansionLimit=0 exec:java. Hope this helps.

answered 2018-11-19T15:00:35.060
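A side note on the OutOfMemoryError (a sketch, not part of the original answer): in RDF4J, `RDFWriter` itself implements `RDFHandler`, so the parser can stream each statement straight to the writer instead of collecting everything in an in-memory `Model` first. Combined with lifting the entity limits, this keeps memory usage roughly constant regardless of file size. The file paths below are the question's examples:

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFParser;
import org.eclipse.rdf4j.rio.RDFWriter;
import org.eclipse.rdf4j.rio.Rio;

public class StreamConvert {
    public static void main(String[] args) throws IOException {
        try (InputStream in = new BufferedInputStream(
                 new FileInputStream("C:\\Users\\user\\Desktop\\fileA.rdf"));
             OutputStream out = new BufferedOutputStream(
                 new FileOutputStream("C:\\Users\\user\\Desktop\\fileB.rdf"))) {

            RDFParser parser = Rio.createParser(RDFFormat.RDFXML);
            parser.setPreserveBNodeIDs(true);

            // RDFWriter implements RDFHandler, so each parsed statement is
            // written out immediately instead of being buffered in a Model
            RDFWriter writer = Rio.createWriter(RDFFormat.RDFXML, out);
            parser.setRDFHandler(writer);
            parser.parse(in, "");
        }
    }
}
```

Note that this only works when no whole-model processing (deduplication, sorting by subject) is needed, since statements pass through one at a time.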

According to the documentation, you can use a negative value to remove the limit.

answered 2018-11-18T23:12:17.143
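For completeness, the same limits can also be lifted from inside the program, before the first XML parser is created, via the standard JAXP system properties (a sketch; per the JDK documentation, a value of 0 or less means no limit):

```java
public class DisableXmlLimits {
    public static void main(String[] args) {
        // Must run before the first XML parser/factory is instantiated;
        // 0 (or a negative value) removes the limit entirely
        System.setProperty("jdk.xml.entityExpansionLimit", "0");
        System.setProperty("jdk.xml.totalEntitySizeLimit", "0");

        System.out.println(System.getProperty("jdk.xml.entityExpansionLimit")); // prints "0"

        // ... now create the RDFParser and parse as usual ...
    }
}
```

This avoids having to remember the -D flags on every mvn invocation, at the cost of hard-coding the policy into the program.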