
I am very new to Pig Latin and am trying to understand it through my homework, which I previously did with MapReduce. I am getting a GC overhead error. PS: my input is just a simple 10-line CSV file.

I am trying to convert CSV format to ARFF.

My UDF:

public class CSV2ARFF extends EvalFunc<String> {
    private String arffDataString;
    private String arffHeaderString;

    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        try {
            System.out.println(">>> " + input.get(0).toString());
            // CSV to Instances
            ByteArrayInputStream inputStream =
                new ByteArrayInputStream(input.get(0).toString().getBytes("UTF-8"));
            CSVLoader loader = new CSVLoader();
            loader.setSource(inputStream);
            Instances data = loader.getDataSet(); // **Line #30**
            // convert into ARFF
            ArffSaver arff = new ArffSaver();
            arff.setInstances(data);
            Instances arffdata = arff.getInstances();
            this.arffDataString = arffdata.toString();
            // header only: copy the structure with zero instances
            Instances header = new Instances(arffdata, 0);
            this.arffHeaderString = header.toString();
            // strip the header from the full ARFF text, leaving the data section
            this.arffDataString = this.arffDataString.substring(this.arffHeaderString.length());

            return arffDataString;
        } catch (Exception e) {
            System.err.println("CSV2ARFF: failed to process input; error - " + e.getMessage());
            return null;
        }
    }
}
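As an aside, the UDF strips the header by assuming `header.toString()` is an exact character-for-character prefix of the full ARFF text. A minimal, self-contained sketch (the class name `ArffSplitSketch` and the sample string are hypothetical, made up to illustrate the idea) of a more robust alternative is to split at the literal `@data` marker instead:

```java
public class ArffSplitSketch {
    // hypothetical sample of the kind of text Instances.toString() produces
    public static final String SAMPLE =
        "@relation sample\n\n"
      + "@attribute a numeric\n"
      + "@attribute b numeric\n\n"
      + "@data\n"
      + "1,2\n"
      + "3,4\n";

    // return only the rows after the @data line
    public static String stripHeader(String arff) {
        int i = arff.indexOf("@data");
        return arff.substring(i + "@data\n".length());
    }

    public static void main(String[] args) {
        System.out.println(stripHeader(SAMPLE)); // prints the two CSV rows
    }
}
```

This avoids an off-by-a-few-characters bug if the header rendering ever differs slightly from the prefix of the full dump.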

My script.pig:

REGISTER ./csv2arff.jar;
REGISTER ./weka.jar;

csvraw = LOAD 'sample' USING PigStorage('\n') as (c);

arffraws = FOREACH csvraw GENERATE pighw2java.CSV2ARFF(c);

--output

STORE arffraws INTO 'output' using PigStorage();
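Since the error is the task JVM running out of heap during GC, one thing worth trying (a sketch, not a confirmed fix; the property name depends on your Hadoop version) is giving the child JVMs more heap at the top of the script:

```pig
-- raise the task JVM heap (use mapreduce.map.java.opts on Hadoop 2.x)
SET mapred.child.java.opts '-Xmx1024m';
```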

Error:

java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.nio.CharBuffer.wrap(CharBuffer.java:369)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:310)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:154)
at java.io.BufferedReader.read(BufferedReader.java:175)
at java.io.StreamTokenizer.read(StreamTokenizer.java:500)
at java.io.StreamTokenizer.nextToken(StreamTokenizer.java:544)
at weka.core.converters.ConverterUtils.getToken(ConverterUtils.java:888)
at weka.core.converters.CSVLoader.readHeader(CSVLoader.java:937)
at weka.core.converters.CSVLoader.readStructure(CSVLoader.java:578)
at weka.core.converters.CSVLoader.getStructure(CSVLoader.java:563)
at weka.core.converters.CSVLoader.getDataSet(CSVLoader.java:596)
at pighw2java.CSV2ARFF.exec(CSV2ARFF.java:30)
at pighw2java.CSV2ARFF.exec(CSV2ARFF.java:1)

1 Answer


I ran into a similar situation. Running Pig in local mode (`pig -x local`) caused this error. When I ran the same query in MapReduce mode (`pig`), it was resolved.

Hope this helps.

Answered 2017-04-07T06:14:51.427