
I'm trying to produce this command via the AWS SDK:

hadoop jar /home/hadoop/contrib/streaming/hadoop-streaming.jar -input hdfs:///logs/ -output hdfs:///no_dups -mapper dedup_mapper.py -reducer dedup_reducer.py -file deduplication.py dedup_mapper.py dedup_reducer.py timber.py signature_v4.py

My Java code is:

HadoopJarStepConfig config = new StreamingStep()
        .withInputs("hdfs:///logs")
        .withOutput("hdfs:///no_dups")
        .withMapper("dedup_mapper.py")
        .withReducer("dedup_reducer.py")
        .toHadoopJarStepConfig();

Collection<String> aggs = config.getArgs();
aggs.add("-file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py");
config.setArgs(aggs);

This produces the following AddJobFlowStepsRequest (when toString() is called):

{JobFlowId: j-3TDECOMCOO8HE, Steps: [{Name: DeDup, ActionOnFailure: CONTINUE, HadoopJarStep: {Properties: [], Jar: /home/hadoop/contrib/streaming/hadoop-streaming.jar, Args: [-input, hdfs:///logs, -output, hdfs:///no_dups, -mapper, dedup_mapper.py, -reducer, dedup_reducer.py, -file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py], }, }], }

Finally, the error I see on the master node:

2013-04-26 16:43:48,116 ERROR org.apache.hadoop.streaming.StreamJob (main): Unrecognized option: -file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.p

Strangely, the error log lists the available options, and -file is one of them. Has anyone else seen this problem?

More logs:

2013-04-26T16:43:46.105Z INFO Fetching jar file.

2013-04-26T16:43:47.609Z INFO Working dir /mnt/var/lib/hadoop/steps/9

2013-04-26T16:43:47.609Z INFO Executing /usr/lib/jvm/java-6-sun/bin/java -cp /home/hadoop/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/hadoop:/home/hadoop/hadoop-core-1.0.3.jar:/home/hadoop/hadoop-tools.jar:/home/hadoop/hadoop-tools-1.0.3.jar:/home/hadoop/hadoop-core.jar:/home/hadoop/lib/*:/home/hadoop/lib/jetty-ext/* -Xmx1000m -Dhadoop.log.dir=/mnt/var/log/hadoop/steps/9 -Dhadoop.log.file=syslog -Dhadoop.home.dir=/home/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,DRFA -Djava.io.tmpdir=/mnt/var/lib/hadoop/steps/9/tmp -Djava.library.path=/home/hadoop/native/Linux-amd64-64 org.apache.hadoop.util.RunJar /home/hadoop/contrib/streaming/hadoop-streaming.jar -input hdfs:///logs -output hdfs:///no_dups -mapper dedup_mapper.py -reducer dedup_reducer.py -file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py

2013-04-26T16:43:48.611Z INFO Execution ended with ret val 1

2013-04-26T16:43:48.612Z WARN Step failed with bad retval

1 Answer


The error occurs because the entire string is being interpreted as a single command-line option.

The solution is to add the option and its argument as separate entries, like this:

args.add("-file");
args.add("myfile.txt");

If you want to add multiple files, you can do it like this:

args.add("-file");
args.add("myfile.txt");
args.add("-file");
args.add("myfile2.txt");

If you instead supply the files as a space-separated list inside a single argument, the whole line is treated as one option/file name, which is why the error is raised.
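
Applied to the configuration from the question, a minimal sketch might look like the following (same AWS SDK classes as in the question; copying into a new ArrayList is only a precaution in case getArgs() returns a collection you don't want to mutate directly):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;

import com.amazonaws.services.elasticmapreduce.model.HadoopJarStepConfig;
import com.amazonaws.services.elasticmapreduce.util.StreamingStep;

// Build the streaming step exactly as in the question.
HadoopJarStepConfig config = new StreamingStep()
        .withInputs("hdfs:///logs")
        .withOutput("hdfs:///no_dups")
        .withMapper("dedup_mapper.py")
        .withReducer("dedup_reducer.py")
        .toHadoopJarStepConfig();

// Add one "-file" option per file: the flag and its value are separate tokens.
Collection<String> args = new ArrayList<String>(config.getArgs());
for (String file : Arrays.asList("deduplication.py", "timber.py",
        "dedup_mapper.py", "dedup_reducer.py", "signature_v4.py")) {
    args.add("-file"); // the option...
    args.add(file);    // ...and its value, as its own argument
}
config.setArgs(args);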

answered 2013-04-29T09:41:13.520