
When running s3distcp from S3 to HDFS:

 sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/s3distcp.jar --src s3n://workAAAA-KKKK-logs/production-logs/Log-XXXX-click/Log-XXXXX-click-2013-03-27_06-21-19_i-7XXb2x39_00037.gz  --dest hdfs:///test/

I get the exception below.

Is there something wrong with my path syntax (s3n:// ; hdfs:///)? Has anyone run into this issue?

13/04/04 12:10:52 INFO s3distcp.S3DistCp: Using output path 'hdfs:/tmp/96a8e57b-4c68-406c-b4ca-bf212de12d93/output'
13/04/04 12:10:53 INFO s3distcp.FileInfoListing: Opening new file: hdfs:/tmp/96a8e57b-4c68-406c-b4ca-bf212de12d93/files/1
Exception in thread "main" java.lang.IllegalArgumentException: Can not create a Path from an empty string
        at org.apache.hadoop.fs.Path.checkPathArg(Path.java:91)
        at org.apache.hadoop.fs.Path.<init>(Path.java:99)
        at org.apache.hadoop.fs.Path.<init>(Path.java:58)
        at com.amazon.external.elasticmapreduce.s3distcp.FileInfoListing.getOutputFilePath(FileInfoListing.java:155)
        at com.amazon.external.elasticmapreduce.s3distcp.FileInfoListing.add(FileInfoListing.java:111)
        at com.amazon.external.elasticmapreduce.s3distcp.FileInfoListing.add(FileInfoListing.java:78)
        at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.createInputFileListS3(S3DistCp.java:122)
        at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.createInputFileList(S3DistCp.java:60)
        at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:529)
        at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:216)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at com.amazon.external.elasticmapreduce.s3distcp.Main.main(Main.java:12)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)

2 Answers


If you need to copy specific files, there is a way: you can use the --copyFromManifest option, which lets you give s3distcp a manifest file listing the paths of all the files to copy (even if they live in different folders).
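A minimal sketch of what that can look like (the bucket, manifest location, and destination below are placeholders; --previousManifest points s3distcp at the manifest file, typically the gzipped list written by an earlier run with --outputManifest):

 hadoop jar /usr/lib/hadoop/lib/s3distcp.jar \
   --copyFromManifest \
   --previousManifest=s3n://my-bucket/manifests/copied-files.gz \
   --dest hdfs:///test/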

Answered 2015-02-24T08:16:39.880

This problem can also occur when the path you are trying to write to exists but you do not have write permission on it.
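A quick way to check that from the command line (the destination path /test is just the one from the question, and the owner/group values are only examples):

 # inspect ownership and permissions of the destination directory
 hdfs dfs -ls /
 hdfs dfs -ls /test
 # if needed, adjust them as the HDFS superuser
 sudo -u hdfs hdfs dfs -chown hdfs:hadoop /test
 sudo -u hdfs hdfs dfs -chmod 775 /test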

It can also happen when you try to write into a Redshift schema that does not exist.

Answered 2018-07-25T14:32:50.517