
My cluster is Ambari Hortonworks (ambari-server-2.0.1-45) HDP 2.2.

I want to back up an HDFS folder to AWS S3. I used the following command: hadoop distcp hdfs://internalip:8020/backup s3://AWS-ID:AWS-SECRET-KEY@BUCKET-NAME/DIRECTORY-NAME

I tried the suggestions from "How can I access S3/S3n from a local Hadoop 2.6 installation?"

But I am still getting the following error:

Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3.S3FileSystem not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2076)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2601)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2614)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2653)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2635)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.hadoop.tools.mapred.CopyMapper.setup(CopyMapper.java:112)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.tools.DistCp.execute(DistCp.java:175)
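For context, the ClassNotFoundException means the mapper running DistCp cannot find the S3 filesystem implementation on its classpath. One commonly suggested setup (a sketch based on the standard Hadoop 2.6 configuration keys, not verified against this particular HDP cluster; the credential values are placeholders) is to use the s3n scheme and declare the filesystem and credentials in core-site.xml:

```xml
<!-- Hypothetical core-site.xml additions; property names are the
     standard Hadoop 2.6 keys, values are placeholders -->
<property>
  <name>fs.s3n.impl</name>
  <value>org.apache.hadoop.fs.s3native.NativeS3FileSystem</value>
</property>
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>AWS-ID</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>AWS-SECRET-KEY</value>
</property>
```

With this in place the copy would be invoked against an s3n:// target instead of s3://; the hadoop-aws JAR (and its AWS SDK dependency) would also need to be visible to the job, e.g. via HADOOP_CLASSPATH or the -libjars option, since in Hadoop 2.6 the S3 filesystem classes were moved out of hadoop-common into hadoop-aws.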
