I want to copy files that exist in HDFS to an S3 bucket using Java code. My Java implementation looks like this:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.tools.DistCp;
import org.apache.hadoop.tools.DistCpOptions;
import org.apache.hadoop.tools.OptionsParser;

// hdfsUrl, s3AccessKey, s3SecretKey, s3EndPoint, hdfsUser, srcDir, dstDir
// and logger are defined elsewhere in the class.
private void setHadoopConfiguration(Configuration conf) {
    conf.set("fs.defaultFS", hdfsUrl);
    conf.set("fs.s3a.access.key", s3AccessKey);
    conf.set("fs.s3a.secret.key", s3SecretKey);
    conf.set("fs.s3a.endpoint", s3EndPoint);
    conf.set("hadoop.job.ugi", hdfsUser);
    System.setProperty("com.amazonaws.services.s3.enableV4", "true");
}

public static void main(String[] args) {
    Configuration conf = new Configuration();
    setHadoopConfiguration(conf);
    try {
        // Build DistCp options from the source and destination paths and run the copy.
        DistCpOptions distCpOptions = OptionsParser.parse(new String[]{srcDir, dstDir});
        DistCp distCp = new DistCp(conf, distCpOptions);
        distCp.execute();
    } catch (Exception e) {
        logger.error("Exception occurred while copying file {}", srcDir, e);
    }
}

This code runs, but the problem is that it does not launch the DistCp job on the YARN cluster. It starts a LocalJobRunner instead, which is why it times out when copying large files:

[2020-08-23 21:16:53.759][LocalJobRunner Map Task Executor #0][INFO][S3AFileSystem:?] Getting path status for s3a://***.distcp.tmp.attempt_local367303638_0001_m_000000_0 (***.distcp.tmp.attempt_local367303638_0001_m_000000_0)
[2020-08-23 21:16:53.922][LocalJobRunner Map Task Executor #0][INFO][S3AFileSystem:?] Delete path s3a://***.distcp.tmp.attempt_local367303638_0001_m_000000_0 - recursive false
[2020-08-23 21:16:53.922][LocalJobRunner Map Task Executor #0][INFO][S3AFileSystem:?] Getting path status for s3a://*** .distcp.tmp.attempt_local367303638_0001_m_000000_0 (**.distcp.tmp.attempt_local367303638_0001_m_000000_0)
[2020-08-23 21:16:54.007][LocalJobRunner Map Task Executor #0][INFO][S3AFileSystem:?] Getting path status for s3a://****
[2020-08-23 21:16:54.118][LocalJobRunner Map Task Executor #0][ERROR][RetriableCommand:?] Failure in Retriable command: Copying hdfs://*** to s3a://***
com.amazonaws.SdkClientException: Unable to execute HTTP request: Read timed out
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleRetryableException(AmazonHttpClient.java:1189)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1135)

Please help me understand how to configure YARN so that the DistCp job runs on the cluster instead of locally.
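
In case it helps, this is the direction I have been looking at (just a sketch of my guess, not something I have confirmed): since I build the Configuration entirely in code, it may never pick up the cluster's yarn-site.xml / mapred-site.xml, so mapreduce.framework.name would stay at its default "local". The host names, ports and paths below are placeholders, not my real cluster values.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

private void setYarnConfiguration(Configuration conf) {
    // Guess: without yarn-site.xml / mapred-site.xml on the classpath,
    // mapreduce.framework.name defaults to "local" and LocalJobRunner is used.
    conf.set("mapreduce.framework.name", "yarn");
    // Placeholder ResourceManager host/ports - not my real cluster values.
    conf.set("yarn.resourcemanager.address", "rm-host:8032");
    conf.set("yarn.resourcemanager.scheduler.address", "rm-host:8030");

    // Or load the cluster's own config files instead of setting keys by hand
    // (paths are placeholders):
    // conf.addResource(new Path("/etc/hadoop/conf/yarn-site.xml"));
    // conf.addResource(new Path("/etc/hadoop/conf/mapred-site.xml"));
}

Is this the right approach, or is there something else that decides whether DistCp submits to YARN?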
