I'm trying to use distcp to copy data from a local Hadoop cluster to an S3 bucket.
Sometimes it "works", but some of the mappers fail with the stack trace below. Other times, so many mappers fail that the whole job is killed.
The error "No space available in any of the local directories" makes no sense to me. There is plenty of free space on the edge node (where the distcp command is run), on the cluster, and in the S3 bucket.
Can anyone shed some light on this?
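For reference, the copy is essentially of this form (the source and destination match what appears in the stack trace; the real paths are elided here just as they are in the trace):

hadoop distcp hdfs://<hdfs path>/ s3n://<bucket>/<s3 path>/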
16/06/16 15:48:08 INFO mapreduce.Job: The url to track the job: <url>
16/06/16 15:48:08 INFO tools.DistCp: DistCp job-id: job_1465943812607_0208
16/06/16 15:48:08 INFO mapreduce.Job: Running job: job_1465943812607_0208
16/06/16 15:48:16 INFO mapreduce.Job: Job job_1465943812607_0208 running in uber mode : false
16/06/16 15:48:16 INFO mapreduce.Job: map 0% reduce 0%
16/06/16 15:48:23 INFO mapreduce.Job: map 33% reduce 0%
16/06/16 15:48:26 INFO mapreduce.Job: Task Id : attempt_1465943812607_0208_m_000001_0, Status : FAILED
Error: java.io.IOException: File copy failed: hdfs://<hdfs path>/000000_0 --> s3n://<bucket>/<s3 path>/000000_0
at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:285)
at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:253)
at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:50)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://<hdfs path>/000000_0 to s3n://<bucket>/<s3 path>/000000_0
at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:281)
... 10 more
Caused by: org.apache.hadoop.util.DiskChecker$DiskErrorException: No space available in any of the local directories.
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:366)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.createTmpFileForWrite(LocalDirAllocator.java:416)
at org.apache.hadoop.fs.LocalDirAllocator.createTmpFileForWrite(LocalDirAllocator.java:198)
at org.apache.hadoop.fs.s3native.NativeS3FileSystem$NativeS3FsOutputStream.newBackupFile(NativeS3FileSystem.java:263)
at org.apache.hadoop.fs.s3native.NativeS3FileSystem$NativeS3FsOutputStream.<init>(NativeS3FileSystem.java:245)
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.create(NativeS3FileSystem.java:412)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:986)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToFile(RetriableFileCopyCommand.java:174)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:123)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:99)
at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
... 11 more
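If I'm reading the trace correctly, NativeS3FsOutputStream.newBackupFile is asking LocalDirAllocator for a local temp file, which suggests that s3n:// stages each file on local disk before uploading it to S3. The directories it allocates from are controlled by the fs.s3.buffer.dir property, which defaults to ${hadoop.tmp.dir}/s3 on the nodes running the map tasks, not on the edge node. Assuming that is the mechanism, pointing the buffer at a larger volume would look roughly like this (the /mnt/large path is hypothetical, just an example of a disk with room):

# Hypothetical: redirect s3n's local staging area to a volume with more space.
# fs.s3.buffer.dir must exist and be writable on every node that runs mappers.
hadoop distcp \
  -Dfs.s3.buffer.dir=/mnt/large/s3-buffer \
  hdfs://<hdfs path>/ \
  s3n://<bucket>/<s3 path>/

Since the buffering happens on the worker nodes, that directory would need space on every node running a mapper, which might explain why free space on the edge node doesn't help. I haven't confirmed this is the cause, though, so corrections welcome.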