
Hive does need to process 45 files, each about 1 GB in size. After the mappers reached 100%, Hive failed with the error message below.

Driver returned: 1.  Errors: OK
Hive history file=/tmp/hue/hive_job_log_hue_201308221004_1738621649.txt
Total MapReduce jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1376898282169_0441, Tracking URL = http://SH02SVR2882.hadoop.sh2.ctripcorp.com:8088/proxy/application_1376898282169_0441/
Kill Command = //usr/lib/hadoop/bin/hadoop job  -kill job_1376898282169_0441
Hadoop job information for Stage-1: number of mappers: 236; number of reducers: 0
2013-08-22 10:04:40,205 Stage-1 map = 0%,  reduce = 0%
2013-08-22 10:05:07,486 Stage-1 map = 1%,  reduce = 0%, Cumulative CPU 121.28 sec
.......................
2013-08-22 10:09:18,625 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 7707.18 sec
MapReduce Total cumulative CPU time: 0 days 2 hours 8 minutes 27 seconds 180 msec
Ended Job = job_1376898282169_0441
Ended Job = -541447549, job is filtered out (removed at runtime).
Ended Job = -1652692814, job is filtered out (removed at runtime).
Launching Job 3 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Job Submission failed with exception 
'java.io.IOException(Max block location exceeded for split: Paths:/tmp/hive-beeswax-logging/hive_2013-08-22_10-04-32_755_6427103839442439579/-ext-10001/000009_0:0+28909,....,/tmp/hive-beeswax-logging/hive_2013-08-22_10-04-32_755_6427103839442439579/-ext-10001/000218_0:0+45856 
Locations:10.8.75.17:...:10.8.75.20:; InputFormatClass: org.apache.hadoop.mapred.TextInputFormat
 splitsize: 45 maxsize: 10)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched: 
Job 0: Map: 236   Cumulative CPU: 7707.18 sec   HDFS Read: 63319449229 HDFS Write: 8603165 SUCCESS
Total MapReduce CPU Time Spent: 0 days 2 hours 8 minutes 27 seconds 180 msec

But I never set any max size, and I got the same error every time I ran the query. I tried adding the mapreduce.jobtracker.split.metainfo.maxsize property for Hive, but in that case Hive failed without launching any map tasks at all.


1 Answer


Set mapreduce.job.max.split.locations > 45.

In our case, that solved the problem.
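In Hive this property can be set per-session before running the query. A minimal sketch (the value 50 is an example; it just needs to exceed the split count reported in the error, 45 here):

```sql
-- Raise the allowed number of block locations per input split.
-- The error showed "splitsize: 45 maxsize: 10", so any value > 45 works.
SET mapreduce.job.max.split.locations=50;

-- Then re-run the failing query in the same session.
```

Alternatively, the same property could be set cluster-wide in mapred-site.xml, but the per-session `SET` keeps the change scoped to the problematic job.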

Answered 2013-08-23T09:40:53.810