
I'm seeing this exception in the syslogs of failed map tasks; every map task in a particular job is hitting the same error.

Any guesses as to the cause? This is a strange-looking stack trace to my eyes.

    2012-12-29 10:37:37,975 FATAL org.apache.hadoop.mapred.Child (main): Error running child : java.lang.StackOverflowError
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:80)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.setup(DelegatingMapper.java:46)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:54)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    <1014 duplicate lines cut>

1 Answer


Looking at the source of DelegatingMapper.java, I suspect you have somehow set DelegatingMapper as your actual Mapper class. As a result, it infinitely delegates run() to itself until the stack overflows.
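
For reference: DelegatingMapper.setup() instantiates the mapper class recorded for the current input split and run() then calls run() on that instance, so if the recorded class is DelegatingMapper itself, every call re-enters run(). A minimal sketch of how this can happen via MultipleInputs, which is the usual way DelegatingMapper ends up as a job's mapper (MyMapper and the path arguments are hypothetical, shown only to contrast the buggy and correct registration):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.DelegatingMapper;
    import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class DelegatingMapperDemo {

        // A real mapper implementation (hypothetical placeholder).
        public static class MyMapper
                extends Mapper<LongWritable, Text, Text, LongWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws java.io.IOException, InterruptedException {
                context.write(value, key);
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "delegating-mapper-demo");
            job.setJarByClass(DelegatingMapperDemo.class);

            // BUG: MultipleInputs installs DelegatingMapper as the job's mapper
            // and records the class passed here against the input split.
            // DelegatingMapper.setup() instantiates that class and run() calls
            // run() on it, so registering DelegatingMapper itself recurses
            // until java.lang.StackOverflowError:
            //MultipleInputs.addInputPath(job, new Path(args[0]),
            //        TextInputFormat.class, DelegatingMapper.class);

            // FIX: register the actual Mapper implementation instead.
            MultipleInputs.addInputPath(job, new Path(args[0]),
                    TextInputFormat.class, MyMapper.class);

            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileOutputFormat.setOutputPath(job, new Path(args[1]));

            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Check your job setup for any place where DelegatingMapper.class is passed explicitly; it is meant to be wired in by the framework, never registered by user code.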

answered 2013-01-02T08:41:04.800