I'm seeing the exception below in the syslogs of failed map tasks; every map task in this particular job hits it.
Any guesses as to the cause? The stack trace looks strange to my eyes: the same frame at DelegatingMapper.run(DelegatingMapper.java:55) repeats over a thousand times, as if the mapper were recursively invoking itself.
2012-12-29 10:37:37,975 FATAL org.apache.hadoop.mapred.Child (main): Error running child : java.lang.StackOverflowError
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:80)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.setup(DelegatingMapper.java:46)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:54)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
<1014 duplicate lines cut>
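
For what it's worth, DelegatingMapper is the mapper class that MultipleInputs installs, so presumably this comes from the job being configured through MultipleInputs. The driver setup for that kind of job looks roughly like the sketch below (MapperA, MapperB, the paths, and the job name are all illustrative placeholders, not my actual code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

Configuration conf = new Configuration();
Job job = new Job(conf, "example-job");

// One mapper per input path; MultipleInputs wires these together by
// installing DelegatingInputFormat/DelegatingMapper, which is supposed
// to instantiate the real per-path mapper for each split in setup().
MultipleInputs.addInputPath(job, new Path("/input/a"),
    TextInputFormat.class, MapperA.class);
MultipleInputs.addInputPath(job, new Path("/input/b"),
    TextInputFormat.class, MapperB.class);

FileOutputFormat.setOutputPath(job, new Path("/output"));
// (output key/value classes, reducer, etc. omitted for brevity)
job.waitForCompletion(true);

Given that run() keeps re-entering itself, it looks as though the delegate mapper instantiated in DelegatingMapper.setup() is resolving to DelegatingMapper itself, which would recurse exactly like this until the stack overflows. I don't see how a setup like the above would produce that, though.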