When I run my code, I get an ArrayIndexOutOfBoundsException from the reducer task.
My code is as follows:
public void map(ImageHeader key, FloatImage value, Context context)
        throws IOException, InterruptedException {
    if (value != null) {
        mapcounter++;  // mapcounter, fileSystem and outpath are class fields (not shown here)
        FloatImage gray = new FloatImage(value.getWidth(), value.getHeight(), value.getBands());
        int imageWidth = value.getWidth();
        int imageHeight = value.getHeight();
        // note: these loops stop one short of the width/height, so the
        // last row and column of gray are never written
        for (int x = 0; x < imageWidth - 1; x++) {
            for (int y = 0; y < imageHeight - 1; y++) {
                float red = value.getPixel(x, y, 0);
                float green = value.getPixel(x, y, 1);
                float blue = value.getPixel(x, y, 2);
                // average of R, G and B
                float avg = (red + blue + green) / 3;
                // set R, G and B to the average, producing a gray pixel
                gray.setPixel(x, y, 0, avg);
                gray.setPixel(x, y, 1, avg);
                gray.setPixel(x, y, 2, avg);
            }
        }
        ImageEncoder encoder = JPEGImageUtil.getInstance();
        FSDataOutputStream os = fileSystem.create(outpath);
        encoder.encodeImage(gray, key, os);
        os.flush();
        os.close();
        context.write(new BooleanWritable(true), new LongWritable(1));
    } else {
        context.write(new BooleanWritable(false), new LongWritable(0));
    }
}
public static class MyReducer extends Reducer<BooleanWritable, LongWritable, BooleanWritable, LongWritable> {
    public void reduce(BooleanWritable key, Iterable<LongWritable> values, Context context)
            throws IOException, InterruptedException {
        System.out.println("REDUCING");
        for (LongWritable temp_hash : values) {
            context.write(new BooleanWritable(true), new LongWritable(1));
        }
    }
}
The error is as follows:
...
12/12/30 09:06:01 INFO mapred.JobClient: map 100% reduce 33%
12/12/30 09:06:03 INFO mapred.JobClient: Task Id : attempt_201212271308_0005_r_000000_0, Status : FAILED
java.lang.ArrayIndexOutOfBoundsException: 1
at org.apache.hadoop.io.WritableComparator.readInt(WritableComparator.java:153)
at org.apache.hadoop.io.BooleanWritable$Comparator.compare(BooleanWritable.java:103)
at org.apache.hadoop.mapreduce.ReduceContext.nextKeyValue(ReduceContext.java:120)
at org.apache.hadoop.mapreduce.ReduceContext.nextKey(ReduceContext.java:92)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:175)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:566)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
How can I fix this?
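From the stack trace, the job fails while the framework deserializes and compares the map output keys (BooleanWritable$Comparator), which makes me suspect a mismatch between the output classes configured in the driver and the (BooleanWritable, LongWritable) pairs that map() actually writes. For reference, this is the kind of driver configuration I would expect the job to need; it is only a sketch, and the class name MyDriver plus the surrounding setup are placeholders, since my real driver is not shown here:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.BooleanWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.Job;

Configuration conf = new Configuration();
Job job = new Job(conf, "grayscale");
job.setJarByClass(MyDriver.class);        // placeholder driver class
// These must match the (key, value) types that map() actually emits;
// if they do not, the shuffle comparator reads the wrong bytes and can
// fail with exactly this kind of ArrayIndexOutOfBoundsException.
job.setMapOutputKeyClass(BooleanWritable.class);
job.setMapOutputValueClass(LongWritable.class);
job.setOutputKeyClass(BooleanWritable.class);
job.setOutputValueClass(LongWritable.class);
job.setReducerClass(MyReducer.class);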
Second question: how can I skip the reduce phase in my program so that it never runs at all?
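My understanding is that Hadoop runs a map-only job when the number of reduce tasks is set to zero; the map output then goes straight to the output format, with no shuffle, sort, or reducer involved. A minimal sketch, assuming the same Job object as in the driver sketch above:

// With zero reduce tasks the job becomes map-only: no shuffle or sort
// happens, MyReducer never runs, and map output is written directly.
job.setNumReduceTasks(0);

If that is correct, it would also sidestep the comparator error above, since the BooleanWritable keys would never be compared during a shuffle.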