I want to emit a two-dimensional double array using TwoDArrayWritable as the value. How should I write context.write(key, ...)?

Edit: And in the Reducer, how do I get the values back into a two-dimensional double array and print them?

This is what I wrote in the Mapper:
row = E.length;
col = E[0].length;
TwoDArrayWritable array = new TwoDArrayWritable(DoubleWritable.class);
DoubleWritable[][] myInnerArray = new DoubleWritable[row][col];
// set values in myInnerArray
for (int k1 = 0; k1 < row; k1++) {
    for (int j1 = 0; j1 < col; j1++) {
        myInnerArray[k1][j1] = new DoubleWritable(E[k1][j1]);
    }
    array.set(myInnerArray);
    context.write(clusterNumber, array);
}
But it throws a NullPointerException:
13/11/01 16:34:07 INFO mapred.LocalJobRunner: Map task executor complete.
13/11/01 16:34:07 WARN mapred.LocalJobRunner: job_local724758890_0001
java.lang.Exception: java.lang.NullPointerException
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.lang.NullPointerException
at org.apache.hadoop.io.TwoDArrayWritable.write(TwoDArrayWritable.java:91)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:100)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:84)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:945)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:601)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
at edu.Mapper.map(Mapper.java:277)
Mapper.java:277 is the line: context.write(clusterNumber, array);
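For what it's worth, the NPE at TwoDArrayWritable.write() most likely happens because set() and context.write() are called inside the outer loop: on the first iterations, the rows not yet filled still contain null DoubleWritable cells, and serializing a null element fails. A minimal sketch of the likely fix, filling every cell first and calling set() once after the loops, plus the Reducer-side read-back via get() (the class name TwoDArrayDemo and the sample data are hypothetical; in the real job the context.write call replaces the manual round-trip shown here):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.TwoDArrayWritable;
import org.apache.hadoop.io.Writable;

public class TwoDArrayDemo {
    public static void main(String[] args) throws Exception {
        double[][] E = {{1.5, 2.5}, {3.5, 4.5}};   // sample data (hypothetical)
        int row = E.length, col = E[0].length;

        // Fill EVERY cell before calling set(); a null cell is what makes
        // TwoDArrayWritable.write() throw a NullPointerException.
        DoubleWritable[][] myInnerArray = new DoubleWritable[row][col];
        for (int k1 = 0; k1 < row; k1++) {
            for (int j1 = 0; j1 < col; j1++) {
                myInnerArray[k1][j1] = new DoubleWritable(E[k1][j1]);
            }
        }

        TwoDArrayWritable array = new TwoDArrayWritable(DoubleWritable.class);
        array.set(myInnerArray);                    // set once, AFTER the loops
        // context.write(clusterNumber, array);     // in the real Mapper

        // Round-trip through Writable serialization, as the shuffle would do.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        array.write(new DataOutputStream(bos));
        TwoDArrayWritable received = new TwoDArrayWritable(DoubleWritable.class);
        received.readFields(
            new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));

        // Reducer side: get() returns Writable[][]; cast each element back
        // to DoubleWritable and print the double values.
        Writable[][] raw = received.get();
        for (int i = 0; i < raw.length; i++) {
            for (int j = 0; j < raw[i].length; j++) {
                System.out.print(((DoubleWritable) raw[i][j]).get() + " ");
            }
            System.out.println();
        }
    }
}
```

Note that the shuffle may reuse the value object across calls, so in a real Reducer you would copy the doubles out of get() before iterating to the next value.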