I want to compute the average of the temperatures given in the input file. My Mapper and Reducer look syntactically fine to me, but I keep getting the following error:
Unable to load realm info from SCDynamicStore
13/02/17 08:03:28 INFO mapred.JobClient: Task Id : attempt_201302170552_0009_m_000000_1, Status : FAILED
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.IntWritable
at org.apache.hadoop.examples.TempMeasurement$TempMapper.map(TempMeasurement.java:26)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
My mapper function is as follows:
public static class TempMapper extends Mapper<IntWritable, Text, IntWritable, FloatWritable> {

    @Override
    protected void map(IntWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        //code for getting date and temperature
        String temp = columns.get(3);
        context.write(new IntWritable(year), new FloatWritable(Float.valueOf(temp)));
    }
}
And the reducer is:
public static class IntSumReducer
        extends Reducer<IntWritable, FloatWritable, IntWritable, FloatWritable> {

    private FloatWritable result = new FloatWritable();

    public void reduce(IntWritable key, Iterable<FloatWritable> values, Context context)
            throws IOException, InterruptedException {
        //code for making calculations
        context.write(key, result);
    }
}
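The part marked "//code for making calculations" is just a sum-and-count average over the values for each key, roughly like this (simplified from the actual code):

float sum = 0;
int count = 0;
for (FloatWritable val : values) {
    sum += val.get();
    count++;
}
// average temperature for this key (year)
result.set(sum / count);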
The input file looks like this:
11111 , 0,19900101, 44.04 ,
11112, 0, 19900102, 50.00,
11113, 3, 19910203, 30.00,
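In case the job setup matters, the driver (main) is configured roughly like this (a simplified sketch; the job name and the argument paths are placeholders, not the exact code):

Configuration conf = new Configuration();
Job job = new Job(conf, "temp average");   // placeholder job name
job.setJarByClass(TempMeasurement.class);
job.setMapperClass(TempMapper.class);
job.setReducerClass(IntSumReducer.class);
job.setOutputKeyClass(IntWritable.class);
job.setOutputValueClass(FloatWritable.class);
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);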
Any help would be appreciated.