I am trying to read records from a txt file and write them into HBase.
Job Class
Job job = new Job(conf, "HWriterJob");
job.setJarByClass(HWriterJob.class);
FileInputFormat.setInputPaths(job, new Path(otherArgs[0]));
job.setMapperClass(TokenizerMapper.class);
job.setOutputKeyClass(ImmutableBytesWritable.class);
job.setOutputValueClass(Put.class);
TableMapReduceUtil.initTableReducerJob(table, null, job);
Mapper Class
@Override
public void map(Text key, Text value, Context context)
        throws IOException, InterruptedException {
    String line = value.toString();
    StringTokenizer st = new StringTokenizer(line, "|");
    String[] result = new String[st.countTokens()];
    int i = 0;
    while (st.hasMoreTokens()) {
        result[i] = st.nextToken();
        i++;
    }
    Map<ImmutableBytesWritable, Put> resultSet = writeToHBase(result);
    for (Map.Entry<ImmutableBytesWritable, Put> entry : resultSet.entrySet()) {
        context.write(new Text(entry.getValue().getRow()), entry.getValue());
    }
}
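As an aside, `StringTokenizer` with `"|"` treats consecutive delimiters as one, so an empty field is silently dropped and later columns shift left. A standalone check (plain Java; `TokenizeCheck` and `tokenize` are hypothetical names mirroring the map logic above):

```java
import java.util.StringTokenizer;

public class TokenizeCheck {
    // Split a pipe-delimited record the same way the mapper does.
    static String[] tokenize(String line) {
        StringTokenizer st = new StringTokenizer(line, "|");
        String[] result = new String[st.countTokens()];
        int i = 0;
        while (st.hasMoreTokens()) {
            result[i] = st.nextToken();
            i++;
        }
        return result;
    }

    public static void main(String[] args) {
        // Empty fields are dropped, shifting later columns left.
        System.out.println(tokenize("row1|cf1|value").length); // 3
        System.out.println(tokenize("row1||value").length);    // 2, not 3
    }
}
```

If empty columns matter in the input file, `line.split("\\|", -1)` preserves them.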
Reducer Class
public void reduce(Text key, Iterable<Put> values, Context context)
        throws IOException, InterruptedException {
    for (Put val : values) {
        context.write(key, val);
    }
}
But doing the same thing has not worked for me. I get the following error:
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text
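For context on the exception: with the default TextInputFormat, the map input key is the byte offset of the line (a LongWritable), not Text, which matches the cast failure above. A sketch of the signatures I believe are expected (untested; assumes initTableReducerJob(table, null, job) installs the identity table reducer, so the mapper should emit ImmutableBytesWritable/Put directly):

```java
// Mapper: input key must be LongWritable under TextInputFormat.
public static class TokenizerMapper
        extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // ... parse value.toString() and build the Put as before ...
        // then emit the row key as ImmutableBytesWritable, not Text:
        // context.write(new ImmutableBytesWritable(put.getRow()), put);
    }
}

// Job setup: map output types must match what the mapper emits.
job.setMapOutputKeyClass(ImmutableBytesWritable.class);
job.setMapOutputValueClass(Put.class);
```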