I have a GenericUDF (code below) that runs fine on Hadoop-1 with Hive-0.12. But when I test the same GenericUDF on Hive-0.13 + Hadoop-2, I get the following error.
Vertex failed, vertexName=Map 12, vertexId=vertex_1409698731658_42202_1_00, diagnostics=[Vertex Input: ccv initializer failed., org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.xxx.xxx.Id1
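For context, the function is registered and called in the usual way before the query runs; the jar path and table name below are placeholders, not my actual setup:

-- Register the jar containing com.xxx.xxx.Id1, create the function, and call it
-- (placeholder jar path and table name).
ADD JAR /path/to/id1-udf.jar;
CREATE TEMPORARY FUNCTION id1 AS 'com.xxx.xxx.Id1';
SELECT id1() FROM some_table;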
Here is the code for my UDF.
package com.xxx.xxx;

import org.apache.hadoop.hive.ql.exec.MapredContext;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class Id1 extends GenericUDF {

    private MapredContext context;
    private long sequenceNum = 0;
    private static final int padLength = 10;
    StringBuilder sb = null;

    @Override
    public ObjectInspector initialize(ObjectInspector[] arguments)
            throws UDFArgumentException {
        sequenceNum = 0;
        sb = new StringBuilder();
        // The UDF takes no arguments and returns a string.
        return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
    }

    @Override
    public Object evaluate(DeferredObject[] arguments) throws HiveException {
        // Reuse the same StringBuilder: clear whatever the previous call left behind.
        int sbLength = sb.toString().length();
        if (sbLength > 0)
            sb.replace(0, sbLength, "");
        // Prefix the value with the task attempt id so ids are unique across tasks.
        String taskId = null;
        if (context.getJobConf() != null)
            taskId = context.getJobConf().get("mapred.taskid");
        sequenceNum++;
        if (taskId != null) {
            sb.append(taskId.replace("attempt_", ""));
        }
        // Append a per-task sequence number.
        String seqStr = String.valueOf(sequenceNum);
        sb.append(seqStr);
        return sb.toString();
    }

    @Override
    public String getDisplayString(String[] children) {
        return "id1()";
    }

    @Override
    public void configure(MapredContext context) {
        this.context = context;
    }
}
I am fairly sure this is related to Hive-0.13, but I can't find any posts about this error.