I am trying to implement a Hive UDF that takes parameters, so I am extending the GenericUDF class.
The problem is that the UDF works on string columns, but it throws an error when I run it on columns of any other data type. I want the UDF to run regardless of the column's data type.
Could someone please tell me what is wrong with the following code?
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector;

@Description(name = "Encrypt", value = "Encrypt the Given Column", extended = "SELECT Encrypt('Hello World!', 'Key');")
public class Encrypt extends GenericUDF {
    StringObjectInspector key;
    StringObjectInspector col;

    @Override
    public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
        if (arguments.length != 2) {
            throw new UDFArgumentLengthException("Encrypt only takes 2 arguments: T, String");
        }
        ObjectInspector keyObject = arguments[1];
        ObjectInspector colObject = arguments[0];
        if (!(keyObject instanceof StringObjectInspector)) {
            throw new UDFArgumentException("Error: Key Type is Not String");
        }
        this.key = (StringObjectInspector) keyObject;
        this.col = (StringObjectInspector) colObject;
        return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
    }

    @Override
    public Object evaluate(DeferredObject[] deferredObjects) throws HiveException {
        String keyString = key.getPrimitiveJavaObject(deferredObjects[1].get());
        String colString = col.getPrimitiveJavaObject(deferredObjects[0].get());
        return AES.encrypt(colString, keyString);
    }

    @Override
    public String getDisplayString(String[] strings) {
        return null;
    }
}
Error
java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaIntObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
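For reference, a minimal sketch of one possible approach, assuming the goal is simply to treat every primitive column value as a string before encrypting it: instead of casting arguments[0] to StringObjectInspector, initialize asks Hive for a Converter to a string inspector via ObjectInspectorConverters, and evaluate uses that converter. The class name EncryptAnyType is made up for illustration, and AES.encrypt(String, String) is assumed to be the same helper as in the question.

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector;

public class EncryptAnyType extends GenericUDF {
    // Converter that turns the column value (whatever its primitive type) into a String.
    private ObjectInspectorConverters.Converter colConverter;
    private StringObjectInspector keyOI;

    @Override
    public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
        if (arguments.length != 2) {
            throw new UDFArgumentLengthException("Encrypt only takes 2 arguments: T, String");
        }
        if (!(arguments[1] instanceof StringObjectInspector)) {
            throw new UDFArgumentException("Error: Key Type is Not String");
        }
        keyOI = (StringObjectInspector) arguments[1];
        // No cast on the column inspector: ask Hive for a converter to a string inspector instead.
        colConverter = ObjectInspectorConverters.getConverter(
                arguments[0], PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
    }

    @Override
    public Object evaluate(DeferredObject[] deferredObjects) throws HiveException {
        String keyString = keyOI.getPrimitiveJavaObject(deferredObjects[1].get());
        String colString = (String) colConverter.convert(deferredObjects[0].get());
        if (colString == null) {
            return null; // propagate NULLs instead of encrypting them
        }
        return AES.encrypt(colString, keyString); // AES.encrypt: the asker's own helper
    }

    @Override
    public String getDisplayString(String[] strings) {
        return "Encrypt(" + String.join(", ", strings) + ")";
    }
}

With this approach the first argument can be an int, bigint, double, date, etc.: Hive converts each value to its string form before it reaches AES.encrypt, so the ClassCastException above should no longer occur for primitive column types.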