I'm developing a Java application on Flink 1.14 that uses a UDF. I'm using the PipelineOptions.JARS config option to dynamically add the jar file containing the UDF class from the application code, but the application fails to load the UDF class from the configured jar file with a ClassNotFoundException.
I also tried PipelineOptions.CLASSPATHS, and it fails with exactly the same error and stack trace.
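For reference, the CLASSPATHS attempt only differed from the snippet further below in the option key; it looked roughly like this (the exact line is from memory):

configuration.set(PipelineOptions.CLASSPATHS, Collections.singletonList("file:///path/to/udf.jar"));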
The same application jar works fine if the classpath is updated through the Flink CLI, i.e. "flink run" with the "-C" option:
<FLINK_HOME>/bin/flink run --detached -C file:///path/to/udf.jar ...
The problem seems to be that the classpath of the ClassLoader used by codegen in the table planner is not updated with the Configuration passed to the StreamExecutionEnvironment, and I'm not sure how to make that happen.
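A minimal sketch of the direction I suspect is needed, assuming the jar simply has to be visible to the classloader that resolves the class (just an idea; the open question is how to make the planner's codegen use such a loader):

import java.net.URL;
import java.net.URLClassLoader;

// Load the UDF jar explicitly and resolve the class through that loader.
URLClassLoader udfClassLoader = new URLClassLoader(
        new URL[] {new URL("file:///path/to/udf.jar")},
        Thread.currentThread().getContextClassLoader());
Class<?> udfClass = Class.forName("demo.MyUDF", true, udfClassLoader);
// How to get the table planner's codegen to use this (or an equivalent) classloader is the part I can't figure out.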
Here is how the UDF jar file is added and the UDF is registered:
import java.util.Collections;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.PipelineOptions;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

final Configuration configuration = new Configuration();
configuration.set(PipelineOptions.JARS, Collections.singletonList("file:///path/to/udf.jar"));
StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.getExecutionEnvironment(configuration);
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(streamEnv);
...
Class udfClass = Class.forName("demo.MyUDF", ...);
tableEnv.createTemporarySystemFunction("MyUDF", udfClass);
...
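The failure surfaces when the registered function is actually used in a query and the result is converted back to a DataStream (see toDataStream at the bottom of the trace). A minimal sketch of that usage, with placeholder table and column names:

// Placeholder query; the real query and source table differ, but the shape is the same.
Table result = tableEnv.sqlQuery("SELECT MyUDF(f0) FROM source_table");
tableEnv.toDataStream(result).print();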
Here is the error stack trace:
Exception in thread "main" java.lang.ClassNotFoundException: demo.MyUDF
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at org.apache.flink.util.InstantiationUtil$ClassLoaderObjectInputStream.resolveClass(InstantiationUtil.java:78)
at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1886)
at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:617)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:602)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:589)
at org.apache.flink.table.planner.codegen.CodeGeneratorContext.addReusableObjectInternal(CodeGeneratorContext.scala:692)
at org.apache.flink.table.planner.codegen.CodeGeneratorContext.addReusableFunction(CodeGeneratorContext.scala:714)
at org.apache.flink.table.planner.codegen.calls.BridgingFunctionGenUtil$.generateFunctionAwareCall(BridgingFunctionGenUtil.scala:130)
at org.apache.flink.table.planner.codegen.calls.BridgingFunctionGenUtil$.generateFunctionAwareCallWithDataType(BridgingFunctionGenUtil.scala:116)
at org.apache.flink.table.planner.codegen.calls.BridgingFunctionGenUtil$.generateFunctionAwareCall(BridgingFunctionGenUtil.scala:73)
at org.apache.flink.table.planner.codegen.calls.BridgingSqlFunctionCallGen.generate(BridgingSqlFunctionCallGen.scala:81)
at org.apache.flink.table.planner.codegen.ExprCodeGenerator.generateCallExpression(ExprCodeGenerator.scala:825)
at org.apache.flink.table.planner.codegen.ExprCodeGenerator.visitCall(ExprCodeGenerator.scala:503)
at org.apache.flink.table.planner.codegen.ExprCodeGenerator.visitCall(ExprCodeGenerator.scala:58)
at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:70)
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:185)
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.toStreamInternal(StreamTableEnvironmentImpl.java:437)
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.toStreamInternal(StreamTableEnvironmentImpl.java:432)
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.toDataStream(StreamTableEnvironmentImpl.java:356)
...