
I have a native library (Freeling) that I compile with cmake and `make install`, and install through a cluster initialization action (so it should be present on the master and on every worker).

Even so, I get this error when I call System.loadLibrary:

Exception in thread "main" java.lang.UnsatisfiedLinkError: no Jfreeling in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
    at java.lang.Runtime.loadLibrary0(Runtime.java:870)
    at java.lang.System.loadLibrary(System.java:1122)
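
The failing call is made from a static initializer, as described below. A minimal, self-contained sketch of that pattern (the class and helper names here are hypothetical; only `Jfreeling` comes from the error message):

```java
public class LoaderCheck {
    // Attempt to load a native library by its JNI name; return true on success.
    // System.loadLibrary("Jfreeling") searches java.library.path for libJfreeling.so.
    static boolean tryLoad(String name) {
        try {
            System.loadLibrary(name);
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Print the search path the JVM actually uses, to compare against
        // the directories the Spark properties below were supposed to add.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        if (!tryLoad("Jfreeling")) {
            System.out.println("libJfreeling.so not found on java.library.path");
        }
    }
}
```

Printing `java.library.path` on the driver and on an executor shows whether `spark.driver.extraLibraryPath` / `spark.executor.extraLibraryPath` actually reached the JVM; note that these properties expect directories, not `.so` files.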

I tried to make the program find the library (it is loaded in a static initializer block) with the following properties:

      "properties": {
        "spark.driver.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar:/usr/local/lib/libfreeling.so",
        "spark.executor.extraClassPath": "/usr/local/share/freeling/APIs/java/Jfreeling.jar:/usr/local/lib/libfreeling.so",
        "spark.executor.extraLibraryPath": "/usr/local/lib/libfreeling.so",
        "spark.driver.extraLibraryPath": "/usr/local/lib/libfreeling.so",
        "spark.executorEnv.LD_PRELOAD": "/usr/local/lib/libfreeling.so",
        "spark.yarn.dist.files": "/usr/local/lib/libfreeling.so",
        "spark.yarn.appMasterEnv.LD_PRELOAD": "libfreeling.so",
        "spark.files": "/usr/local/lib/libfreeling.so",
        "spark.executorEnv.LD_LIBRARY_PATH": "libfreeling.so"
      },
      "jarFileUris": [
        "file:///usr/local/share/freeling/APIs/java/Jfreeling.jar",
        "file:///usr/local/lib/libfreeling.so"
      ],

2 Answers


Could you try putting your library under /usr/lib/hadoop/lib/native/? In /etc/spark/conf/spark-env.sh there is:

# Spark got rid of SPARK_LIBRARY_PATH in 1.0
# It has properties for extraLibraryPaths, but this is more extensible
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native
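
In other words, copy the shared objects into the directory that spark-env.sh already puts on LD_LIBRARY_PATH on every node. A sketch (the helper name is made up; the source paths come from the question and the other answer):

```shell
# Hypothetical helper: copy a native library into a target directory.
copy_native() {
  local src="$1" dest_dir="$2"
  mkdir -p "$dest_dir"
  cp "$src" "$dest_dir/"
}

# On the cluster this would run on the master and every worker,
# e.g. from the same initialization action that installs Freeling:
#   copy_native /usr/local/lib/libfreeling.so /usr/lib/hadoop/lib/native
#   copy_native /usr/local/share/freeling/APIs/java/libJfreeling.so /usr/lib/hadoop/lib/native
```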
answered 2019-07-24T22:05:37.157

You should add /usr/local/share/freeling/APIs/java/Jfreeling.jar to CLASSPATH, and /usr/local/share/freeling/APIs/java/libJfreeling.so to LD_LIBRARY_PATH.
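
For a local run, that translates to roughly the following before launching the JVM (using the directory containing libJfreeling.so, since LD_LIBRARY_PATH entries must be directories, not files):

```shell
# Jar on the classpath, directory containing libJfreeling.so on the loader path.
export CLASSPATH="/usr/local/share/freeling/APIs/java/Jfreeling.jar:${CLASSPATH}"
export LD_LIBRARY_PATH="/usr/local/share/freeling/APIs/java:${LD_LIBRARY_PATH}"
```

On Linux, the JVM initializes java.library.path from LD_LIBRARY_PATH, so an equivalent per-run alternative is passing `-Djava.library.path=/usr/local/share/freeling/APIs/java` to `java`.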

answered 2019-08-15T08:06:37.027