
When starting pyspark from the command line, everything works as expected. However, when going through Livy, it does not.

I set up the connection with Postman. First, I POST this to the sessions endpoint:

{
  "kind": "pyspark",
  "proxyUser": "spark"
}
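For reference, the same request can be issued straight from Python's standard library instead of Postman. The host and port below are assumptions for a default Livy install (Livy listens on 8998 by default); adjust them for your cluster:

```python
import json
from urllib import request

# Session payload matching the body above
payload = {"kind": "pyspark", "proxyUser": "spark"}

# Assumed Livy endpoint; replace localhost:8998 with your Livy server
req = request.Request(
    "http://localhost:8998/sessions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# resp = request.urlopen(req)  # uncomment against a live Livy server
```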

The session starts and I can see Spark starting up on YARN. However, I get this error in the container log:

18/09/12 15:53:00 ERROR repl.PythonInterpreter: Process has died with 1
18/09/12 15:53:00 ERROR repl.PythonInterpreter: Traceback (most recent call last):
  File "/yarn/nm/usercache/livy/appcache/application_1535188013308_0051/container_1535188013308_0051_01_000001/tmp/3015653701235928503", line 643, in <module>
    sys.exit(main())
  File "/yarn/nm/usercache/livy/appcache/application_1535188013308_0051/container_1535188013308_0051_01_000001/tmp/3015653701235928503", line 533, in main
    exec('from pyspark.shell import sc', global_dict)
  File "<string>", line 1, in <module>
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/shell.py", line 38, in <module>
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/context.py", line 292, in _ensure_initialized
  File "/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/python/lib/pyspark.zip/pyspark/java_gateway.py", line 47, in launch_gateway
  File "/usr/lib64/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'PYSPARK_GATEWAY_SECRET'

The output of sessions/XYZ/log is:

{
    "id": 16,
    "from": 0,
    "total": 46,
    "log": [
        "stdout: ",
        "\nstderr: ",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/livy-api-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/livy-rsc-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/rsc/netty-all-4.0.29.Final.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/commons-codec-1.9.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/livy-core_2.11-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/livy/repl/livy-repl_2.11-0.4.0-SNAPSHOT.jar.",
        "Warning: Skip remote jar hdfs://master1.lama.nuc:8020/lama/lama.main-assembly-0.9.0-spark2.3.0-hadoop2.6.5-SNAPSHOT.jar.",
        "18/09/12 15:52:50 INFO client.RMProxy: Connecting to ResourceManager at master1.lama.nuc/192.168.42.100:8032",
        "18/09/12 15:52:51 INFO yarn.Client: Requesting a new application from cluster with 6 NodeManagers",
        "18/09/12 15:52:51 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (12288 MB per container)",
        "18/09/12 15:52:51 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead",
        "18/09/12 15:52:51 INFO yarn.Client: Setting up container launch context for our AM",
        "18/09/12 15:52:51 INFO yarn.Client: Setting up the launch environment for our AM container",
        "18/09/12 15:52:51 INFO yarn.Client: Preparing resources for our AM container",
        "18/09/12 15:52:51 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/livy-api-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/livy-rsc-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/rsc/netty-all-4.0.29.Final.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/commons-codec-1.9.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/livy-core_2.11-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/livy/repl/livy-repl_2.11-0.4.0-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://master1.lama.nuc:8020/lama/lama.main-assembly-0.9.0-spark2.3.0-hadoop2.6.5-SNAPSHOT.jar",
        "18/09/12 15:52:52 INFO yarn.Client: Uploading resource file:/tmp/spark-37413ebc-9427-44d8-8a01-c4222eb899f8/__spark_conf__7516701035111969209.zip -> hdfs://master1.lama.nuc:8020/user/livy/.sparkStaging/application_1535188013308_0051/__spark_conf__.zip",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing view acls to: livy",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing modify acls to: livy",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing view acls groups to: ",
        "18/09/12 15:52:53 INFO spark.SecurityManager: Changing modify acls groups to: ",
        "18/09/12 15:52:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(livy); groups with view permissions: Set(); users  with modify permissions: Set(livy); groups with modify permissions: Set()",
        "18/09/12 15:52:57 INFO yarn.Client: Submitting application application_1535188013308_0051 to ResourceManager",
        "18/09/12 15:52:57 INFO impl.YarnClientImpl: Submitted application application_1535188013308_0051",
        "18/09/12 15:52:57 INFO yarn.Client: Application report for application_1535188013308_0051 (state: ACCEPTED)",
        "18/09/12 15:52:57 INFO yarn.Client: ",
        "\t client token: N/A",
        "\t diagnostics: N/A",
        "\t ApplicationMaster host: N/A",
        "\t ApplicationMaster RPC port: -1",
        "\t queue: root.users.livy",
        "\t start time: 1536760377659",
        "\t final status: UNDEFINED",
        "\t tracking URL: http://master1.lama.nuc:8088/proxy/application_1535188013308_0051/",
        "\t user: livy",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Shutdown hook called",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-795d9b05-a5ad-4930-ad8b-77034022bc17",
        "18/09/12 15:52:57 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-37413ebc-9427-44d8-8a01-c4222eb899f8",
        "\nYARN Diagnostics: "
    ]
}

What is the problem here? I am using CDH 5.15.0 with Parcels and Spark2. With Scala there is no problem.

Follow-up

I changed the deploy mode from cluster to client. The KeyError is gone, but when I try to run something as simple as sc.version, I get Interpreter died without any traceback or error.
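For anyone trying the same workaround: the deploy mode is configured in livy.conf. The key names below follow the Livy configuration template; verify them against your Livy version:

```
# livy.conf -- run the driver in client mode on YARN
livy.spark.master = yarn
livy.spark.deploy-mode = client
```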


1 Answer


I ran into the same problem and solved it by upgrading to Livy 0.5.0.

Apparently, CDH 5.15.0 ships a fix for a security vulnerability (CVE-2018-1334), and that fix introduced an incompatibility with Livy < 0.5.0. Thanks to Marcelo Vanzin for posting this on the livy-user mailing list archive.
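For context on why the traceback ends in a KeyError: Spark's CVE-2018-1334 fix makes the Python gateway expect an auth secret in the child process environment, which older launchers (Livy < 0.5.0) never set. A simplified sketch of the lookup (not Spark's actual code):

```python
import os

def read_gateway_secret(env=os.environ):
    # Simplified sketch: pyspark's java_gateway looks up the secret that
    # the launcher is expected to place in the environment. With an old
    # Livy, the variable is absent and the lookup raises KeyError.
    return env["PYSPARK_GATEWAY_SECRET"]

# Simulate the environment an old launcher provides (no secret set):
try:
    read_gateway_secret(env={})
except KeyError as err:
    missing = err.args[0]
```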

Answered 2018-10-13T11:37:50.107