"Bad substitution" when submitting a Spark job to yarn-cluster

When I submit a job to the YARN cluster, I get the following:

2016-02-25 19:49:11,029  INFO [Remote-akka.actor.default-dispatcher-4] (org.apache.spark.deploy.yarn.Client) - Application report for application_1456408114938_0007 (state: ACCEPTED)
2016-02-25 19:49:12,034  INFO [Remote-akka.actor.default-dispatcher-4] (org.apache.spark.deploy.yarn.Client) - Application report for application_1456408114938_0007 (state: ACCEPTED)
2016-02-25 19:49:13,039  INFO [Remote-akka.actor.default-dispatcher-4] (org.apache.spark.deploy.yarn.Client) - Application report for application_1456408114938_0007 (state: FAILED)
2016-02-25 19:49:13,040  INFO [Remote-akka.actor.default-dispatcher-4] (org.apache.spark.deploy.yarn.Client) -
         client token: N/A
         diagnostics: Application application_1456408114938_0007 failed 2 times due to AM Container for appattempt_1456408114938_0007_000002 exited with  exitCode: 1
For more detailed output, check application tracking page:http://m:8088/cluster/app/application_1456408114938_0007Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e03_1456408114938_0007_02_000001
Exit code: 1
Exception message: /hadoop/yarn/local/usercache/spark-notebook/appcache/application_1456408114938_0007/container_e03_1456408114938_0007_02_000001/launch_container.sh: line 24: $PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution

Stack trace: ExitCodeException exitCode=1: /hadoop/yarn/local/usercache/spark-notebook/appcache/application_1456408114938_0007/container_e03_1456408114938_0007_02_000001/launch_container.sh: line 24: $PWD:$PWD/__spark_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution
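For what it's worth, the failing classpath entry is `/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar`: `${hdp.version}` is an Ambari/Java-style property placeholder, not a shell variable, so bash reports "bad substitution" when `launch_container.sh` runs. A commonly suggested workaround on HDP is to pass `hdp.version` explicitly to the driver and ApplicationMaster JVMs. A minimal sketch (the version string `2.3.4.0-3485` is a placeholder; substitute your cluster's actual HDP build, e.g. from `hdp-select status hadoop-client`):

```shell
# Workaround sketch: define hdp.version for the driver and AM JVMs so the
# ${hdp.version} placeholders in the generated classpath can resolve.
# NOTE: 2.3.4.0-3485 is a PLACEHOLDER for your cluster's HDP build.
spark-submit \
  --master yarn-cluster \
  --conf "spark.driver.extraJavaOptions=-Dhdp.version=2.3.4.0-3485" \
  --conf "spark.yarn.am.extraJavaOptions=-Dhdp.version=2.3.4.0-3485" \
  --class org.apache.spark.examples.SparkPi \
  /usr/hdp/current/spark-client/lib/spark-examples-*.jar
```

Equivalently, putting `-Dhdp.version=<your build>` in a `java-opts` file under the Spark conf directory is often suggested for HDP installs, so every submission picks it up.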

The following do work: Zeppelin works,

and the SparkPi example works:

export MASTER=yarn-client
. /etc/spark/conf/spark-env.sh
./run-example SparkPi
16/02/25 19:54:38 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:36, took 1.458580 s

Pi is roughly 3.14232
