I have two machines, A and B. I am trying to run a Spark master on machine A and a Spark worker on machine B. I have set machine B's hostname in conf/slaves in my Spark directory.
When I execute start-all.sh to start the master and the worker, I get the following messages on the console:
abc@abc-vostro:~/spark-scala-2.10$ sudo sh bin/start-all.sh
sudo: /etc/sudoers.d is world writable
starting spark.deploy.master.Master, logging to /home/abc/spark-scala-2.10/bin/../logs/spark-root-spark.deploy.master.Master-1-abc-vostro.out
13/09/11 14:54:29 WARN spark.Utils: Your hostname, abc-vostro resolves to a loopback address: 127.0.1.1; using 1XY.1XY.Y.Y instead (on interface wlan2)
13/09/11 14:54:29 WARN spark.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Master IP: abc-vostro
cd /home/abc/spark-scala-2.10/bin/.. ; /home/abc/spark-scala-2.10/bin/start-slave.sh 1 spark://abc-vostro:7077
xyz@1XX.1XX.X.X's password:
xyz@1XX.1XX.X.X: bash: line 0: cd: /home/abc/spark-scala-2.10/bin/..: No such file or directory
xyz@1XX.1XX.X.X: bash: /home/abc/spark-scala-2.10/bin/start-slave.sh: No such file or directory
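If I am reading the last two lines correctly, start-all.sh logs into machine B over SSH as user xyz and tries to run Spark from the same path it uses on machine A. Roughly, this is what the master appears to attempt (my reconstruction from the log above, not the script's actual code; the masked IP is copied from the output):

# reconstructed from the log: the master runs start-slave.sh remotely over SSH
ssh xyz@1XX.1XX.X.X 'cd /home/abc/spark-scala-2.10/bin/.. ; /home/abc/spark-scala-2.10/bin/start-slave.sh 1 spark://abc-vostro:7077'

That /home/abc/spark-scala-2.10 path evidently does not exist on machine B under the xyz account.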
The master starts, but the worker fails to start.
I have already set xyz@1XX.1XX.X.X in conf/slaves in my Spark directory.
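For reference, the whole file is just that single entry:

# contents of conf/slaves on machine A (IP masked as in the log above)
xyz@1XX.1XX.X.X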
Can anyone help me resolve this? I am probably missing some piece of configuration.
However, when I run the Spark master and the worker on the same machine, everything works fine.
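For comparison, in that single-machine case conf/slaves only contains the default entry (assuming the stock file, which I had not yet edited at that point):

# default conf/slaves: one worker on the local machine
localhost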