
I have installed Spark 2.0.0 on 12 nodes (in standalone cluster mode), and when I start it I get this:

./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ibnb25.out

localhost192.17.0.17: ssh: Could not resolve hostname localhost192.17.0.17: Name or service not known

192.17.0.20: starting org.apache.spark.deploy.worker.Worker, logging to /home/mbala/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb28.out

192.17.0.21: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb29.out

192.17.0.19: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb27.out

192.17.0.18: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb26.out

192.17.0.24: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb32.out

192.17.0.22: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb30.out

192.17.0.25: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb33.out

192.17.0.28: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb36.out

192.17.0.27: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb35.out

192.17.0.17: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb25.out

192.17.0.26: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb34.out

192.17.0.23: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb31.out
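
For context, start-all.sh reads the worker hosts from conf/slaves, which expects one host or IP per line; a run-together name like "localhost192.17.0.17" in the error above usually means two entries ended up on the same line. A minimal sketch of the expected layout, using addresses from the logs (the actual file contents here are an assumption):

# conf/slaves -- one worker host or IP per line (assumed contents)
192.17.0.17
192.17.0.18
192.17.0.19
...
192.17.0.28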

I have set the master port = 8081; its IP = 192.17.0.17, which corresponds to HOSTNAME = ibnb25, and I started the cluster from that host.
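
For reference, a minimal sketch of how such settings are usually placed in conf/spark-env.sh; whether 8081 was set as the web UI port or as the master port is an assumption here:

# conf/spark-env.sh (sketch; values taken from the question, variable choice is an assumption)
export SPARK_MASTER_HOST=192.17.0.17       # master bind address (SPARK_MASTER_IP in pre-2.0 releases)
export SPARK_MASTER_WEBUI_PORT=8081        # master web UI port (default 8080)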

From my local machine I use this command to access the cluster:

 ssh mName@xx.xx.xx.xx 

When I want to access the web UI from my local machine, I used the IP address of the master (host ibnb25):

192.17.0.17:8081

But it could not be displayed, so I tried the address that I use to access the cluster:

xx.xx.xx.xx:8081

But nothing shows up in my browser..... What is going wrong?? Please help me.
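
(For reference: when the master's web UI is only reachable on a cluster-internal address, a common workaround is an SSH tunnel from the local machine; the user, gateway address and port below are only the ones already mentioned above, so this is a sketch rather than a confirmed fix.)

# forward local port 8081 to the master's web UI through the login host
ssh -L 8081:192.17.0.17:8081 mName@xx.xx.xx.xx

Then open http://localhost:8081 in the local browser.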


1 Answer


Your /etc/hosts file seems to be incorrectly set up.

You should get the hostname and IP with the following commands:

hostname
hostname -i
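
For example, on the master node described in the question, these would be expected to print something like (illustrative values taken from the question):

$ hostname
ibnb25
$ hostname -i
192.17.0.17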

Make sure there is a space between the IP and the hostname in each entry.

A sample /etc/hosts file looks like:

192.17.0.17  <hostname>
192.17.0.17  localhost
<Other IP1>  <other hostname1>
.
.
.
<Other IP-n>  <other hostname-n>

Make sure the /etc/hosts file on every node contains the IP-hostname entries for all nodes in the cluster.
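
One way to verify this could be to loop over the nodes and check name resolution, roughly like this (hosts and user are taken from the question, so treat it as a sketch):

# check that each node can resolve the master's hostname (sketch)
for host in 192.17.0.18 192.17.0.19 192.17.0.20; do
    ssh mName@$host "getent hosts ibnb25"
done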

For FQDN, read this.

Answered 2016-08-30T12:36:49.210