
I am setting up a standalone Spark cluster on an Azure VM. I want to run the Spark master using the Azure VM's public IP instead of the VM's hostname, so that I can reach it from other VMs.

Spark version: spark-2.2.0-bin-hadoop2.7

I created a new file "spark-env.sh" under the conf folder and added export SPARK_MASTER_HOST=xxxx

Started the master: sbin> ./start-master.sh
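For reference, this is roughly what I did (xxxx stands in for the VM's public IP):

    cd spark-2.2.0-bin-hadoop2.7/conf
    # new file spark-env.sh, pointing the master at the VM's public IP
    echo 'export SPARK_MASTER_HOST=xxxx' > spark-env.sh
    cd ../sbin
    ./start-master.sh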

I get the error mentioned below and the Spark master does not start.

How do I set a public IP address for the Spark master?

Error log

18/04/10 04:55:12 INFO SecurityManager: Changing view acls to: root
18/04/10 04:55:12 INFO SecurityManager: Changing modify acls to: root
18/04/10 04:55:12 INFO SecurityManager: Changing view acls groups to:
18/04/10 04:55:12 INFO SecurityManager: Changing modify acls groups to:
18/04/10 04:55:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7079. Attempting port 7080.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7080. Attempting port 7081.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7081. Attempting port 7082.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7082. Attempting port 7083.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7083. Attempting port 7084.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7084. Attempting port 7085.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7085. Attempting port 7086.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7086. Attempting port 7087.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7087. Attempting port 7088.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7088. Attempting port 7089.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7089. Attempting port 7090.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7090. Attempting port 7091.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7091. Attempting port 7092.
18/04/10 04:55:13 WARN Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed after 16 retries (starting from 7077)! Consider explicitly setting the appropriate port for the service 'sparkMaster' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)

1 Answer


For SPARK_MASTER_HOST you should use the VM's private IP, or 0.0.0.0; you cannot use the VM's public IP. On an Azure VM the public IP is NATed to the private IP and is not configured on any local network interface, so the master cannot bind to it and you get the error log above.
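A minimal spark-env.sh sketch, assuming the VM's private IP is 10.0.0.4 (a placeholder; check yours with ip addr or in the Azure portal):

    # conf/spark-env.sh
    # bind to an address that actually exists on the VM's network interface
    export SPARK_MASTER_HOST=10.0.0.4   # VM private IP, or 0.0.0.0 to bind on all interfaces

Then restart the master with sbin/start-master.sh.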

Now, since you want to access the Spark master via the public IP, you need to open ports 7077-7093 (not sure of the exact range, it depends on your services) in the Azure NSG and in the VM's firewall.
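As a hedged example, assuming the resource group is myRG and the NSG is myNSG (both placeholders) and the VM uses ufw as its firewall, something like this should open the range:

    # Azure NSG: allow inbound TCP 7077-7093
    az network nsg rule create --resource-group myRG --nsg-name myNSG \
        --name AllowSparkMaster --priority 300 --direction Inbound --access Allow \
        --protocol Tcp --destination-port-ranges 7077-7093

    # On the VM itself (if ufw is the active firewall)
    sudo ufw allow 7077:7093/tcp

If you also want to reach the master web UI from outside, open its port (8080 by default) the same way.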

answered 2018-04-11T01:57:09.347