
I am setting up Spark on Mesos 0.21.0 with Hadoop 2.3.0. When I try Spark on the master, I get these error messages in the Mesos slave's stderr:

WARNING: Logging before InitGoogleLogging() is written to STDERR

I1229 12:34:45.923665  8571 fetcher.cpp:76] Fetching URI 'hdfs://10.170.207.41/spark/spark-1.2.0.tar.gz'

I1229 12:34:45.925240  8571 fetcher.cpp:105] Downloading resource from 'hdfs://10.170.207.41/spark/spark-1.2.0.tar.gz' to '/tmp/mesos/slaves/20141226-161203-701475338-5050-6942-S0/frameworks/20141229-111020-701475338-5050-985-0001/executors/20141226-161203-701475338-5050-6942-S0/runs/8ef30e72-d8cf-4218-8a62-bccdf673b5aa/spark-1.2.0.tar.gz'

E1229 12:34:45.927089  8571 fetcher.cpp:109] HDFS copyToLocal failed: hadoop fs -copyToLocal 'hdfs://10.170.207.41/spark/spark-1.2.0.tar.gz' '/tmp/mesos/slaves/20141226-161203-701475338-5050-6942-S0/frameworks/20141229-111020-701475338-5050-985-0001/executors/20141226-161203-701475338-5050-6942-S0/runs/8ef30e72-d8cf-4218-8a62-bccdf673b5aa/spark-1.2.0.tar.gz'

sh: 1: hadoop: not found

Failed to fetch: hdfs://10.170.207.41/spark/spark-1.2.0.tar.gz

Failed to synchronize with slave (it's probably exited)

Interestingly, when I switch to the slave node and run the same command:

hadoop fs -copyToLocal 'hdfs://10.170.207.41/spark/spark-1.2.0.tar.gz' '/tmp/mesos/slaves/20141226-161203-701475338-5050-6942-S0/frameworks/20141229-111020-701475338-5050-985-0001/executors/20141226-161203-701475338-5050-6942-S0/runs/8ef30e72-d8cf-4218-8a62-bccdf673b5aa/spark-1.2.0.tar.gz'

it works fine.


1 Answer


When launching the Mesos slave, you have to specify the path to your Hadoop installation via the following parameter:

--hadoop_home=/path/to/hadoop

Without it, it did not work for me, even though I had the HADOOP_HOME environment variable set.
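A minimal sketch of such a launch line, assuming the master from the logs above (10.170.207.41:5050) and a hypothetical Hadoop install path of /usr/local/hadoop:

```shell
# Point the slave at the Hadoop installation so the fetcher can shell out
# to `hadoop fs -copyToLocal` when resolving hdfs:// URIs.
# /usr/local/hadoop is an assumed path; substitute your own installation.
mesos-slave --master=10.170.207.41:5050 --hadoop_home=/usr/local/hadoop
```

With --hadoop_home set, the slave invokes the hadoop binary from that installation instead of relying on it being on the shell's PATH, which is what the `sh: 1: hadoop: not found` error indicates was happening here.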

answered 2015-04-26T07:31:22.033