I'm trying to install HDFS, YARN, Spark, etc. on a local cluster of CentOS 6.6 machines using Ambari 2.1.0 and HDP 2.3. A previous attempt to upgrade from HDP 2.2 had failed, so before starting over I removed all of the HDP 2.2 packages plus Ambari. I was able to get through most of the Cluster Install Wizard without problems, but at the "Install, Start and Test" stage I get the following error message:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
AfterInstallHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 35, in hook
link_configs(self.stroutfile)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 91, in link_configs
_link_configs(k, json_version, v)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 156, in _link_configs
conf_select.select("HDP", package, version)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 241, in select
shell.checked_call(get_cmd("set-conf-dir", package, version), logoutput=False, quiet=False, sudo=True)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'conf-select set-conf-dir --package spark --stack-version 2.3.0.0-2557 --conf-version 0' returned 1. spark not installed or incorrect package name
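As I understand it (this is my reading of the hook script, so treat it as an assumption), conf-select manages the /etc/<component>/conf symlinks for a given stack version and bails out if /usr/hdp/<version>/<package> does not exist. The failing command can be re-run by hand on the affected node, which at least gives me a quick way to test fixes:

# Re-run the exact command from the traceback as root; presumably it keeps
# failing with "spark not installed" until the spark directory exists
conf-select set-conf-dir --package spark --stack-version 2.3.0.0-2557 --conf-version 0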
The hook script appears to be looking for Spark under /usr/hdp/2.3.0.0-2557. This is what I see in that directory:
ls /usr/hdp/2.3.0.0-2557/
etc hadoop hadoop-hdfs hadoop-mapreduce hadoop-yarn ranger-hdfs-plugin ranger-yarn-plugin usr zookeeper
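There is no spark directory there. For comparison, hdp-select (which manages the /usr/hdp/current symlinks) can report what it thinks is installed for this stack version; I would expect Spark to be missing from its output too, though I haven't confirmed that:

# List the component-to-version mappings hdp-select knows about
hdp-select status | grep -i spark
# List the stack versions hdp-select considers installed
hdp-select versions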
Yet on one of the slaves that is complaining, Spark does appear to be "installed":
# yum list installed | grep spark
spark_2_3_0_0_2557.noarch
spark_2_3_0_0_2557-master.noarch
spark_2_3_0_0_2557-python.noarch
spark_2_3_0_0_2557-worker.noarch
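Since yum believes the packages are installed but /usr/hdp/2.3.0.0-2557/spark is missing, my plan is to check where the RPM payload actually went and, if it really is gone, force a reinstall. This is standard yum/rpm usage rather than anything HDP-specific, so I'm sketching it from memory:

# Where did the spark RPM put its files? I expect paths under /usr/hdp/2.3.0.0-2557/spark
rpm -ql spark_2_3_0_0_2557 | head

# Verify the installed payload against the filesystem; missing files are reported here
rpm -V spark_2_3_0_0_2557

# If the files really are absent, reinstalling should lay them down again
yum reinstall -y spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker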
Any ideas on how to fix this?