2

I get the following error when installing the HDFS client through Ambari. I have reset the server several times, but the problem persists. Any idea how to fix this?

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

4 Answers

1
yum -y erase hdp-select

If you have installed several times, some packages may not have been cleaned up.

To remove all HDP packages and start a fresh installation, erase hdp-select.

If that doesn't help, delete all versions under /usr/hdp if that directory contains more than one HDP version.

Remove all installed packages, such as hadoop, hdfs, zookeeper, etc.:

yum remove zookeeper* hadoop* hdp*
Answered 2017-09-18T06:22:45.413
1

That path is a symlink pointing to /etc/hadoop/conf.

I ran

python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users

After it runs, it deletes /etc/hadoop/conf.

However, reinstalling does not recreate it.

So you may have to create all the conf files yourself. Hopefully someone can patch this.
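Recreating the layout by hand might look like the following sketch. It assumes the structure described in this thread (real configs in /etc/hadoop/conf, with /usr/hdp/current/hadoop-client/conf as a symlink to it); the function name and the optional root-prefix argument are illustrative, not part of Ambari.

```shell
# Sketch: recreate the conf directory and the symlink Ambari expects.
# The optional first argument is a root prefix (useful for dry runs);
# on the failing host you would call it with no arguments.
restore_conf_layout() {
    root=${1:-}
    conf_dir="$root/etc/hadoop/conf"                      # real config directory
    link="$root/usr/hdp/current/hadoop-client/conf"       # path Ambari writes into

    mkdir -p "$conf_dir"
    mkdir -p "$(dirname "$link")"
    # Recreate the symlink only if nothing exists at that path yet
    [ -e "$link" ] || ln -s "$conf_dir" "$link"
}
```

You would still need to repopulate /etc/hadoop/conf with the *-site.xml files, for example by copying them from a working node.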

Answered 2015-12-07T07:53:21.270
0

I ran into the same problem: I am using HDP 2.3.2 on CentOS 7.

First problem: some conf files point to the /etc//conf directory (as they should), but /etc//conf in turn points to another conf directory, which creates an infinite loop.

I was able to fix this by deleting the /etc//conf symlink and creating a real directory in its place.

Second problem: if you run the Python cleanup script to wipe the installation and start over, several directories, such as the hadoop-client directory, are not recreated. That causes exactly the error message above. The cleanup script also does not work well in general, because it leaves several users and directories behind; you have to userdel and groupdel them yourself.
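A quick way to spot the circular symlink described above is to follow the link chain a bounded number of times; if it never reaches a non-symlink path, the chain is effectively a loop. This helper function is a sketch, not part of Ambari or HDP:

```shell
# Sketch: follow a symlink chain, giving up after 8 hops.
# Prints where the path resolves, or reports a loop and returns 1.
follow_symlinks() {
    target=$1
    i=0
    while [ -L "$target" ]; do
        i=$((i + 1))
        if [ "$i" -gt 8 ]; then
            echo "loop detected starting at $1"
            return 1
        fi
        # Note: readlink may print a relative path, so run this
        # from the directory containing the link.
        target=$(readlink "$target")
    done
    echo "$1 resolves to $target"
}
```

For example, `follow_symlinks /etc/hadoop/conf` would report the loop instead of hanging.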

Update: this appears to be specific to HDP 2.3.2. With HDP 2.3.4 I no longer hit this issue.

Answered 2016-01-06T08:30:53.317
-1

Creating /usr/hdp/current/hadoop-client/conf on the failing host should fix the problem.
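If nothing exists at that path, this is a one-liner; note, though, that where the path is supposed to be a symlink to /etc/hadoop/conf (as another answer here points out), this creates a plain directory there instead:

```shell
# Create the missing parent directory so Ambari's File resource can
# write hadoop-policy.xml into it.
mkdir -p /usr/hdp/current/hadoop-client/conf
```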

Answered 2015-10-26T18:24:09.450