We are building a vanilla Hadoop 2.7.3 cluster with Hive and HBase plus Kerberos, using the Bigtop repo for Hadoop to simplify the install.

The deployment script installed Hive and its components successfully, but even though we started the metastore and HiveServer2 (quick diagnostic checks follow this list):

  • it is not listening on port 10000,
  • we cannot connect with beeline,
  • there are no errors, and
  • it does not even create a hiveserver2.log file.
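
A minimal sketch of the checks for the first and last points above (the /var/log/hive and /tmp/hive paths are assumptions about where Bigtop/Hive might put the logs):

    # Is anything listening on the HiveServer2 (10000/10001) or metastore (9083) ports?
    ss -ltnp | grep -E ':10000|:10001|:9083'

    # Where do the Hive logs end up? With the stock hive-log4j.properties they often
    # land under ${java.io.tmpdir}/${user.name} (i.e. /tmp/hive), not /var/log/hive
    ls -l /var/log/hive /tmp/hive 2>/dev/null
    find /tmp /var/log -maxdepth 2 -name 'hive*.log' 2>/dev/null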

ps -ef | grep hive shows the following output:

   hive      9043     1  2 10:57 ?        00:00:23 /usr/lib/jvm/java-openjdk/bin/java -Xmx256m -Djava.security.krb5.conf=/etc/krb5.conf -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-service-1.2.1.jar org.apache.hadoop.hive.metastore.HiveMetaStore
hive      9751     1  2 11:04 ?        00:00:11 /usr/lib/jvm/java-openjdk/bin/java -Xmx256m -Djava.security.krb5.conf=/etc/krb5.conf -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-service-1.2.1.jar org.apache.hive.service.server.HiveServer2
root     10285  7469  0 11:13 pts/1    00:00:00 grep hive

Connecting with beeline:

[root@wnode55 ~]# beeline -u 'jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM'
ls: cannot access /usr/lib/spark/lib/spark-assembly-*.jar: No such file or directory
Connecting to jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM
Error: Could not open client transport with JDBC Uri: jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Beeline version 1.2.1 by Apache Hive
0: jdbc:hive2://wnode55.domain_name.com:10000 (closed)>
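
Since the error is "Connection refused", it looks like nothing is bound on the port at the TCP level (a Kerberos/SASL problem would normally fail with a GSS error after the socket opens). A quick way to confirm that from the client side, plus a check that the Kerberos ticket is in place:

    # Does anything answer on the HiveServer2 port at all?
    nc -vz wnode55.domain_name.com 10000

    # Is a Kerberos ticket in place for the beeline user? (not related to
    # "Connection refused" itself, but worth ruling out before retrying)
    klist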

hive-site.xml

[root@wnode55 ~]# cat /etc/hive/conf/hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Licensed to the Apache Software Foundation (ASF) under one or more       -->
<!-- contributor license agreements.  See the NOTICE file distributed with    -->
<!-- this work for additional information regarding copyright ownership.      -->
<!-- The ASF licenses this file to You under the Apache License, Version 2.0  -->
<!-- (the "License"); you may not use this file except in compliance with     -->
<!-- the License.  You may obtain a copy of the License at                    -->
<!--                                                                          -->
<!--     http://www.apache.org/licenses/LICENSE-2.0                           -->
<!--                                                                          -->
<!-- Unless required by applicable law or agreed to in writing, software      -->
<!-- distributed under the License is distributed on an "AS IS" BASIS,        -->
<!-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -->
<!-- See the License for the specific language governing permissions and      -->
<!-- limitations under the License.                                           -->

<configuration>

<!-- Hive Configuration can either be stored in this file or in the hadoop configuration files  -->
<!-- that are implied by Hadoop setup variables.                                                -->
<!-- Aside from Hadoop setup variables - this file is provided as a convenience so that Hive    -->
<!-- users do not have to edit hadoop configuration files (that may be managed as a centralized -->
<!-- resource).                                                                                 -->

<!-- Hive Execution Parameters -->




<property>
  <name>hbase.zookeeper.quorum</name>
  <value>wnode55.domain_name.com</value>
  <description>http://wiki.apache.org/hadoop/Hive/HBaseIntegration</description>
</property>


<property>
  <name>hive.execution.engine</name>
  <value>mr</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/var/lib/hive/metastore/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>hive.hwi.war.file</name>
  <value>/usr/lib/hive/lib/hive-hwi.war</value>
  <description>This is the WAR file with the jsp content for Hive Web Interface</description>
</property>

<property>
   <name>hive.server2.allow.user.substitution</name>
   <value>true</value>
</property>

<property>
   <name>hive.server2.enable.doAs</name>
   <value>true</value>
</property>

<property>
   <name>hive.server2.thrift.port</name>
   <value>10000</value>
</property>

<property>
   <name>hive.server2.thrift.http.port</name>
   <value>10001</value>
</property>


<property>
   <name>hive.metastore.uris</name>
   <value>thrift:/wnode55.domain_name.com:9083</value>
</property>


<property>
   <name>hive.security.metastore.authorization.manager</name>
   <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
</property>

<property>
    <name>hive.server2.authentication</name>
    <value>KERBEROS</value>
</property>
<property>
    <name>hive.server2.authentication.kerberos.principal</name>
    <value>hive/_HOST@domain_name.COM</value>
</property>
<property>
    <name>hive.server2.authentication.kerberos.keytab</name>
    <value>/etc/hadoop/conf/hive.keytab</value>
</property>
</configuration>
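
A couple of sanity checks that can be run against this config (a rough sketch, assuming the keytab path and hostnames above): verify the keytab actually contains the hive/<host> principal for this node, and compare the configured metastore URI with the conventional thrift://host:9083 form:

    # Does the keytab referenced in hive-site.xml contain the expected hive/<host> principal?
    klist -kt /etc/hadoop/conf/hive.keytab

    # Show the metastore URI as configured; the conventional form is
    # thrift://wnode55.domain_name.com:9083 with two slashes after the scheme
    grep -A1 'hive.metastore.uris' /etc/hive/conf/hive-site.xml

    # And check whether the metastore port answers at all
    nc -vz wnode55.domain_name.com 9083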

Any help would be greatly appreciated.
