
I have installed a Cloudera QuickStart Docker container locally and also mapped the Hive port, like this:

    docker run --hostname=quickstart.cloudera --privileged=true -t -i -p 8888:8888 -p 80:80 -p 10000:10000 --name cloudera2 cloudera/quickstart /usr/bin/docker-quickstart
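As a sanity check that HiveServer2 is actually reachable on that port (this is just an optional verification step; the container name cloudera2 comes from the command above), one can open a Beeline session inside the container:

    # optional check: connect with Beeline from inside the container
    docker exec -it cloudera2 beeline -u jdbc:hive2://localhost:10000/default
    # if the Beeline prompt appears and e.g. "show databases;" works, the server side is fine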

I want to connect to it over JDBC; my code looks like this:

import java.sql.{Connection, DriverManager}

val driver = "org.apache.hive.jdbc.HiveDriver"
val url = "jdbc:hive2://localhost:10000/default"
val username = ""
val password = ""

// there's probably a better way to do this
var connection: Connection = null

try {
  // register the Hive JDBC driver
  Class.forName(driver)
} catch {
  case e: ClassNotFoundException => e.printStackTrace()
}
// open and immediately close the connection
connection = DriverManager.getConnection(url, username, password)
connection.close()
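For context, the eventual goal once getConnection succeeds is to run a query over the connection; a minimal sketch of that step (hypothetical, since the failure below happens before this point) would be:

    // hypothetical follow-up once the connection is open
    val stmt = connection.createStatement()
    val rs = stmt.executeQuery("SHOW TABLES")
    while (rs.next()) {
      println(rs.getString(1))
    }
    rs.close()
    stmt.close()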

But a NoClassDefFoundError occurs when I try to execute it:

log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at org.apache.hive.jdbc.HiveConnection.createUnderlyingTransport(HiveConnection.java:362)
    at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:382)
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:193)
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at ScalaJdbcConnectSelect$.main(ScalaJdbcConnectSelect.scala:32)

Hive version:

Hive 1.1.0-cdh5.7.0

Maven dependency:

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-jdbc</artifactId>
        <version>1.1.0-cdh5.7.0</version>
    </dependency>

I am not sure whether it is caused by the username and password, but I have tried "cloudera"/"cloudera", "hive"/"", and ""/"".


1 Answer


I found that the hadoop-common dependency needs to be added as well; in my case it looks like this:

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0-cdh5.7.0</version>
    </dependency>

With that in place, it works fine.
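For reference, with both hive-jdbc and hadoop-common on the classpath, a minimal end-to-end smoke test (a sketch only, using the same connection details as in the question and assuming HiveServer2 is still mapped to localhost:10000) could look like this:

    import java.sql.DriverManager

    object HiveJdbcSmokeTest {
      def main(args: Array[String]): Unit = {
        // both hive-jdbc and hadoop-common must be on the runtime classpath
        Class.forName("org.apache.hive.jdbc.HiveDriver")
        val connection = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "")
        try {
          val stmt = connection.createStatement()
          val rs = stmt.executeQuery("SHOW DATABASES")
          while (rs.next()) println(rs.getString(1))
          rs.close()
          stmt.close()
        } finally {
          connection.close()
        }
      }
    }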

Answered 2017-04-25T13:44:43.937