
On Ubuntu, using Couchbase 2.5.1, Cloudera CDH4, the Hadoop plugin for Couchbase, and Oracle JDK 6. Everything seems to have installed fine, and I can use Hadoop and Couchbase independently without problems, but when I try to use the plugin like this

sqoop import --connect http://127.0.0.1:8091/ --table DUMP

I get the following error:

Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/04/11 11:44:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.6.0
14/04/11 11:44:08 INFO tool.CodeGenTool: Beginning code generation
14/04/11 11:44:08 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce
Note: /tmp/sqoop-vagrant/compile/30e6774902d338663db059706cde5b12/DUMP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/04/11 11:44:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-vagrant/compile/30e6774902d338663db059706cde5b12/DUMP.jar
14/04/11 11:44:09 INFO mapreduce.ImportJobBase: Beginning import of DUMP
14/04/11 11:44:09 WARN util.Jars: No such class couchbase doesn't use a jdbc driver available.
14/04/11 11:44:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/11 11:44:12 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/04/11 11:44:13 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

Any idea where I'm going wrong? Or what I can do to track it down?


2 Answers


I don't think you can connect to a password-protected Couchbase bucket with the Couchbase Hadoop plugin. I used to get an authentication exception and could never resolve it. Once I edited the plugin's source code, I was able to get it working.

Answered 2014-12-04T12:43:10.080

It turns out the syntax I was using was wrong. Assuming we want to import the beer-sample bucket from Couchbase into HDFS, the correct syntax is as follows, where the bucket name is actually passed as the username.

sqoop import --connect http://localhost:8091/pools --password password --username beer-sample --table DUMP
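If the import runs, a quick way to confirm the data landed is to list the output in HDFS. This is a minimal sketch assuming Sqoop's default behaviour of writing into a directory named after the table (DUMP) under the current user's HDFS home, and the usual part-m-00000 file naming; adjust the paths for your setup.

hadoop fs -ls DUMP
hadoop fs -cat DUMP/part-m-00000 | head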
Answered 2014-04-18T13:13:12.083