I am new to Sqoop and am trying to import the widgets table from the MySQL database hadoopguide.

I am using Hadoop version 0.20.

My Sqoop is sqoop-1.4.4.bin__hadoop-0.20.

I am running the command:

sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1

This is the error log I get:

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/09/25 15:29:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/09/25 15:29:41 INFO tool.CodeGenTool: Beginning code generation
13/09/25 15:29:41 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
13/09/25 15:29:41 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
13/09/25 15:29:41 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
13/09/25 15:29:41 INFO orm.CompilationManager: Found hadoop core jar at: /usr/local/hadoop/hadoop-0.20.2-core.jar
Note: /tmp/sqoop-ubuntu/compile/348861f092b25aac3fae4089da9abdf0/widgets.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/09/25 15:29:42 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-ubuntu/compile/348861f092b25aac3fae4089da9abdf0/widgets.jar
13/09/25 15:29:42 WARN manager.MySQLManager: It looks like you are importing from mysql.
13/09/25 15:29:42 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
13/09/25 15:29:42 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
13/09/25 15:29:42 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
13/09/25 15:29:42 INFO mapreduce.ImportJobBase: Beginning import of widgets
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getPassword(DBConfiguration.java:304)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:272)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:882)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:239)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

Does anyone have any ideas about this?

2 Answers

If you have already installed Hive, HCatalog is installed along with it. Now set HCAT_HOME in your .bashrc as follows:

cd ~
gedit .bashrc      # open .bashrc in an editor and add the two export lines below
export HCAT_HOME=${HIVE_HOME}/hcatalog
export PATH=$HCAT_HOME/bin:$PATH

source .bashrc     # reload .bashrc so the new variables take effect

Otherwise, install HCatalog separately and set its home path accordingly.
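
As a quick sanity check, a minimal sketch (the exact location of the bundled HCatalog depends on your Hive installation) to confirm that HCAT_HOME points at a real directory and that the hcat script is now on your PATH:

echo $HCAT_HOME        # should print a path such as $HIVE_HOME/hcatalog (assumed layout)
ls $HCAT_HOME/bin      # should list the hcat script
command -v hcat        # should resolve via the updated PATH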

Answered on 2015-07-05T05:44:48.957

Hadoop 0.20 is a very old release and is missing many features. One capability that Sqoop requires is the security additions that were added in 1.x. Sqoop therefore cannot run on vanilla 0.20; it needs at least CDH3u1 or Hadoop 1.x. I strongly recommend upgrading your Hadoop cluster.
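
If you do upgrade, a minimal sketch of checking the installed release and pointing Sqoop at the newer Hadoop looks like the following (the install path /usr/local/hadoop-1.2.1 is only an example, not your actual layout):

hadoop version                                        # confirms which Hadoop release is on the machine (0.20.2 here)
export HADOOP_COMMON_HOME=/usr/local/hadoop-1.2.1     # hypothetical path to the upgraded Hadoop install
export HADOOP_MAPRED_HOME=/usr/local/hadoop-1.2.1     # Sqoop reads these variables to locate the Hadoop jars
sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1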

Answered on 2013-09-25T17:30:32.480