I am trying to use Sqoop to import a table from MySQL into HDFS. The command line I used is:
sqoop import --connect jdbc:mysql://192.168.10.452/qw_key_test --username qw -P --split-by qw_id -m 10 --target-dir /user/perf/qwperf/sqoops --verbose --table qw_perf_store_key
The mappers fail with an unsupported-version error, as shown below.
2013-05-22 17:46:24,165 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2013-05-22 17:46:24,534 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
2013-05-22 17:46:24,535 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2013-05-22 17:46:24,835 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2013-05-22 17:46:24,839 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@191410e5
2013-05-22 17:46:25,278 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-05-22 17:46:25,280 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.UnsupportedClassVersionError: qw_perf_store_key : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1510)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1475)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1569)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:276)
at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.createDBRecordReader(DataDrivenDBInputFormat.java:230)
at org.apache.sqoop.mapreduce.db.DBInputFormat.createRecordReader(DBInputFormat.java:236)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:617)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Any ideas? Using CentOS 6.2, CDH4, JDK7_21, sqoop: Sqoop 1.4.1-cdh4.1.3 (git commit id 0ff32b245c1d4c75e9f034414ebd8cfcdd140e7e).
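
In case it helps with diagnosis, here is a minimal sketch (my own check, not part of Sqoop) to confirm which JDK compiled the generated record class. A class-file major version of 51 means Java 7 and 50 means Java 6, so the error suggests the task JVMs may be running an older Java than the one used on the client. The location of qw_perf_store_key.class is an assumption; Sqoop writes the generated .java/.class under its codegen output directory.

import java.io.DataInputStream;
import java.io.FileInputStream;

// Reads the class-file header: magic (0xCAFEBABE), minor version, major version.
// Major 51 = Java 7, major 50 = Java 6.
public class ClassVersionCheck {
    public static void main(String[] args) throws Exception {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        try {
            int magic = in.readInt();            // expect 0xcafebabe
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();
            System.out.printf("magic=%08x minor=%d major=%d%n", magic, minor, major);
        } finally {
            in.close();
        }
    }
}

Run it as: java ClassVersionCheck qw_perf_store_key.class (pointing at the .class file Sqoop generated for the table).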