
I am currently using the Cloudera CDH4 VM.

Everything seems to be working. The import claims success, but no records are written. I have attached the output from the import below.

[cloudera@ap00134-vip ~]$ hbase shell
12/11/26 18:53:41 WARN conf.Configuration: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 0.92.1-cdh4.1.1, rUnknown, Tue Oct 16 12:01:17 PDT 2012

hbase(main):001:0>

[cloudera@ap00134-vip ~]$ sqoop version
Sqoop 1.4.1-cdh4.1.1
git commit id b0c34454234e5246b4ef345694d7e1a5904f00fe
Compiled by jenkins on Tue Oct 16 12:17:51 PDT 2012
[cloudera@ap00134-vip ~]$

sqoop import --connect jdbc:oracle:thin:@//154.11.169.116:1521/bigdata  --table BIGDATA_SMALL_RAW --username test --hbase-create-table --hbase-table t1 --column-family cf --columns DSERVER_COMPUTER --hbase-row-key ROWKEY -m 1

12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/usr/lib/hadoop/lib/native
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-220.23.1.el6.x86_64
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:user.name=cloudera
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/cloudera
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/cloudera
12/11/26 18:41:12 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
12/11/26 18:41:12 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
12/11/26 18:41:12 INFO zookeeper.ClientCnxn: Socket connection established to localhost.localdomain/127.0.0.1:2181, initiating session
12/11/26 18:41:12 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost.localdomain/127.0.0.1:2181, sessionid = 0x13b2fc047340058, negotiated timeout = 40000
12/11/26 18:41:12 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 29089@ap00134-vip.osc.tac.net
12/11/26 18:41:13 WARN conf.Configuration: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
12/11/26 18:41:13 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@71257687
12/11/26 18:41:13 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost.localdomain/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
12/11/26 18:41:13 INFO zookeeper.ClientCnxn: Socket connection established to localhost.localdomain/127.0.0.1:2181, initiating session
12/11/26 18:41:13 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost.localdomain/127.0.0.1:2181, sessionid = 0x13b2fc047340059, negotiated timeout = 40000
12/11/26 18:41:13 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 29089@ap00134-vip.osc.tac.net
12/11/26 18:41:13 INFO zookeeper.ClientCnxn: EventThread shut down
12/11/26 18:41:13 INFO zookeeper.ZooKeeper: Session: 0x13b2fc047340059 closed
12/11/26 18:41:13 INFO mapreduce.HBaseImportJob: Creating missing HBase table t1
12/11/26 18:41:17 INFO mapreduce.JobSubmitter: number of splits:1
12/11/26 18:41:17 WARN conf.Configuration: mapred.job.classpath.files is deprecated. Instead, use mapreduce.job.classpath.files
12/11/26 18:41:17 WARN conf.Configuration: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
12/11/26 18:41:17 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
12/11/26 18:41:17 WARN conf.Configuration: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
12/11/26 18:41:17 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
12/11/26 18:41:18 INFO mapred.ResourceMgrDelegate: Submitted application application_1353715862141_0011 to ResourceManager at /0.0.0.0:8032
12/11/26 18:41:18 INFO mapreduce.Job: The url to track the job: http://ap00134-vip.osc.tac.net:8088/proxy/application_1353715862141_0011/
12/11/26 18:41:18 INFO mapreduce.Job: Running job: job_1353715862141_0011
12/11/26 18:41:27 INFO mapreduce.Job: Job job_1353715862141_0011 running in uber mode : false
12/11/26 18:41:27 INFO mapreduce.Job:  map 0% reduce 0%
12/11/26 18:41:50 INFO mapreduce.Job:  map 100% reduce 0%
12/11/26 18:41:50 INFO mapreduce.Job: Job job_1353715862141_0011 completed successfully
12/11/26 18:41:50 INFO mapreduce.Job: Counters: 27
        File System Counters
                FILE: Number of bytes read=120
                FILE: Number of bytes written=93711
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=87
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=1
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
        Job Counters
                Launched map tasks=1
                Other local map tasks=1
                Total time spent by all maps in occupied slots (ms)=182000
                Total time spent by all reduces in occupied slots (ms)=0
        Map-Reduce Framework
                Map input records=21
                Map output records=21
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=93
                CPU time spent (ms)=1910
                Physical memory (bytes) snapshot=140869632
                Virtual memory (bytes) snapshot=721960960
                Total committed heap usage (bytes)=126877696
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
12/11/26 18:41:50 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 36.6957 seconds (0 bytes/sec)
12/11/26 18:41:51 INFO mapreduce.ImportJobBase: Retrieved 21 records.

hbase(main):005:0> scan '.META.'
ROW                                                  COLUMN+CELL
 t1,,1353973273247.a173f168bb6ffabbcf78837cd3f5234b. column=info:regioninfo, timestamp=1353973273268, value={NAME => 't1,,1353973273247.a173f168bb6ffabbcf78837cd3f5234b.', STARTKEY => '', ENDKEY => '', ENCOD
                                                     ED => a173f168bb6ffabbcf78837cd3f5234b,}
 t1,,1353973273247.a173f168bb6ffabbcf78837cd3f5234b. column=info:server, timestamp=1353973273287, value=ap00134-vip.osc.tac.net:56831
 t1,,1353973273247.a173f168bb6ffabbcf78837cd3f5234b. column=info:serverstartcode, timestamp=1353973273287, value=1353715834683
1 row(s) in 0.0140 seconds

hbase(main):006:0> scan 't1'
ROW                                                  COLUMN+CELL
0 row(s) in 0.0160 seconds

hbase(main):007:0>
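
As a sanity check on the source side, the ROWKEY column used as --hbase-row-key can be inspected with sqoop eval over the same connection (a minimal sketch, reusing the connect string, credentials, table, and columns from the import command above; -P prompts for the password):

sqoop eval --connect jdbc:oracle:thin:@//154.11.169.116:1521/bigdata --username test -P --query "SELECT ROWKEY, DSERVER_COMPUTER FROM BIGDATA_SMALL_RAW WHERE ROWNUM <= 5"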

1 Answer


I wrote it as follows and it worked for me:

sqoop import --connect "jdbc:sqlserver://(hostname or IP);database=dbname;username=sa;password=sqlserver" --table DimCustomer --hbase-create-table --hbase-table "HbasetableName" --column-family cf --hbase-row-key "primary key column of the table" -m 1
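
To confirm that rows actually landed, the target table can be checked from the HBase shell (a minimal sketch, assuming the HbasetableName table and cf column family used in the command above):

hbase(main):001:0> count 'HbasetableName'
hbase(main):002:0> scan 'HbasetableName', {LIMIT => 5}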
