When I use HBase in pseudo-distributed mode, I get the exception below. It would be great if someone could shed some light on this problem so I can resolve it.

org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Wed Feb 06 15:22:23 IST 2013, org.apache.hadoop.hbase.client.ScannerCallable@29422384, java.io.IOException: java.io.IOException: Could not iterate StoreFileScanner[HFileScanner for reader reader=file:/home/688697/hbase/test/c28d92322c97364af59b09d4f4b4a95f/cf/c5de203afb5647c0b90c6c18d58319e9, compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false], firstKey=0deptempname0/cf:email/1360143938898/Put, lastKey=4191151deptempname4191151/cf:place/1360143938898/Put, avgKeyLen=45, avgValueLen=7, entries=17860666, length=1093021429, cur=10275517deptempname10275517/cf:place/1360143938898/Put/vlen=4]
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:104)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
    at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:289)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3004)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2951)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2968)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2155)
    at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/home/688697/hbase/test/c28d92322c97364af59b09d4f4b4a95f/cf/c5de203afb5647c0b90c6c18d58319e9 at 37837312 exp: -819174049 got: 1765448374
    at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:320)
    at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
    at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:211)
    at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:229)
    at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:193)
    at org.apache.hadoop.fs.FSInputChecker.readFully(FSInputChecker.java:431)
    at org.apache.hadoop.fs.FSInputChecker.seek(FSInputChecker.java:412)
    at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:48)
    at org.apache.hadoop.fs.ChecksumFileSystem$FSDataBoundedInputStream.seek(ChecksumFileSystem.java:318)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1047)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1318)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:266)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.readNextDataBlock(HFileReaderV2.java:452)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:416)
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
    ... 12 more

Wed Feb 06 15:22:24 IST 2013, org.apache.hadoop.hbase.client.ScannerCallable@29422384, java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException
    at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1079)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1068)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2182)
    at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)
Caused by: java.lang.IllegalArgumentException
    at java.nio.Buffer.position(Buffer.java:216)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:395)
    at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
    at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:326)
    at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3004)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2951)
    at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2968)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2155)
    ... 5 more

1 Answer

The root cause of this problem lies in your /etc/hosts file. If you check your /etc/hosts file you will find an entry like the one below (in my case the machine is named domainnameyouwanttogive):

127.0.0.1 localhost
127.0.1.1 domainnameyouwanttogive

The following lines are desirable for IPv6-capable hosts:

::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

The root cause is that domainnameyouwanttogive resolves to 127.0.1.1, which is incorrect: it should resolve to 127.0.0.1 (or to your external IP). Since my external IP is 192.168.58.10, I created the following /etc/hosts configuration:

127.0.0.1 localhost
192.168.43.3 domainnameyouwanttogive

The following lines are desirable for IPv6-capable hosts:

::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

This ensures that hostname resolution for the processes on your localhost is done correctly and that you can start your HBase installation properly on your development system.
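As a quick sanity check (this step is not from the original answer, and the commands assume a standard Linux environment), you can confirm that the hostname no longer resolves to 127.0.1.1:

getent hosts domainnameyouwanttogive   # should now show the external IP (here 192.168.43.3), not 127.0.1.1
hostname -f                            # should print domainnameyouwanttogive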

Also, make sure your Hadoop namenode is running with the same domain name as the one you use for HBase.
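One way to check this is to compare the filesystem URIs in the two configuration files; the paths below are assumptions for a typical Hadoop 1.x / HBase 0.94 layout and may differ on your install:

grep -A1 fs.default.name $HADOOP_HOME/conf/core-site.xml
grep -A1 hbase.rootdir $HBASE_HOME/conf/hbase-site.xml
# both should reference the same hostname, e.g. hdfs://domainnameyouwanttogive:9000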

Answered 2013-03-14T17:38:32.773