
Suppose I have code like this:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

Configuration conf = new Configuration();
LoadIncrementalHFiles lihf = new LoadIncrementalHFiles(conf);
lihf.doBulkLoad(/*proper args*/);

This works from the command line with -Djava.library.path=/usr/lib/hadoop/lib/native/. The table is Snappy compressed.
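(For context, a Snappy-compressed table is typically created from the HBase shell with something like the line below; 'mytable' and 'cf' are placeholder names, not taken from the question.)

create 'mytable', {NAME => 'cf', COMPRESSION => 'SNAPPY'}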

I need an uber-jar, so I use the maven-shade-plugin to create one.
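For reference, a minimal shade setup looks roughly like the sketch below. This is an assumption for illustration only; the question does not show the actual pom.xml, and no relocations or filters are configured here.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>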

Now the same code fails, even when I specify -Djava.library.path=/usr/lib/hadoop/lib/native/ on the command line.

The error is:

java.lang.IllegalStateException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:382)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:258)
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
    at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:185)
    at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:131)
    at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getDecompressor(Compression.java:331)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.decompress(HFileBlock.java:1457)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1963)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1703)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader$1.nextBlock(HFileBlock.java:1350)
    at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader$1.nextBlockWithBlockType(HFileBlock.java:1358)
    at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.<init>(HFileReaderV2.java:127)
    at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:552)
    at org.apache.hadoop.hbase.io.hfile.HFile.createReaderWithEncoding(HFile.java:589)
    at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:636)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:440)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:361)
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:359)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)

1 Answer


It was a classpath issue. The only difference between the failing and the working command below is the order of the classpath entries: with jarname.jar first, the Hadoop classes bundled into the uber-jar by the shade plugin shadow the cluster's own jars, and (most likely because they do not match the installed native libhadoop.so) buildSupportsSnappy() fails to link. Listing `hbase classpath` first lets the cluster's own Hadoop and HBase classes win.

The old command (fails):

java -Djava.library.path=/usr/lib/hadoop/lib/native -cp jarname.jar:`hbase classpath` com.className

The new command (succeeds):

java -Djava.library.path=/usr/lib/hadoop/lib/native -cp `hbase classpath`:jarname.jar com.className
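
As a quick sanity check that the JVM is picking up a hadoop-common that matches the native libraries, a small diagnostic like the sketch below can be run with the same -Djava.library.path and classpath. NativeCheck is a hypothetical class name; NativeCodeLoader.isNativeCodeLoaded() and buildSupportsSnappy() are existing Hadoop APIs.

import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // True only if libhadoop.so was found on java.library.path
        // and loaded by the NativeCodeLoader static initializer.
        boolean loaded = NativeCodeLoader.isNativeCodeLoaded();
        System.out.println("native hadoop loaded: " + loaded);
        if (loaded) {
            // The same native method the bulk load fails on.
            System.out.println("snappy supported: " + NativeCodeLoader.buildSupportsSnappy());
        }
    }
}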
answered Aug 6, 2014 at 15:57