
I am trying to read a sequence file in Hadoop 2.0, but I can't get it to work. I am using the following code, which worked fine in Hadoop 1.0. Please let me know if I am missing something for the 2.0 version.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

Configuration conf = new Configuration();
try {
    FileSystem fs = FileSystem.get(conf);
    Path p = new Path("/Users/xxx/git/xxx/src/test/cntr-20140527104344-r-00172");
    SequenceFile.Reader reader = new SequenceFile.Reader(fs, p, conf);
    Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
    Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);

I get the following error when I try to debug.

2014-05-28 23:30:31,567 WARN  util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(52)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-05-28 23:30:31,572 INFO  compress.CodecPool (CodecPool.java:getDecompressor(121)) - Got brand-new decompressor
java.io.EOFException
    at java.util.zip.GZIPInputStream.readUByte(GZIPInputStream.java:264)
    at java.util.zip.GZIPInputStream.readUShort(GZIPInputStream.java:254)
    at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:163)
    at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:78)
    at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:90)
    at org.apache.hadoop.io.compress.GzipCodec$GzipInputStream$ResetableGZIPInputStream.<init>(GzipCodec.java:92)
    at org.apache.hadoop.io.compress.GzipCodec$GzipInputStream.<init>(GzipCodec.java:101)
    at org.apache.hadoop.io.compress.GzipCodec.createInputStream(GzipCodec.java:169)
    at org.apache.hadoop.io.compress.GzipCodec.createInputStream(GzipCodec.java:179)
    at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1520)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1428)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1417)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1412)
    at com.xxx.bis.social.feedbehavior.cdl.Debuger.testSpliter(Debuger.java:30)

Please help.

Note: I referred to this link: Reading and writing sequence files with the Hadoop 2.0 API. But it didn't work.


1 Answer


It looks like you have the wrong Hadoop classes on your classpath. Take a look at this SO question.
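For reference, here is a minimal sketch of reading a sequence file with the Hadoop 2.x API, using the `SequenceFile.Reader.Options` overload that replaces the deprecated `(FileSystem, Path, Configuration)` constructor used in the question. The class name and the command-line path argument are illustrative, not from the original post:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

public class SequenceFileDump {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path p = new Path(args[0]);
        // Hadoop 2.x style: pass the path via Reader.Options instead of the
        // deprecated (FileSystem, Path, Configuration) constructor.
        try (SequenceFile.Reader reader =
                 new SequenceFile.Reader(conf, SequenceFile.Reader.file(p))) {
            Writable key = (Writable)
                ReflectionUtils.newInstance(reader.getKeyClass(), conf);
            Writable value = (Writable)
                ReflectionUtils.newInstance(reader.getValueClass(), conf);
            while (reader.next(key, value)) {
                System.out.println(key + "\t" + value);
            }
        }
    }
}
```

If the classpath mixes 1.x and 2.x jars, the reader can end up with mismatched codec classes, so checking the effective Hadoop version on the classpath is worth doing before changing the code.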

Answered 2014-05-29T08:09:42.760