Hadoop sequence files are really strange. I packed an image into a sequence file and could not recover the image. I ran some simple tests, and found that the byte size before and after going through the sequence file wasn't even the same.
Configuration confHadoop = new Configuration();
FileSystem fs = FileSystem.get(confHadoop);
String fileName = args[0];
Path file = new Path(fs.getUri().toString() + "/" + fileName);
Path seqFile = new Path("/temp.seq");
SequenceFile.Writer writer = null;
FSDataInputStream in = null;
try {
    writer = SequenceFile.createWriter(confHadoop, Writer.file(seqFile),
            Writer.keyClass(Text.class), Writer.valueClass(BytesWritable.class));
    in = fs.open(file);
    byte[] buffer = IOUtils.toByteArray(in);
    System.out.println("original size ----> " + String.valueOf(buffer.length));
    writer.append(new Text(fileName), new BytesWritable(buffer));
    System.out.println(calculateMd5(buffer));
    writer.close();
} finally {
    IOUtils.closeQuietly(in);
}
SequenceFile.Reader reader = new SequenceFile.Reader(confHadoop, Reader.file(seqFile));
Text key = new Text();
BytesWritable val = new BytesWritable();
while (reader.next(key, val)) {
    System.out.println("size get from sequence file ---> " + String.valueOf(val.getLength()));
    String md5 = calculateMd5(val.getBytes());
    Path readSeq = new Path("/write back.png");
    FSDataOutputStream out = null;
    out = fs.create(readSeq);
    out.write(val.getBytes()); // YES! GOT THE ORIGINAL IMAGE
    out.close();
    System.out.println(md5);
    .............
}
The output shows that I get the same number of bytes back, and after writing the image back to local disk I'm sure I got the original image. But why aren't the MD5 values the same?
What am I doing wrong here?
14/04/22 16:21:35 INFO compress.CodecPool: Got brand-new compressor [.deflate]
original size ----> 485709
c413e36fd864b27d4c8927956298edbb
14/04/22 16:21:35 INFO compress.CodecPool: Got brand-new decompressor [.deflate]
size get from sequence file --->485709
322cce20b732126bcb8876c4fcd925cb
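One thing worth checking: the `BytesWritable` Javadoc says `getBytes()` returns the whole backing buffer, which can be longer than the valid data, and only the first `getLength()` bytes are meaningful. If that is what's happening here, hashing the full backing array would include trailing padding and change the MD5, while the written-back file would still open fine because PNG decoders ignore trailing bytes. A self-contained sketch of that effect (plain JDK, no Hadoop; `backing`, `validLength`, and the sample bytes are made up for illustration):

```java
import java.security.MessageDigest;
import java.util.Arrays;

public class Md5Padding {
    // Hex MD5 of a byte array, like a typical calculateMd5 helper.
    static String md5(byte[] data) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest(data)) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] original = "fake image bytes".getBytes("UTF-8");

        // Simulate a BytesWritable backing buffer that has grown past the
        // valid length, leaving trailing zero padding.
        byte[] backing = Arrays.copyOf(original, original.length + 8);
        int validLength = original.length; // what getLength() would report

        String md5Full  = md5(backing);                             // hashes the padding too
        String md5Valid = md5(Arrays.copyOf(backing, validLength)); // valid range only

        System.out.println(md5Full.equals(md5(original)));  // false: padding changed the hash
        System.out.println(md5Valid.equals(md5(original))); // true: same bytes, same hash
    }
}
```

If this is the cause, hashing `Arrays.copyOf(val.getBytes(), val.getLength())` instead of `val.getBytes()` should reproduce the original MD5.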