How can I merge multiple sequence files into a single sequence file in Hadoop? Thanks.
4 Answers
If you want to merge multiple files into a single file, here are two answers:
Native (shell)
getmerge
Usage: hadoop fs -getmerge <src> <localdst>
Takes a source directory and a destination file as input and concatenates the files in src into the destination local file. Optionally, addnl can be set to add a newline character at the end of each file.
Java API
org.apache.hadoop.fs.FileUtil.copyMerge(FileSystem srcFS, Path srcDir, FileSystem dstFS, Path dstFile, boolean deleteSource, Configuration conf, String addString);
Copies all the files in a directory to one output file (merge).
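A minimal sketch of calling copyMerge is below; the paths are placeholders, and note that copyMerge exists in Hadoop 2.x but was removed in Hadoop 3.0. Also be aware that, like getmerge, it concatenates raw bytes, so merging SequenceFiles this way does not produce a valid SequenceFile (see the last answer below).

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class CopyMergeExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // "input_dir" and "merged.out" are placeholder paths.
        boolean ok = FileUtil.copyMerge(
                fs, new Path("input_dir"),  // source filesystem and directory
                fs, new Path("merged.out"), // destination filesystem and file
                false,                      // keep the source files
                conf,
                null);                      // optional string appended after each file
        System.out.println("merged: " + ok);
    }
}
```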
Copy to HDFS
put
Usage: hadoop dfs -put <localsrc> ... <dst>
Copies a single src, or multiple srcs, from the local file system to the destination file system. Also reads input from stdin and writes to the destination file system.
copyFromLocal
Usage: hadoop dfs -copyFromLocal <localsrc> URI
Similar to the put command, except that the source is restricted to a local file reference.
Have you considered forqlift? I wrote it to handle certain SequenceFile chores, including SequenceFile merges.
In your case, you could run:
forqlift seq2seq --file new_combined_file.seq \
original_file1.seq original_file2.seq original_file3.seq ...
Granted, forqlift's seq2seq tool is marked "experimental" ... but it has worked well in my (admittedly limited) internal testing.
If you are working with a large number of sequence files, I suggest writing a MapReduce job that uses Mapper as your mapper and Reducer as your reducer. For the I/O formats, use SequenceFileInputFormat and SequenceFileOutputFormat. Set the number of reducers to 1. These are all things you set on the Configuration and Job objects in your driver/main code: the output format, the input format, the mapper, and the reducer.
Note that the default behavior of Mapper and Reducer is to do nothing with the data, just pass it through. That's why you don't write a map function or a reduce function here.
What this will do is load your sequence files, do nothing to the data in the mapper, shuffle all the records to the reducer, and then output them all to one file. This has the side effect of sorting the keys in the output sequence file.
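The setup above can be sketched as a driver class. The class name, the key/value types, and the input/output paths here are assumptions; match the key and value classes to whatever your sequence files actually contain:

```java
// Driver sketch for an identity-map / identity-reduce merge job.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class SequenceFileMergeDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "merge-sequence-files");
        job.setJarByClass(SequenceFileMergeDriver.class);
        // Identity map and reduce: records pass through unchanged.
        job.setMapperClass(Mapper.class);
        job.setReducerClass(Reducer.class);
        // Assumed key/value classes; change to match your data.
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(SequenceFileInputFormat.class);
        job.setOutputFormatClass(SequenceFileOutputFormat.class);
        // A single reducer yields a single output file.
        job.setNumReduceTasks(1);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```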
You can't use hadoop getmerge for sequence files, because it will merge them as binary files, not as sequence files (so you will get lots of headers in your merged file, ...).
So you can either write a small hadoop job with a single reducer, as @Donald-miner suggested, or write a standalone merger using SequenceFile.Reader and SequenceFile.Writer.
I went with the second option, and here is my code:
package ru.mail.go.webbase.markov.hadoop.utils;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

public class SequenceFilesUtils {
    private static final Configuration conf = HBaseConfiguration.create();

    public static <K, V> void merge(Path fromDirectory, Path toFile, Class<K> keyClass, Class<V> valueClass) throws IOException {
        FileSystem fs = FileSystem.get(conf);

        if (!fs.isDirectory(fromDirectory)) {
            throw new IllegalArgumentException("'" + fromDirectory.toString() + "' is not a directory");
        }

        SequenceFile.Writer writer = SequenceFile.createWriter(
                conf,
                SequenceFile.Writer.file(toFile),
                SequenceFile.Writer.keyClass(keyClass),
                SequenceFile.Writer.valueClass(valueClass)
                );

        for (FileStatus status : fs.listStatus(fromDirectory)) {
            if (status.isDirectory()) {
                System.out.println("Skip directory '" + status.getPath().getName() + "'");
                continue;
            }

            Path file = status.getPath();

            if (file.getName().startsWith("_")) {
                // There are files such as "_SUCCESS" in jobs' output folders
                System.out.println("Skip \"_\"-file '" + file.getName() + "'");
                continue;
            }

            System.out.println("Merging '" + file.getName() + "'");

            SequenceFile.Reader reader = new SequenceFile.Reader(conf, SequenceFile.Reader.file(file));
            Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
            Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);

            while (reader.next(key, value)) {
                writer.append(key, value);
            }

            reader.close();
        }

        writer.close();
    }
}
And here is my test:
import java.io.File;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.util.ReflectionUtils;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;

public class SequenceFilesUtilsTest {
    private static final String OUT_PATH = "./UNIVERSE/SequenceFilesUtilsTest/";

    @Before
    public void initEnviroment() throws IOException {
        TestUtils.createDirectory(OUT_PATH);
        TestUtils.createDirectory(OUT_PATH + "/in");
    }

    @Test
    public void test() throws Exception {
        Configuration conf = HBaseConfiguration.create();

        Path inPath1 = new Path("file://" + new File(OUT_PATH).getAbsolutePath() + "/in/in1.seq");
        System.out.println("Saving first part to '" + inPath1 + "'");
        SequenceFile.Writer writer1 = SequenceFile.createWriter(
                conf,
                SequenceFile.Writer.file(inPath1),
                SequenceFile.Writer.keyClass(LongWritable.class),
                SequenceFile.Writer.valueClass(Text.class)
                );
        writer1.append(new LongWritable(101), new Text("FIRST1"));
        writer1.append(new LongWritable(102), new Text("FIRST2"));
        writer1.append(new LongWritable(103), new Text("FIRST3"));
        writer1.append(new LongWritable(104), new Text("FIRST4"));
        writer1.close();

        Path inPath2 = new Path("file://" + new File(OUT_PATH).getAbsolutePath() + "/in/in2.seq");
        System.out.println("Saving second part to '" + inPath2 + "'");
        SequenceFile.Writer writer2 = SequenceFile.createWriter(
                conf,
                SequenceFile.Writer.file(inPath2),
                SequenceFile.Writer.keyClass(LongWritable.class),
                SequenceFile.Writer.valueClass(Text.class)
                );
        writer2.append(new LongWritable(201), new Text("SND1"));
        writer2.append(new LongWritable(202), new Text("SND2"));
        writer2.append(new LongWritable(203), new Text("SND3"));
        writer2.close();

        SequenceFilesUtils.merge(
                new Path("file://" + new File(OUT_PATH).getAbsolutePath() + "/in"),
                new Path("file://" + new File(OUT_PATH).getAbsolutePath() + "/merged.seq"),
                LongWritable.class,
                Text.class);

        Path mergedPath = new Path("file://" + new File(OUT_PATH).getAbsolutePath() + "/merged.seq");
        SequenceFile.Reader reader = new SequenceFile.Reader(conf, SequenceFile.Reader.file(mergedPath));
        LongWritable key = (LongWritable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
        Text value = (Text) ReflectionUtils.newInstance(reader.getValueClass(), conf);

        reader.next(key, value);
        Assert.assertEquals(101, key.get());
        Assert.assertEquals("FIRST1", value.toString());

        reader.next(key, value);
        Assert.assertEquals(102, key.get());
        Assert.assertEquals("FIRST2", value.toString());

        reader.next(key, value);
        Assert.assertEquals(103, key.get());
        Assert.assertEquals("FIRST3", value.toString());

        reader.next(key, value);
        Assert.assertEquals(104, key.get());
        Assert.assertEquals("FIRST4", value.toString());

        reader.next(key, value);
        Assert.assertEquals(201, key.get());
        Assert.assertEquals("SND1", value.toString());

        reader.next(key, value);
        Assert.assertEquals(202, key.get());
        Assert.assertEquals("SND2", value.toString());

        reader.next(key, value);
        Assert.assertEquals(203, key.get());
        Assert.assertEquals("SND3", value.toString());

        reader.close();
    }
}