I'm trying to create an MR job that will change the format of log files loaded into HDFS via Flume. I'm trying to convert the logs into a format where the fields are delimited by ":::". For example:
date/timestamp:::log-level:::rest-of-log
The problem I'm running into is that some logs are single-line while others are multi-line, and I need to keep multi-line logs intact in the rest-of-log field. I've written a custom InputFormat and RecordReader to try to do this (basically just NLineRecordReader modified to append lines until it reaches a date stamp, rather than appending a fixed number of lines). The MR job I use to format the logs seems to work fine, but the RecordReader doesn't seem to pass multiple lines through correctly, and I can't figure out why.
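To be clear about what I'm expecting the RecordReader to produce, here is the grouping rule I'm aiming for, sketched in plain Java without the split/LineReader machinery (the class and method names here are just for illustration, not part of my actual code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

// Sketch of the intended behavior: a line that starts with a
// "dd Month yyyy HH:mm:ss,SSS " stamp begins a new record; any other
// line is a continuation of the current record.
public class LogGrouper {
    // Same date/time-stamp pattern my RecordReader uses.
    private static final Pattern DATE_TIME =
        Pattern.compile("^\\d{2}\\s\\S+\\s\\d{4}\\s\\d{2}:\\d{2}:\\d{2},\\d{3}\\s");

    public static List<String> group(List<String> lines) {
        List<String> records = new ArrayList<>();
        StringBuilder current = null;
        for (String line : lines) {
            if (DATE_TIME.matcher(line).find()) {
                // stamp found: close out the previous record, start a new one
                if (current != null) records.add(current.toString());
                current = new StringBuilder(line);
            } else if (current != null) {
                // no stamp: this line belongs to the current record
                current.append('\n').append(line);
            }
        }
        if (current != null) records.add(current.toString()); // flush the last record
        return records;
    }
}
```

So a single-line INFO log becomes one record, and an ERROR log followed by a stack trace becomes one multi-line record.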
Here is my RecordReader class:
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.util.LineReader;

public class LogRecordReader extends RecordReader<LongWritable, Text> {

    private LineReader in;
    private LongWritable key;
    private Text value = new Text();
    private long start = 0;
    private long end = 0;
    private long pos = 0;
    private int maxLineLength;
    private Text line = new Text();        // working line
    private Text lineHasDate = new Text(); // if line encounters a date stamp, hold it here

    public void close() throws IOException {
        if (in != null) {
            in.close();
        }
    }

    public LongWritable getCurrentKey() throws IOException, InterruptedException {
        return key;
    }

    public Text getCurrentValue() throws IOException, InterruptedException {
        return value;
    }

    public float getProgress() throws IOException, InterruptedException {
        if (start == end) {
            return 0.0f;
        } else {
            return Math.min(1.0f, (pos - start) / (float) (end - start));
        }
    }

    public void initialize(InputSplit genericSplit, TaskAttemptContext context)
            throws IOException, InterruptedException {
        FileSplit split = (FileSplit) genericSplit;
        final Path file = split.getPath();
        Configuration conf = context.getConfiguration();
        this.maxLineLength = conf.getInt("mapred.linerecordreader.maxlength", Integer.MAX_VALUE);
        FileSystem fs = file.getFileSystem(conf);
        start = split.getStart();
        end = start + split.getLength();
        boolean skipFirstLine = false;
        FSDataInputStream filein = fs.open(split.getPath());
        // if we're not starting at the beginning, we should skip the first line
        if (start != 0) {
            skipFirstLine = true;
            --start;
            filein.seek(start);
        }
        in = new LineReader(filein, conf);
        // if we should skip the first line
        if (skipFirstLine) {
            start += in.readLine(new Text(), 0, (int) Math.min((long) Integer.MAX_VALUE, end - start));
        }
        this.pos = start;
    }

    /**
     * create a complete log message from individual lines using date/time stamp as a breakpoint
     */
    public boolean nextKeyValue() throws IOException, InterruptedException {
        // if key has not yet been initialized
        if (key == null) {
            key = new LongWritable();
        }
        key.set(pos);
        // if value has not yet been initialized
        if (value == null) {
            value = new Text();
        }
        value.clear();
        final Text endline = new Text("\n");
        int newSize = 0;
        // if a line with a date was encountered on the previous call
        if (lineHasDate.getLength() > 0) {
            while (pos < end) {
                value.append(lineHasDate.getBytes(), 0, lineHasDate.getLength()); // append the line
                value.append(endline.getBytes(), 0, endline.getLength());         // append a line break
                pos += newSize;
                if (newSize == 0) break;
            }
            lineHasDate.clear(); // clean up
        }
        // to check buffer 'line' for date/time stamp
        Pattern regexDateTime = Pattern.compile("^\\d{2}\\s\\S+\\s\\d{4}\\s\\d{2}:\\d{2}:\\d{2},\\d{3}\\s");
        Matcher matcherDateTime = regexDateTime.matcher(line.toString());
        // read in a new line to the buffer 'line'
        newSize = in.readLine(line, maxLineLength, Math.max((int) Math.min(Integer.MAX_VALUE, end - pos), maxLineLength));
        // if the line in the buffer contains a date/time stamp, append it
        if (matcherDateTime.find()) {
            while (pos < end) {
                newSize = in.readLine(line, maxLineLength, Math.max((int) Math.min(Integer.MAX_VALUE, end - pos), maxLineLength));
                value.append(line.getBytes(), 0, line.getLength());       // append the line
                value.append(endline.getBytes(), 0, endline.getLength()); // append a line break
                if (newSize == 0) break;
                pos += newSize;
                if (newSize < maxLineLength) break;
            }
            // read in the next line to the buffer 'line'
            newSize = in.readLine(line, maxLineLength, Math.max((int) Math.min(Integer.MAX_VALUE, end - pos), maxLineLength));
        }
        // while lines in the buffer do not contain date/time stamps, append them
        while (!matcherDateTime.find()) {
            newSize = in.readLine(line, maxLineLength, Math.max((int) Math.min(Integer.MAX_VALUE, end - pos), maxLineLength));
            value.append(line.getBytes(), 0, line.getLength());       // append the line
            value.append(endline.getBytes(), 0, endline.getLength()); // append a line break
            if (newSize == 0) break;
            pos += newSize;
            if (newSize < maxLineLength) break;
            // read in the next line to the buffer 'line', and continue looping
            newSize = in.readLine(line, maxLineLength, Math.max((int) Math.min(Integer.MAX_VALUE, end - pos), maxLineLength));
        }
        // if the line in the buffer contains a date/time stamp (which it should since the loop broke) save it for next call
        if (matcherDateTime.find()) lineHasDate = line;
        // if there is no new line
        if (newSize == 0) {
            // TODO: if lineHasDate is the last line in the file, it must be appended (?)
            key = null;
            value = null;
            return false;
        }
        return true;
    }
}
Here is the MR job used to format the logs:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class FlumeLogFormat extends Configured implements Tool {

    /**
     * Map class
     */
    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {

        private Text formattedLog = new Text();
        private Text keyDateTime = new Text(); // no value

        public void map(LongWritable key, Text value,
                OutputCollector<Text, Text> output, Reporter reporter) throws IOException {
            String log = value.toString();
            StringBuffer buffer = new StringBuffer();
            Pattern regex = Pattern.compile("^(\\d{2}\\s\\S+\\s\\d{4}\\s\\d{2}:\\d{2}:\\d{2},\\d{3})\\s([A-Z]{4,5})\\s([\\s\\S]+)");
            Matcher matcher = regex.matcher(log);
            if (matcher.find()) {
                // insert ":::" between fields to serve as a delimiter
                buffer.append(matcher.group(1) + ":::" + matcher.group(2) + ":::" + matcher.group(3));
                formattedLog.set(buffer.toString());
                keyDateTime.set(matcher.group(1));
                output.collect(keyDateTime, formattedLog);
            }
        }
    }

    /**
     * run method
     * @param args
     * @return int
     * @throws Exception
     */
    public int run(String[] args) throws Exception {
        JobConf conf = new JobConf(getConf(), FlumeLogFormat.class); // class is LogFormat
        conf.setJobName("FlumeLogFormat");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);
        conf.setMapperClass(Map.class);
        List<String> other_args = new ArrayList<String>();
        for (int i = 0; i < args.length; ++i) {
            try {
                if ("-m".equals(args[i])) {
                    conf.setNumMapTasks(Integer.parseInt(args[++i]));
                } else if ("-r".equals(args[i])) {
                    conf.setNumReduceTasks(Integer.parseInt(args[++i]));
                } else {
                    other_args.add(args[i]);
                }
            } catch (NumberFormatException exception) {
                System.out.println("Give int value instead of " + args[i]);
                //return printUsage();
            } catch (ArrayIndexOutOfBoundsException exception) {
                System.out.println("Parameter missing " + args[i - 1]);
                //return printUsage();
            }
        }
        if (other_args.size() != 2) {
            //return printUsage();
        }
        FileInputFormat.setInputPaths(conf, new Path(other_args.get(0)));
        FileOutputFormat.setOutputPath(conf, new Path(other_args.get(1)));
        conf.setInputFormat(LogInputFormat.class);
        conf.setOutputFormat(SequenceFileOutputFormat.class);
        JobClient.runJob(conf);
        return 0;
    }

    /**
     * Main method
     * @param args
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        int res = ToolRunner.run(new Configuration(), new FlumeLogFormat(), args);
        System.exit(res);
    }
}
And here are the logs:
21 July 2013 17:35:51,334 INFO [conf-file-poller-0] (org.apache.flume.node.Application.startAllComponents:173) - Starting Sink k1
25 May 2013 06:33:36,795 ERROR [lifecycleSupervisor-1-7] (org.apache.flume.lifecycle.LifecycleSupervisor$MonitorRunnable.run:253) - Unable to start EventDrivenSourceRunner: { source:org.apache.flume.source.SpoolDirectorySource{name:r1,state:IDLE} } - Exception follows.
java.lang.IllegalStateException: Directory does not exist: /root/FlumeTest
at com.google.common.base.Preconditions.checkState(Preconditions.java:145)
at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.<init>(ReliableSpoolingFileEventReader.java:129)
at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.<init>(ReliableSpoolingFileEventReader.java:72)
at org.apache.flume.client.avro.ReliableSpoolingFileEventReader$Builder.build(ReliableSpoolingFileEventReader.java:556)
at org.apache.flume.source.SpoolDirectorySource.start(SpoolDirectorySource.java:75)
at org.apache.flume.source.EventDrivenSourceRunner.start(EventDrivenSourceRunner.java:44)
at org.apache.flume.lifecycle.LifecycleSupervisor$MonitorRunnable.run(LifecycleSupervisor.java:251)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:351)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:165)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:267)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
01 June 2012 12:35:22,222 INFO noiweoqierwnvoirenvoiernv iorenvoiernve irnvoirenv
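For what it's worth, the mapper's field-splitting regex does seem to handle logs shaped like the ones above; here is the standalone check I used (the class name is just for illustration):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Applies the mapper's regex to a single (possibly multi-line) log record
// and joins the three capture groups with the ":::" delimiter.
public class DelimiterCheck {
    // Same pattern the mapper compiles: date/time stamp, log level, rest.
    static final Pattern LOG = Pattern.compile(
        "^(\\d{2}\\s\\S+\\s\\d{4}\\s\\d{2}:\\d{2}:\\d{2},\\d{3})\\s([A-Z]{4,5})\\s([\\s\\S]+)");

    public static String format(String log) {
        Matcher m = LOG.matcher(log);
        if (!m.find()) return null; // not a recognizable log record
        return m.group(1) + ":::" + m.group(2) + ":::" + m.group(3);
    }
}
```

So the formatting side looks fine; the records just never arrive at the mapper as complete multi-line values.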