4

Is there a way to save all the Apache logs as CSV files?

access.log->access_log.csv
error.log->error_log.csv

3 Answers

3

You can define a custom log format so that Apache writes its logs in comma-separated form directly.

You may have to fiddle with it for a while to get it right. For example, you may want to use " or ' as field delimiters to keep commas inside field values from breaking your CSV.
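
As a rough sketch (LogFormat and CustomLog are standard Apache httpd directives, but the semicolon separator, the format name and the file name below are just one possible choice, not taken from the answer above), the server or vhost config could contain something like:

# hypothetical CSV-style access log: semicolons as separators, request line quoted
LogFormat "%h;%l;%u;%t;\"%r\";%>s;%b" csvlog
CustomLog "logs/access_log.csv" csvlog

(For error.log there is no LogFormat equivalent in older releases; Apache 2.4 has a separate ErrorLogFormat directive with its own format specifiers.)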

answered 2012-07-29T08:40:34.750
3

If you need to look at log files that were written in the past, or you don't have access to the configuration files of the Apache server the logs come from, or you don't want to change the log format for some other reason:

I wrote a small Linux shell/sed script that converts the default Apache log format into something LibreOffice Calc can read:

#!/bin/bash

#reformat apache's access logs, so that they can be interpreted as csv files,
# with space as column delimiter and double quotes to bind together things
# that contain spaces but represent single columns.

# 1)  add a double quote at the beginning of the line. The first column is the IP address;
#     IP addresses that have 3 digits in every group but the first could be interpreted as
#     numbers with the dots marking groups of thousands.

# 2a) end the IP address with quotes
# 2b) surround the second (to me unknown) column that's always just "-" and the
#     third column, which is the username, with quotes
# 2c) reformat the date from "[09/Jul/2012:11:17:47" to "09.Jul 2012 11:17:47"

# 3)  remove the string "+0200]" (replace it with a closing double quote to end the date column)

# 4)  the string that contains the command (5th column) sometimes contains string representations
#     of binary rubbish. That's no problem as long as it does not contain a double quote, which
#     would mess up the column zoning. According to my web searches, CSV columns should be allowed
#     to contain double quotes if they are escaped with a backslash. Although that is the case with
#     these problematic strings, LibreOffice does not accept it that way. Therefore we escape every
#     double quote with another double quote, which is the other valid option according to the CSV
#     specification, and LibreOffice does accept that one. More technically: we replace every
#     double quote that has neither a space nor another double quote immediately before or after it
#     with two double quotes.

sed \
-e 's/^/"/' \
-e 's/ \([^ ]\{1,\}\) \([^ ]\{1,\}\) \[\([0-9]\{1,2\}\)\/\([a-zA-Z]\{1,3\}\)\/\([0-9]\{1,4\}\):/" "\1" "\2" "\3.\4 \5 /' \
-e 's/ +0200\] /" /' \
-e 's/\([^" ]\)"\([^" ]\)/\1""\2/g'
answered 2012-07-29T19:00:05.843
2

This is really just a modification of @kaefert's answer. I'm sure there is a cleaner way to do it, but this works fine.

alias aplogcsv="sed -e 's/^/\"/' \
                -e 's/:\([0-9]\{1,3\}\.\)\([0-9]\{1,3\}\.\)\([0-9]\{1,3\}\.\)\([0-9]\{1,3\}\)/\",\"\1\2\3\4/' \
                -e 's/ \([^ ]\{1,\}\) \([^ ]\{1,\}\) \[\([0-9]\{1,2\}\)\/\([a-zA-Z]\{1,3\}\)\/\([0-9]\{1,4\}\):/\",\"\1\" \"\2\" \"\3 \4 \5\",\" /' \
                -e 's/ \([0-9]\{1,2\}\):\([0-9]\{1,2\}\):\([0-9]\{1,2\}\)/\1:\2:\3/' \
                -e 's/ -0700\] /\",/' \
                -e 's/\"GET /\"GET\",\"/g' \
                -e 's/\"POST /\"POST\",\"/g' \
                -e 's/ HTTP\/1.1\" \([0-9]\{1,3\}\) \([0-9]\{1,4\}\) /\",\"HTTP\/1.1\",\1,\2,/' \
                -e 's/\"-\" //g'"

Then I use it like this:

aplogcsv access.log > ~/access.log.csv

But it can just as easily be used like this:

grep "25/Jan/2019" access.log | aplogcsv > ~/20190125.access.log.csv
answered 2019-01-25T18:40:23.920