
I'm doing some log analysis and checking the length of a queue every few minutes. I know when each file entered the "queue" (a plain filesystem directory) and when it left. With that I can plot the length of the queue at a given interval. So far so good, although the code is a bit procedural:

# Sample times every 5 minutes
ts = pd.date_range(start='2012-12-05 10:15:00', end='2012-12-05 15:45', freq='5t')
tmpdf = df.copy()
# One boolean column per sample time: was the file in the queue at that instant?
for d in ts:
    tmpdf[d] = (tmpdf.date_in < d) & (tmpdf.date_out > d)
queue_length = tmpdf[list(ts)].apply(func=np.sum)
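
For reference, the same per-interval count can also be computed without the column-per-timestamp loop. A minimal sketch using numpy broadcasting, assuming date_in and date_out are datetime64 columns of df:

import numpy as np
import pandas as pd

ts = pd.date_range(start='2012-12-05 10:15:00', end='2012-12-05 15:45', freq='5min')

# For every sample time t, count the rows with date_in < t < date_out.
in_before = df['date_in'].values[:, None] < ts.values[None, :]
out_after = df['date_out'].values[:, None] > ts.values[None, :]
queue_length = pd.Series((in_before & out_after).sum(axis=0), index=ts)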

However, I would also like to compare the actual length with the length I would get at a given consumption rate (e.g. 1 per second). I can't just subtract a constant, because the queue can never go below zero.
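
For instance (made-up numbers), with a drain of 2 per interval and arrivals of 5, 0, 0, 4, the real queue is 3, 1, 0, 2, whereas a plain cumulative subtraction would give 3, 1, -1, 1 and keep underestimating from there on. The recurrence I need clips at zero at every step:

rate = 2                      # hypothetical drain per interval
arrivals = [5, 0, 0, 4]       # hypothetical arrivals per interval

queue = 0
for a in arrivals:
    queue = max(0, queue + a - rate)   # an empty queue cannot go negative
    print(queue)                       # 3, 1, 0, 2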

I have managed to do it, but in a very procedural way. I tried pandas window functions with little success, because the window has no access to the result already computed for the previous element. This was the first thing I tried, and it is fatally wrong:

imagenes_min = 60 * imagenes_sec
def roll(window_vals):
    return max(0.0, window_vals[-1] + window_vals[-2] - imagenes_min)

pd.rolling_apply(arg=imagenes_introducidas, func=roll, window=2, min_periods=2)
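
(Why this is the wrong shape: a rolling apply only ever hands the raw input values of each window to the function, never the result computed for the previous window, so the clipped running value cannot be carried forward. A toy run with made-up numbers, written with the modern rolling().apply() spelling:)

import pandas as pd

imagenes_min = 2
s = pd.Series([5, 0, 0, 4])            # hypothetical arrivals per bin

def roll(window_vals):
    return max(0.0, window_vals[-1] + window_vals[-2] - imagenes_min)

print(s.rolling(window=2, min_periods=2).apply(roll, raw=True))
# 0    NaN
# 1    3.0
# 2    0.0    <- built from the raw 0 and 0, not from the previous result 3.0
# 3    2.0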

The real code looks like this, and I find it too verbose and too slow:

imagenes_sec = 1.05
imagenes_min = imagenes_sec * 60 * 5          # images consumed per 5-minute bin
imagenes_introducidas = df3.aet.resample(rule='5t', how='count')
imagenes_introducidas.head()

def accum_minus(serie, rate):
    """Running total that drains `rate` per bin but never goes below zero."""
    acc = 0
    retval = np.zeros(len(serie))
    for i, a in enumerate(serie.values):
        acc = max(0, a + acc - rate)
        retval[i] = acc
    return pd.Series(data=retval, index=serie.index)

est_1 = accum_minus(imagenes_introducidas, imagenes_min)
comparativa = pd.DataFrame(data={'real': queue_length, 'est_1_sec': est_1})
comparativa.plot()

[plot "comparativa": real queue length vs. the estimated length]

This seems like an easy task, but I don't know how to do it properly. Maybe pandas is not the right tool and it calls for some numpy or scipy magic.

Update: df3 looks like this (some columns omitted):

                               aet             date_out
date_in                                               
2012-12-05 10:08:59.318600  Z2XG17  2012-12-05 10:09:37.172300
2012-12-05 10:08:59.451300  Z2XG17  2012-12-05 10:09:38.048800
2012-12-05 10:08:59.587400  Z2XG17  2012-12-05 10:09:39.044100

Update 2: this seems to be faster, though still not very elegant:

imagenes_sec = 1.05
imagenes_min = imagenes_sec * 60 * 5
imagenes_introducidas = df3.aet.resample(rule='5t', how='count')

def add_or_zero(x, y):
    return max(0.0, x + y - imagenes_min)

# Turn the clipped addition into a ufunc so that accumulate() carries the running value.
v_add_or_zero = np.frompyfunc(add_or_zero, 2, 1)
xx = v_add_or_zero.accumulate(imagenes_introducidas.values, dtype=object)

dd = pd.DataFrame(data={'est_1_sec': xx, 'real': queue_length}, index=imagenes_introducidas.index)
dd.plot()
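
A note on the above (toy numbers): accumulate returns an object array, so a cast back to float is convenient, and the very first element is passed through unchanged rather than drained (and that difference carries forward), which is one small way this differs from accum_minus:

import numpy as np

imagenes_min = 315.0                    # 1.05 images/s * 60 s * 5 min

def add_or_zero(x, y):
    return max(0.0, x + y - imagenes_min)

v_add_or_zero = np.frompyfunc(add_or_zero, 2, 1)

vals = np.array([400.0, 200.0, 0.0, 500.0])   # hypothetical counts per 5-min bin
est = v_add_or_zero.accumulate(vals, dtype=object).astype(float)
print(est)   # [400. 285.   0. 185.]  -- first bin not drained; accum_minus would give 85 there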

1 Answer


How about interleaving the inbound and outbound events into a single frame?

In [15]: df
Out[15]: 
                      date_in     aet                    date_out
0  2012-12-05 10:08:59.318600  Z2XG17  2012-12-05 10:09:37.172300
1  2012-12-05 10:08:59.451300  Z2XG17  2012-12-05 10:09:38.048800
2  2012-12-05 10:08:59.587400  Z2XG17  2012-12-05 10:09:39.044100

In [16]: inbnd = pd.DataFrame({'event': 1}, index=df.date_in)

In [17]: outbnd = pd.DataFrame({'event': -1}, index=df.date_out)

In [18]: real_stream = pd.concat([inbnd, outbnd]).sort_index()

In [19]: real_stream
Out[19]: 
                            event
date                             
2012-12-05 10:08:59.318600      1
2012-12-05 10:08:59.451300      1
2012-12-05 10:08:59.587400      1
2012-12-05 10:09:37.172300     -1
2012-12-05 10:09:38.048800     -1
2012-12-05 10:09:39.044100     -1

In this format (one +1 or -1 event per row), the queue depth can easily be computed with cumsum().

In [20]: real_stream['depth'] = real_stream.event.cumsum()

In [21]: real_stream
Out[21]: 
                            event  depth
date                                    
2012-12-05 10:08:59.318600      1      1
2012-12-05 10:08:59.451300      1      2
2012-12-05 10:08:59.587400      1      3
2012-12-05 10:09:37.172300     -1      2
2012-12-05 10:09:38.048800     -1      1
2012-12-05 10:09:39.044100     -1      0

To simulate a different consumption rate, replace all the real outbound timestamps with a series of outbound timestamps at a fixed frequency. Since cumsum() does not work in this case (the depth must not drop below zero), I created a counting function that applies a floor.

In [53]: outbnd_1s = pd.DataFrame({'event': -1},
   ....:                          index=real_stream.event.resample("S").count().index)

In [54]: fixed_stream = pd.concat([inbnd, outbnd_1s]).sort()

In [55]: def make_floor_counter(floor):
   ....:     count = [0]
   ....:     def process(n):
   ....:         count[0] += n
   ....:         if count[0] < floor:
   ....:             count[0] = floor
   ....:         return count[0]
   ....:     return process
   ....: 

In [56]: fixed_stream['depth'] = fixed_stream.event.map(make_floor_counter(0))

In [57]: fixed_stream.head(8)
Out[57]: 
                            event  depth
2012-12-05 10:08:59            -1      0
2012-12-05 10:08:59.318600      1      1
2012-12-05 10:08:59.451300      1      2
2012-12-05 10:08:59.587400      1      3
2012-12-05 10:09:00            -1      2
2012-12-05 10:09:01            -1      1
2012-12-05 10:09:02            -1      0
2012-12-05 10:09:03            -1      0
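
To put this next to the 5-minute real queue from the question, one option (a sketch, reusing the frames built above) is to sample both depth columns on the same 5-minute grid:

# Sample the simulated and the real depth at the end of each 5-minute bin.
est_5min = fixed_stream['depth'].resample('5min').last()
real_5min = real_stream['depth'].resample('5min').last()

comparison = pd.DataFrame({'real': real_5min, 'est_1_sec': est_5min})
comparison.plot()
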
answered 2012-12-20 at 04:01