13

Looking for the fastest solution to a time-averaging problem.

I have a list of datetime objects. I need the average of the times (not including year, month, or day). This is what I've got so far:

import datetime as dtm
def avg_time(times):
    avg = 0
    for elem in times:
        avg += elem.second + 60*elem.minute + 3600*elem.hour
    avg //= len(times)  # integer division keeps rez parseable by strptime
    rez = str(avg//3600) + ' ' + str((avg%3600)//60) + ' ' + str(avg%60)
    return dtm.datetime.strptime(rez, "%H %M %S")

6 Answers

6

Here's a better way to solve this problem

Generate some sample datetimes

In [28]: i = date_range('20130101',periods=20000000,freq='s')

In [29]: i
Out[29]: 
<class 'pandas.tseries.index.DatetimeIndex'>
[2013-01-01 00:00:00, ..., 2013-08-20 11:33:19]
Length: 20000000, Freq: S, Timezone: None

Average the 20M times

In [30]: %timeit pd.to_timedelta(int((i.hour*3600+i.minute*60+i.second).mean()),unit='s')
1 loops, best of 3: 2.87 s per loop

Result as a timedelta (note that this needs numpy 1.7, and the to_timedelta part needs pandas 0.13, coming soon)

In [31]: pd.to_timedelta(int((i.hour*3600+i.minute*60+i.second).mean()),unit='s')
Out[31]: 
0   11:59:12
dtype: timedelta64[ns]

In seconds (this will work with pandas 0.12, numpy >= 1.6).

In [32]: int((i.hour*3600+i.minute*60+i.second).mean())
Out[32]: 43152
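
A rough sketch of the same approach on a recent pandas release (the sample datetimes below are made up for illustration; to_timedelta has long since been part of the top-level pandas namespace):

import pandas as pd

# Hypothetical sample data: any list or DatetimeIndex of datetimes works
times = pd.to_datetime(["2023-01-01 08:15:00",
                        "2023-01-01 09:45:00",
                        "2023-01-02 10:30:00"])

# Average only the time-of-day component, ignoring the dates
avg_seconds = int((times.hour * 3600 + times.minute * 60 + times.second).mean())
print(pd.to_timedelta(avg_seconds, unit="s"))  # 0 days 09:30:00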
answered 2013-10-30T14:52:29.567
6

Here's a short and sweet solution (though probably not the fastest). It takes the difference between each date in the list and an arbitrary reference date (which yields a datetime.timedelta), then sums those differences and averages them, and finally adds the reference date back.

import datetime

def avg(dates):
    any_reference_date = datetime.datetime(1900, 1, 1)
    deltas = [date - any_reference_date for date in dates]
    return any_reference_date + sum(deltas, datetime.timedelta()) / len(dates)
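
For example, a quick check with two made-up datetimes (note that this averages the full datetime, date included, not just the time of day):

from datetime import datetime

dates = [datetime(2023, 1, 1, 8, 0), datetime(2023, 1, 3, 10, 0)]
print(avg(dates))  # 2023-01-02 09:00:00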
answered 2019-07-12T11:11:43.547
3

I was looking for the same thing, and then I came across this. A really simple way to get the average of a list of datetime objects.

    import datetime
    # from datetime import datetime  # ----> optional: lets you drop the datetime.datetime prefix :)

    def easyAverage(datetimeList):  # ----> function declaration
        # timestamp() turns a datetime object into a Unix timestamp (seconds since the epoch),
        # so map() converts every datetime in the list and sum() adds them all up.
        sumOfTime = sum(map(datetime.datetime.timestamp, datetimeList))

        length = len(datetimeList)  # ----> self-explanatory

        # fromtimestamp() turns a Unix timestamp back into a datetime object.
        averageTimeInTimeStampFormat = datetime.datetime.fromtimestamp(sumOfTime / length)

        # strftime() formats the datetime object as a string.
        timeInHumanReadableForm = datetime.datetime.strftime(averageTimeInTimeStampFormat, "%H:%M:%S")
        return timeInHumanReadableForm

Or you can do all of that in one simple line:

    avgTime=datetime.datetime.strftime(datetime.datetime.fromtimestamp(sum(map(datetime.datetime.timestamp,datetimeList))/len(datetimeList)),"%H:%M:%S")
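
For example, with a made-up list of same-day datetimes (the timestamp round trip averages the full datetimes, so mixing different dates shifts the result):

    datetimeList = [datetime.datetime(2023, 1, 1, 8, 0),
                    datetime.datetime(2023, 1, 1, 10, 0)]
    print(easyAverage(datetimeList))  # 09:00:00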

Cheers,

answered 2016-09-28T20:24:39.403
2

At the very least you can use sum() with a generator expression to produce the total number of seconds:

from datetime import datetime, date, time

def avg_time(datetimes):
    total = sum(dt.hour * 3600 + dt.minute * 60 + dt.second for dt in datetimes)
    avg = total / len(datetimes)
    minutes, seconds = divmod(int(avg), 60)
    hours, minutes = divmod(minutes, 60)
    return datetime.combine(date(1900, 1, 1), time(hours, minutes, seconds))

Demo:

>>> from datetime import datetime, date, time, timedelta
>>> def avg_time(datetimes):
...     total = sum(dt.hour * 3600 + dt.minute * 60 + dt.second for dt in datetimes)
...     avg = total / len(datetimes)
...     minutes, seconds = divmod(int(avg), 60)
...     hours, minutes = divmod(minutes, 60)
...     return datetime.combine(date(1900, 1, 1), time(hours, minutes, seconds))
... 
>>> avg_time([datetime.now(), datetime.now() - timedelta(hours=12)])
datetime.datetime(1900, 1, 1, 7, 13)
answered 2013-10-30T12:09:20.983
1

This isn't the best solution, but it might help:

import datetime as dt

t1 = dt.datetime(2020,12,31,10,00,5)
t2 = dt.datetime(2021,1,1,17,20,15)

delta = t2-t1 #delta is a datetime.timedelta object and can be used in the + operation

avg = t1 + delta/2 #average of t1 and t2
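
Printing avg shows the midpoint of the two datetimes:

print(avg)  # 2021-01-01 01:40:10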
answered 2021-01-07T05:37:23.893
0

If you have a list of datetimes:

import pandas as pd
avg=pd.to_datetime(pd.Series(yourdatetimelist)).mean()

If you have a list of timedeltas:

import pandas as pd
avg=pd.to_timedelta(pd.Series(yourtimedeltalist)).mean()
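
A quick check with made-up data (assuming a reasonably recent pandas, where .mean() is defined for datetime and timedelta Series):

import pandas as pd
from datetime import datetime, timedelta

yourdatetimelist = [datetime(2023, 1, 1, 8, 0), datetime(2023, 1, 1, 10, 0)]
print(pd.to_datetime(pd.Series(yourdatetimelist)).mean())    # 2023-01-01 09:00:00

yourtimedeltalist = [timedelta(hours=1), timedelta(hours=3)]
print(pd.to_timedelta(pd.Series(yourtimedeltalist)).mean())  # 0 days 02:00:00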
answered 2021-12-03T09:13:15.540