
I'm wondering if there is any way to compute all of the same variables I'm already getting from deep feature synthesis (i.e. counts, sums, means, etc.), but for different time segments within a day?

I.e. the count of morning events (hours 0-12) as a separate variable from the count of evening events (hours 13-24).

Also, similarly, what is the easiest way to get counts by day of the week, day of the month, day of the year, etc.? A custom aggregation primitive?


1 Answer


Yes, this is possible. First, let's generate some random data, and then I'll walk through how to do it.

import featuretools as ft
import pandas as pd
import numpy as np

# make some random data
n = 100
events_df = pd.DataFrame({
    "id" : range(n),
    "customer_id": np.random.choice(["a", "b", "c"], n),
    "timestamp": pd.date_range("Jan 1, 2019", freq="1h", periods=n),
    "amount": np.random.rand(n) * 100 
})

The first thing we will do is add a new column for the segment we want to calculate features on:

def to_part_of_day(x):
    if x < 12:
        return "morning"
    elif x < 18:
        return "afternoon"
    else:
        return "evening"

events_df["time_of_day"] = events_df["timestamp"].dt.hour.apply(to_part_of_day)

Now we have a dataframe that looks like this:

   id customer_id           timestamp     amount time_of_day
0   0           a 2019-01-01 00:00:00  44.713802     morning
1   1           c 2019-01-01 01:00:00  58.776476     morning
2   2           a 2019-01-01 02:00:00  94.671566     morning
3   3           a 2019-01-01 03:00:00  39.271852     morning
4   4           a 2019-01-01 04:00:00  40.773290     morning
5   5           c 2019-01-01 05:00:00  19.815855     morning
6   6           a 2019-01-01 06:00:00  62.457129     morning
7   7           b 2019-01-01 07:00:00  95.114636     morning
8   8           b 2019-01-01 08:00:00  37.824668     morning
9   9           a 2019-01-01 09:00:00  46.502904     morning

Next, let's load it into our entity set:

# build an entity set with an "events" entity and derive a "customers" entity from customer_id
es = ft.EntitySet()
es.entity_from_dataframe(entity_id="events",
                         time_index="timestamp",
                         dataframe=events_df)

es.normalize_entity(new_entity_id="customers", index="customer_id", base_entity_id="events")

es.plot()

(es.plot() renders the entity set diagram: the events entity linked to the customers entity.)

Now, we are ready to set the segments we want to create aggregations for, using interesting_values:

es["events"]["time_of_day"].interesting_values = ["morning", "afternoon", "evening"]

Then we can run DFS and place the aggregation primitives we want computed on a per-segment basis in the where_primitives parameter:

fm, fl = ft.dfs(target_entity="customers",
                entityset=es,
                agg_primitives=["count", "mean", "sum"],
                trans_primitives=[],
                where_primitives=["count", "mean", "sum"])

fm

In the resulting feature matrix, you can now see that we have aggregations for each of morning, afternoon, and evening:

             COUNT(events)  MEAN(events.amount)  SUM(events.amount)  COUNT(events WHERE time_of_day = afternoon)  COUNT(events WHERE time_of_day = evening)  COUNT(events WHERE time_of_day = morning)  MEAN(events.amount WHERE time_of_day = afternoon)  MEAN(events.amount WHERE time_of_day = evening)  MEAN(events.amount WHERE time_of_day = morning)  SUM(events.amount WHERE time_of_day = afternoon)  SUM(events.amount WHERE time_of_day = evening)  SUM(events.amount WHERE time_of_day = morning)
customer_id                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                  
a                       37            49.753630         1840.884300                                           12                                          7                                         18                                          35.098923                                        45.861881                                        61.036892                                        421.187073                                      321.033164                                     1098.664063
b                       30            51.241484         1537.244522                                            3                                         10                                         17                                          45.140800                                        46.170996                                        55.300715                                        135.422399                                      461.709963                                      940.112160
c                       33            39.563222         1305.586314                                            9                                          7                                         17                                          50.129136                                        34.593936                                        36.015679                                        451.162220                                      242.157549                                      612.266545
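
The question also asks about counts by day of the week (or day of the month, etc.). The answer above does not cover that case, but the same interesting_values pattern should extend to it without a custom aggregation primitive; here is a minimal, untested sketch that reuses events_df from above (the day_name() labels and the rebuilt entity set are illustrative assumptions, not part of the original answer):

# label each event with its day of week, then treat those labels as segments
events_df["day_of_week"] = events_df["timestamp"].dt.day_name()   # e.g. "Monday"

es = ft.EntitySet()
es.entity_from_dataframe(entity_id="events",
                         time_index="timestamp",
                         dataframe=events_df)
es.normalize_entity(new_entity_id="customers", index="customer_id", base_entity_id="events")

# mark the day names as interesting values so DFS builds per-day aggregations
es["events"]["day_of_week"].interesting_values = ["Monday", "Tuesday", "Wednesday",
                                                  "Thursday", "Friday", "Saturday", "Sunday"]

fm, fl = ft.dfs(target_entity="customers",
                entityset=es,
                agg_primitives=["count"],
                trans_primitives=[],
                where_primitives=["count"])

# fm should now include features like COUNT(events WHERE day_of_week = Monday)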
answered Feb 7, 2019 at 19:56