
I have the following Spark dataframe:

+-------+----------+-----+
| Status|  date    |count|
+-------+----------+-----+
|Success|2019-09-06|23596|
|Failure|2019-09-06| 2494|
|Failure|2019-09-07| 1863|
|Success|2019-09-07|22399|
+-------+----------+-----+

I am trying to calculate the percentage of Success/Failure per date and add the result to the same PySpark dataframe. So far I can only compute the success or failure rate per group by creating several intermediate tables/dataframes. How can this be done with the same single dataframe, without creating new intermediate dataframes?

Expected output:

+-------+----------+-----+----------------------------+
| Status|  date    |count| Percent                    |
+-------+----------+-----+----------------------------+
|Success|2019-09-06|23596| =(23596/(23596+2494)*100)  |
|Failure|2019-09-06| 2494| =(2494/(23596+2494)*100)   |
|Failure|2019-09-07| 1863| =(1863/(1863+22399)*100)   |
|Success|2019-09-07|22399| =(22399/(1863+22399)*100)  |
+-------+----------+-----+----------------------------+
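As a quick sanity check, the four formulas in the Percent column above can be evaluated directly in plain Python (using the counts from the sample data):

```python
# Per-date totals from the sample data
day1_total = 23596 + 2494    # 2019-09-06
day2_total = 1863 + 22399    # 2019-09-07

# The four formulas from the expected-output column
print(23596 / day1_total * 100)  # Success share on 2019-09-06
print(2494 / day1_total * 100)   # Failure share on 2019-09-06
print(1863 / day2_total * 100)   # Failure share on 2019-09-07
print(22399 / day2_total * 100)  # Success share on 2019-09-07
```

The two percentages for each date sum to 100, which is a useful invariant to check after adding the column.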

1 Answer


You can use a window partitioned on the "date" column to group the same dates together, and then take the sum of the "count" column over this window:

import pyspark.sql.functions as F
from pyspark.sql.window import Window

# Partition rows by date so the sum is computed per day
window = Window.partitionBy(['date'])

# Divide each row's count by the per-date total to get its percentage
df = df.withColumn('Percent', F.col('count')/F.sum('count').over(window)*100)

df.show()
+-------+-------------------+-----+-----------------+
| Status|               date|count|          Percent|
+-------+-------------------+-----+-----------------+
|Failure|2019-09-07 00:00:00| 1883|7.754715427065316|
|Success|2019-09-07 00:00:00|22399|92.24528457293468|
|Success|2019-09-06 00:00:00|23596|90.44078190877731|
|Failure|2019-09-06 00:00:00| 2494|9.559218091222691|
+-------+-------------------+-----+-----------------+
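For intuition, here is a minimal plain-Python sketch of what the window computation does: first the per-date total (what F.sum('count').over(window) yields for each row), then a per-row division, with no separate aggregated dataframe. The sketch uses the counts from the question's data (note the run above shows 1883 for Failure on 2019-09-07, while the question's data has 1863):

```python
# Rows from the question's dataframe: (Status, date, count)
rows = [
    ("Success", "2019-09-06", 23596),
    ("Failure", "2019-09-06", 2494),
    ("Failure", "2019-09-07", 1863),
    ("Success", "2019-09-07", 22399),
]

# Step 1: sum of 'count' per date (the value the window sum attaches to each row)
totals = {}
for _, date, count in rows:
    totals[date] = totals.get(date, 0) + count

# Step 2: append the Percent value row by row, keeping the original rows intact
result = [(status, date, count, count / totals[date] * 100)
          for status, date, count in rows]

for row in result:
    print(row)
```

This mirrors why no intermediate dataframe is needed: the window function broadcasts the per-date total back onto each row of the original dataframe.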
Answered on 2019-09-16T15:09:44.750