I took the data you provided and created a small .csv file so that you can reproduce this... I also changed a few values to test that it actually works:
Index,Date,City,State,ID,County,Age,A,B,C
0,9/1/16,X,AL,360,BB County,29.0,negative,positive,positive
1,9/1/16,X,AL,360,BB County,1.0,positive,negative,negative
2,9/1/16,X,AL,360,BB County,10.0,positive,negative,negative
3,9/1/16,X,AL,360,BB County,11.0,negative,negative,negative
4,9/1/16,X,AR,718,LL County,67.0,negative,negative,negative
5,9/2/16,X,AR,728,JJ County,3.0,negative,negative,negative
6,9/2/16,X,AR,728,JJ County,8.0,positive,negative,negative
7,9/2/16,X,AR,728,JJ County,8.0,negative,negative,negative
8,9/3/16,X,AR,728,JJ County,14.0,negative,negative,negative
9,9/3/16,X,AR,728,JJ County,5.0,negative,negative,negative
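If you'd rather not create a file on disk, the same sample can also be read straight from an in-memory string; here is a small equivalent sketch using io.StringIO (csv_text is just a local name for the rows above):

import io
import pandas as pd

csv_text = """Index,Date,City,State,ID,County,Age,A,B,C
0,9/1/16,X,AL,360,BB County,29.0,negative,positive,positive
1,9/1/16,X,AL,360,BB County,1.0,positive,negative,negative
2,9/1/16,X,AL,360,BB County,10.0,positive,negative,negative
3,9/1/16,X,AL,360,BB County,11.0,negative,negative,negative
4,9/1/16,X,AR,718,LL County,67.0,negative,negative,negative
5,9/2/16,X,AR,728,JJ County,3.0,negative,negative,negative
6,9/2/16,X,AR,728,JJ County,8.0,positive,negative,negative
7,9/2/16,X,AR,728,JJ County,8.0,negative,negative,negative
8,9/3/16,X,AR,728,JJ County,14.0,negative,negative,negative
9,9/3/16,X,AR,728,JJ County,5.0,negative,negative,negative
"""

# pd.read_csv accepts any file-like object, so StringIO works in place of 'data.csv'
X = pd.read_csv(io.StringIO(csv_text), header=0, index_col=None).drop('Index', axis=1)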
After reading it in, here is what it looks like:
>>> X = pd.read_csv('data.csv', header=0, index_col=None).drop('Index', axis=1)
>>> print(X)
     Date  City  State   ID     County   Age         A         B         C
0  9/1/16     X     AL  360  BB County  29.0  negative  positive  positive
1  9/1/16     X     AL  360  BB County   1.0  positive  negative  negative
2  9/1/16     X     AL  360  BB County  10.0  positive  negative  negative
3  9/1/16     X     AL  360  BB County  11.0  negative  negative  negative
4  9/1/16     X     AR  718  LL County  67.0  negative  negative  negative
5  9/2/16     X     AR  728  JJ County   3.0  negative  negative  negative
6  9/2/16     X     AR  728  JJ County   8.0  positive  negative  negative
7  9/2/16     X     AR  728  JJ County   8.0  negative  negative  negative
8  9/3/16     X     AR  728  JJ County  14.0  negative  negative  negative
9  9/3/16     X     AR  728  JJ County   5.0  negative  negative  negative
Here is the function that gets applied to each group in the groupby call:
def _ct_id_pos(grp):
    # returns (count of 'positive' values in column A, total number of rows in the group)
    return grp[grp.A == 'positive'].shape[0], grp.shape[0]
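To see what it produces, you can pull a single group out and call it directly; a quick REPL check, assuming X is the DataFrame read above (the ('9/1/16', 360) group is the first four rows):

>>> grp = X.groupby(['Date', 'ID']).get_group(('9/1/16', 360))
>>> _ct_id_pos(grp)
(2, 4)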
This will be a two-step process... With pandas, you can group by several columns and apply the function above to each group.
# the following will have the tuple in one column
>>> X_prime = X.groupby(['Date', 'ID']).apply(_ct_id_pos).reset_index()
>>> print(X_prime)
     Date   ID       0
0  9/1/16  360  (2, 4)
1  9/1/16  718  (0, 1)
2  9/2/16  728  (1, 3)
3  9/3/16  728  (0, 2)
Note that the result of the groupby gives us a new column (labelled 0) with the tuples embedded in it, so the next step is to split them out into their own columns and drop the embedded column:
>>> X_prime[['Positive', 'Total']] = X_prime[0].apply(pd.Series)
>>> X_prime.drop([0], axis=1, inplace=True)
>>> print(X_prime)
     Date   ID  Positive  Total
0  9/1/16  360         2      4
1  9/1/16  718         0      1
2  9/2/16  728         1      3
3  9/3/16  728         0      2
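As a side note, if you'd rather skip the tuple-splitting step entirely, one alternative (just a sketch of the same logic, not the only way) is to have the grouped function return a named pd.Series, so apply builds the two columns directly:

def _ct_id_pos_series(grp):
    # same counts as _ct_id_pos, but returned as a named Series so the
    # 'Positive' and 'Total' columns come out of apply() already split
    return pd.Series({'Positive': (grp.A == 'positive').sum(),
                      'Total': grp.shape[0]})

X_prime = X.groupby(['Date', 'ID']).apply(_ct_id_pos_series).reset_index()

This should give the same Date/ID/Positive/Total table without the intermediate 0 column.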