I need to compute a weighted mean for every row (6M+ rows), but it is taking far too long. The column holding the weights is a character field, so weighted.mean cannot be applied to it directly.

Background data:

library(data.table)
library(stringr)
values <- c(1,2,3,4)
grp <- c("a", "a", "b", "b")
weights <- c("{10,0,0,0}", "{0,10,0,0}", "{10,10,0,0}", "{0,0,10,0}")
DF <- data.frame(cbind(grp, weights))
DT <- data.table(DF)

string.weighted.mean <- function(weights.x) {
  # Pull the numeric weights out of the "{w1,w2,w3,w4}" string; na.omit drops
  # the NA produced by the leading "{", then weight the external 'values'
  tmp.1 <- na.omit(as.numeric(unlist(str_split(string=weights.x, pattern="[^0-9]+"))))
  weighted.mean(x=values, w=tmp.1)
}

Here is how it can be done with data.frames (too slow):

DF$wm <- mapply(string.weighted.mean, DF$weights)
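
For reference (my own sanity check, easy to verify by hand): with the sample data above, wm comes out as 1.0, 2.0, 1.5 and 3.0 per row; e.g. "{10,10,0,0}" gives (10*1 + 10*2) / (10+10) = 1.5.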

This gets the job done, but it is far too slow (hours):

DT[, wm:=mapply(string.weighted.mean, weights)]

How can the last line be rewritten to speed it up?


2 Answers

# Give every row its own group, then let data.table evaluate the
# weighted mean once per group, i.e. once per row
DT[, rowid := 1:nrow(DT)]
setkey(DT, rowid)
DT[, wm := {
    weighted.mean(x=values, w=na.omit(as.numeric(unlist(str_split(string=weights, pattern="[^0-9]+")))))
}, by=rowid]
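
A fully vectorized alternative (my sketch, not part of this answer): if every weights string holds exactly four comma-separated numbers, data.table::tstrsplit can split the whole column in one pass, and the weighted mean then reduces to matrix arithmetic with no per-row function calls. The column name wm2 is just a scratch name for comparison:

# Split all weight strings at once into numeric columns V1..V4
wcols <- DT[, tstrsplit(gsub("[{}]", "", weights), ",", type.convert=TRUE)]
# Weighted mean per row: sum(w_i * v_i) / sum(w_i)
DT[, wm2 := as.vector(as.matrix(wcols) %*% values) / rowSums(wcols)]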
Answered 2013-01-23T01:23:59.103

Since the group does not seem to have anything to do with computing the weighted mean, I have simplified the problem a bit.

     values <- seq(4)

# A function to build a string of four random weights, each 0 or 10
     tstwts <- function()
     {
         w <- sample( c(0, 10), 4, replace = TRUE )
         paste0( "{", paste(w, collapse = ","), "}" )
     }

# Generate 100K strings and put them into a vector
     u <- replicate( 1e5, tstwts() )
     head(u)   # Check
     table(u)

# Function to compute a weighted mean from a weight string, using the
# external numeric vector 'values' (same length as the weights)
    f <- function(x)
         {
             valstr <- gsub( "[\\{\\}]", "", x )
             wts <- as.numeric( unlist( strsplit(valstr, ",") ) )
             sum(wts * values) / sum(wts) 
         }

# Apply the function f to each element of the vector of weights u
    v <- sapply(u, f)

# Some checks:
    head(v)
    table(v)

On my system, for 100K replications,

> system.time(sapply(u, f))
   user  system elapsed 
   3.79    0.00    3.83

The data.table version of this (without groups) would be

DT <- data.table( weights = u )
DT[, wt.mean := lapply(weights, f)]   # note: lapply makes wt.mean a list column
head(DT)
dim(DT)

On my system this takes

> system.time( DT[, wt.mean := lapply( weights, f )] )
   user  system elapsed 
   3.62    0.03    3.69

So on a system comparable to mine (Win7, 2.8 GHz dual-core chip, 8 GB RAM), expect roughly 35-40 seconds per million observations. YMMV.
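
As a point of comparison (my sketch, not part of this answer): since every string here holds exactly four weights, the whole computation can also be vectorized, avoiding the per-element function call entirely:

    valstrs <- gsub( "[\\{\\}]", "", u )
    W <- matrix( as.numeric(unlist(strsplit(valstrs, ","))), ncol = 4, byrow = TRUE )
    v2 <- as.vector( W %*% values ) / rowSums(W)
    all.equal( unname(v), v2 )   # should be TRUE (unname: sapply named v by u)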

Answered 2013-01-23T05:17:37.187