
I am using bnlearn in R, and I would like to know how the package calculates BIC-g (BIC in the Gaussian case).

Let's set up a structure for which I can get the BIC score as follows:

library(bnlearn)
X = iris[, 1:3]
names(X) = c("A", "B", "C")
Network = empty.graph(names(X))
bnlearn::score(Network, X, type="bic-g")

bnlearn gives me more detailed information about how this score is calculated,

bnlearn::score(Network, X, type="bic-g", debug=TRUE)

which results in

----------------------------------------------------------------
* processing node A.
  > loglikelihood is -184.041441.
  > penalty is 2.505318 x 2 = 5.010635.
----------------------------------------------------------------
* processing node B.
  > loglikelihood is -87.777815.
  > penalty is 2.505318 x 2 = 5.010635.
----------------------------------------------------------------
* processing node C.
  > loglikelihood is -297.588727.
  > penalty is 2.505318 x 2 = 5.010635.
[1] -584.4399

I know how to calculate the BIC for discrete data in a Bayesian network (see here), but I don't know how it generalizes to the joint Gaussian (multivariate normal) case.

Of course, it probably has to do with approximating the likelihood plus a penalty term, and it seems that the package computes a likelihood and a penalty for each node and then sums them up:

bnlearn::score(Network, X, type="loglik-g", debug=TRUE)
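
Indeed, the per-node terms in the bic-g debug output above (loglikelihood minus penalty) add up to the reported total:

(-184.041441 - 5.010635) + (-87.777815 - 5.010635) + (-297.588727 - 5.010635)
#[1] -584.4399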

But I would like to know how exactly the likelihood and the penalty themselves are calculated from the data.

I found some material explaining the Laplace approximation (see page 57), but I could not relate it to this.

Can anyone help me?


1 Answer


The BIC is calculated as

BIC = -2 * logLik + nparams * log(nobs)

but in bnlearn this is rescaled by -2 (see ?score), giving

BIC = logLik - 0.5 * nparams * log(nobs)
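
As a quick check of this rescaling (reusing X and Network from the question, and taking nparams = 6 for the empty network, i.e. 2 parameters per node, as explained just below):

ll = bnlearn::score(Network, X, type = "loglik-g")   # log-likelihood term alone
-2 * ll + 6 * log(nrow(X))        # BIC on the usual -2*logLik scale
#[1] 1168.88
ll - 0.5 * 6 * log(nrow(X))       # bnlearn's rescaled BIC
#[1] -584.4399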

So for your example, with no arcs, the likelihood is calculated from the marginal mean and standard deviation of each node, and the number of parameters of each node is given by 1 (intercept) + 1 (residual standard error) + the number of parents, e.g.

library(bnlearn)
X = iris[, 1:3]
names(X) = c("A", "B", "C")
Network = empty.graph(names(X))

# log-likelihood from the marginal mean and sd of each node
(ll = sum(sapply(X, function(i) dnorm(i, mean(i), sd(i), log=TRUE)))) 
#[1] -569.408
# 6 parameters: 2 (mean, sd) for each of the 3 nodes
(penalty = 0.5* log(nrow(X))* 6)
#[1] 15.03191

ll - penalty
#[1] -584.4399
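
The 2.505318 in each penalty line of the debug output is this same 0.5 * log(n) factor, multiplied by each node's parameter count (2 here, since no node has parents):

0.5 * log(nrow(X))
#[1] 2.505318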

If there are arcs, the log-likelihood is calculated using the fitted values and residuals. For the network

Network = set.arc(Network, "A", "B")

we need the log-likelihood components from nodes A and C,

(llA = with(X, sum(dnorm(A, mean(A), sd(A), log=TRUE))))
#[1] -184.0414
(llC = with(X, sum(dnorm(C, mean(C), sd(C), log=TRUE))))
#[1] -297.5887

and we get the conditional density of B given its parent A from a linear regression:

m = lm(B ~ A, X)
(llB = with(X, sum(dnorm(B, fitted(m), stats::sigma(m), log=TRUE))))
#[1] -86.73894
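
Equivalently, since B minus the fitted values are just the residuals of that regression, the same number comes from evaluating the residuals against a zero-mean normal with the residual standard error, which is what "fitted values and residuals" above refers to:

sum(dnorm(residuals(m), 0, stats::sigma(m), log=TRUE))
#[1] -86.73894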

Summing the components then gives

(ll = llA + llB + llC)
#[1] -568.3691
(penalty = 0.5* log(nrow(X))* 7)
#[1] 17.53722
ll - penalty
#[1] -585.9063 
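
The 7 parameters split across the nodes as 2 (A) + 3 (B, which now has a parent) + 2 (C), which matches the per-node penalties in the debug output below:

0.5 * log(nrow(X)) * c(A = 2, B = 3, C = 2)
#        A        B        C 
# 5.010635 7.515953 5.010635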

#  bnlearn::score(Network, X, type="bic-g", debug=TRUE)
# ----------------------------------------------------------------
# * processing node A.
#    loglikelihood is -184.041441.
#    penalty is 2.505318 x 2 = 5.010635.
# ----------------------------------------------------------------
# * processing node B.
#    loglikelihood is -86.738936.
#    penalty is 2.505318 x 3 = 7.515953.
# ----------------------------------------------------------------
# * processing node C.
#    loglikelihood is -297.588727.
#    penalty is 2.505318 x 2 = 5.010635.
# [1] -585.9063
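
Putting this together, here is a minimal sketch of the same per-node computation for an arbitrary DAG. bic_g_manual is just an illustrative helper (not part of bnlearn); it assumes each node is fit by lm() on its parents and penalised with 1 (intercept) + 1 (residual sd) + number of parents parameters, as above:

bic_g_manual = function(dag, data) {
  per.node = sapply(names(data), function(node) {
    pars = bnlearn::parents(dag, node)
    # intercept-only model for root nodes, otherwise regress on the parents
    f = reformulate(if (length(pars) > 0) pars else "1", response = node)
    m = lm(f, data = data)
    ll = sum(dnorm(data[[node]], fitted(m), stats::sigma(m), log = TRUE))
    k = length(coef(m)) + 1   # coefficients (incl. intercept) + residual sd
    ll - 0.5 * k * log(nrow(data))
  })
  sum(per.node)
}

bic_g_manual(Network, X)
#[1] -585.9063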
answered 2019-03-01T10:18:01.690