scipy.stats.entropy
computes the differential entropy of a continuous random variable. What estimation method and what formula does it actually use? (i.e., the differential entropy of, say, the norm or beta distribution)
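For reference, by "differential entropy" I mean h(X) = -∫ f(x) ln f(x) dx. As a quick sanity check (my own sketch, not from the scipy source, assuming the default loc=0, scale=1), the value returned for the normal distribution matches the closed-form Gaussian entropy 0.5 * ln(2*pi*e):

import numpy as np
from scipy import stats

# Closed-form differential entropy of N(0, 1): 0.5 * ln(2 * pi * e)
closed_form = 0.5 * np.log(2 * np.pi * np.e)
print(stats.norm.entropy())  # ~1.4189385332046727
print(closed_form)           # ~1.4189385332046727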
Below is its GitHub code. Differential entropy is the negative integral of pdf times log pdf, but I can't see that, or any log, written anywhere. Is it perhaps happening inside the call to integrate.quad?
def _entropy(self, *args):
    def integ(x):
        val = self._pdf(x, *args)
        return entr(val)

    # upper limit is often inf, so suppress warnings when integrating
    _a, _b = self._get_support(*args)
    with np.errstate(over='ignore'):
        h = integrate.quad(integ, _a, _b)[0]

    if not np.isnan(h):
        return h
    else:
        # try with different limits if integration problems
        low, upp = self.ppf([1e-10, 1. - 1e-10], *args)
        if np.isinf(_b):
            upper = upp
        else:
            upper = _b
        if np.isinf(_a):
            lower = low
        else:
            lower = _a
        return integrate.quad(integ, lower, upper)[0]
Source (lines 2501-2524): https://github.com/scipy/scipy/blob/master/scipy/stats/_distn_infrastructure.py
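Update: I suspect the log is hiding inside entr. As far as I can tell from the scipy source, entr here is scipy.special.entr, the elementwise function entr(p) = -p*log(p) (with entr(0) = 0), so integ(x) already returns -pdf(x)*log(pdf(x)) and quad simply integrates it over the support. A minimal sketch of that reading (my own check, not part of the scipy code):

import numpy as np
from scipy import integrate, stats
from scipy.special import entr

# entr(p) == -p * log(p) elementwise (and 0 at p == 0)
p = np.array([0.1, 0.5, 1.0])
assert np.allclose(entr(p), -p * np.log(p))

# Reproducing the quadrature for the standard normal pdf
h, _ = integrate.quad(lambda x: entr(stats.norm.pdf(x)), -np.inf, np.inf)
print(h)  # ~1.4189385, matches stats.norm.entropy()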