Thank you for this excellent question. In principle you should be able to solve this with symfit out of the box, but your question exposed a couple of small bugs. symfit is an evolving project, so unfortunately this happens from time to time. However, I have put together a workaround that runs on the current master branch, and I hope to release a new version soon that resolves these issues.
In principle, this is a combination of the global fitting example you already found and the LogLikelihood objective. With LogLikelihood you do not need to bin your data; the measurements are used directly. However, LogLikelihood does not yet seem to handle multi-component models correctly, so I have included a fixed version of LogLikelihood below.
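For reference, each component in the model below is the von Mises probability density, with a shared location mu and a component-specific concentration kappa; this is exactly what the template expression further down encodes:

    f(x; \mu, \kappa) = \frac{\exp(\kappa \cos(x - \mu))}{2 \pi I_0(\kappa)}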
import matplotlib.pyplot as plt
import numpy as np

from symfit import (
    Fit, parameters, variables, exp, cos, CallableModel, pi, besseli
)
from symfit.core.objectives import LogLikelihood
from symfit.core.minimizers import *
from symfit.core.printing import SymfitNumPyPrinter

# symbolic bessel is not converted into numerical yet, this monkey-patches it.
def _print_besseli(self, expr):
    return 'scipy.special.iv({}, {})'.format(*expr.args)

SymfitNumPyPrinter._print_besseli = _print_besseli
# creating the data
mu1, mu2 = .05, -.05 # Are these supposed to be opposite sign?
sigma1, sigma2 = 3.5, 2.5
n1, n2 = 8000, 9000
np.random.seed(42)
x1 = np.random.vonmises(mu1, sigma1, n1)
x2 = np.random.vonmises(mu2, sigma2, n2)
# Create a model for `n` different datasets.
n = 2
x, *xs = variables('x,' + ','.join('x_{}'.format(i) for i in range(1, n + 1)))
ys = variables(','.join('y_{}'.format(i) for i in range(1, n + 1)))
mu, kappa = parameters('mu, kappa')
kappas = parameters(','.join('k_{}'.format(i) for i in range(1, n + 1)),
                    min=0, max=10)
mu.min, mu.max = - np.pi, np.pi # Bound to 2 pi
# Create a model template, whose symbols we will replace for each component.
template = exp(kappa * cos(x - mu)) / (2 * pi * besseli(0, kappa))
model = CallableModel(
    {y_i: template.subs({kappa: k_i, x: x_i})
     for y_i, x_i, k_i in zip(ys, xs, kappas)}
)
print(model)
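
# Fixed LogLikelihood: handles multi-component models by summing the negative
# log-likelihood contribution of every component.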
class AlfredosLogLikelihood(LogLikelihood):
    def __call__(self, *args, **kwargs):
        evaluated_func = super(LogLikelihood, self).__call__(
            *args, **kwargs
        )
        ans = - sum([np.nansum(np.log(component))
                     for component in evaluated_func])
        return ans
fit = Fit(model, x_1=x1, x_2=x2, objective=AlfredosLogLikelihood)
x_axis = np.linspace(- np.pi, np.pi, 101)
fit_result = fit.execute()
print(fit_result)
x1_result, x2_result = model(x_1=x_axis, x_2=x_axis, **fit_result.params)
# plotting
plt.hist(x1, bins=40, density=True, histtype='step', color='#1f77b4')
plt.hist(x2, bins=40, density=True, histtype='step', color='#ff7f0e')
plt.plot(x_axis, x1_result, color='#1f77b4')
plt.plot(x_axis, x2_result, color='#ff7f0e')
plt.show()
This outputs the following:
[y_1(x_1; k_1, mu) = exp(k_1*cos(mu - x_1))/(2*pi*besseli(0, k_1)),
y_2(x_2; k_2, mu) = exp(k_2*cos(mu - x_2))/(2*pi*besseli(0, k_2))]
Parameter Value Standard Deviation
k_1 3.431673e+00 None
k_2 2.475649e+00 None
mu 1.030791e-02 None
Status message b'CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH'
Number of iterations 13
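If you want a quick sanity check on the shared mu, you could compare the fitted value against the circular means of the two raw samples. This is a minimal sketch of my own, not needed for the fit itself; it assumes the x1, x2, and fit_result objects from above:

    from scipy.stats import circmean

    # Circular means of the raw samples, on the same (-pi, pi] interval used in the fit.
    print(circmean(x1, high=np.pi, low=-np.pi))  # roughly mu1 = 0.05
    print(circmean(x2, high=np.pi, low=-np.pi))  # roughly mu2 = -0.05
    print(fit_result.params['mu'])               # the shared mu from the fit

Since the two samples were generated with opposite-sign locations, the shared mu should land somewhere between the two sample means, consistent with the small value reported above.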
I hope this points you in the right direction, and thank you for exposing this bug ;).