Can anyone recommend a linear regression package that will not only run the regression, but also compute the significance measure (std / mean) for each regression coefficient and compare it against the appropriate p-value with (N-k) degrees of freedom? Or at least provide output I can use to compute that myself?
Ideally for Python, but I'll also take R.
Thanks!
In R, lm() will fit a linear model, and summary() gives the full output, including coefficient estimates, standard errors, t statistics, and p-values: https://stat.ethz.ch/R-manual/R-patched/library/stats/html/lm.html
statsmodels provides all the standard inference for linear regression and other estimated models.
The output below is copied from this notebook: http://statsmodels.sourceforge.net/stable/examples/notebooks/generated/formulas.html
There is also a blog post with some explanation:
http://www.datarobot.com/blog/multiple-regression-using-statsmodels/
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = sm.datasets.get_rdataset("Guerry", "HistData").data  # example dataset used in the linked notebook
mod = ols(formula='Lottery ~ Literacy + Wealth + Region', data=df)
res = mod.fit()
print(res.summary())
OLS Regression Results
==============================================================================
Dep. Variable: Lottery R-squared: 0.338
Model: OLS Adj. R-squared: 0.287
Method: Least Squares F-statistic: 6.636
Date: Tue, 02 Dec 2014 Prob (F-statistic): 1.07e-05
Time: 12:52:16 Log-Likelihood: -375.30
No. Observations: 85 AIC: 764.6
Df Residuals: 78 BIC: 781.7
Df Model: 6
Covariance Type: nonrobust
===============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
-------------------------------------------------------------------------------
Intercept 38.6517 9.456 4.087 0.000 19.826 57.478
Region[T.E] -15.4278 9.727 -1.586 0.117 -34.793 3.938
Region[T.N] -10.0170 9.260 -1.082 0.283 -28.453 8.419
Region[T.S] -4.5483 7.279 -0.625 0.534 -19.039 9.943
Region[T.W] -10.0913 7.196 -1.402 0.165 -24.418 4.235
Literacy -0.1858 0.210 -0.886 0.378 -0.603 0.232
Wealth 0.4515 0.103 4.390 0.000 0.247 0.656
==============================================================================
Omnibus: 3.049 Durbin-Watson: 1.785
Prob(Omnibus): 0.218 Jarque-Bera (JB): 2.694
Skew: -0.340 Prob(JB): 0.260
Kurtosis: 2.454 Cond. No. 371.
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
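If you want the numbers themselves rather than the formatted table (e.g. to compute the significance measure or p-values yourself, as asked), the fitted results object exposes them directly. A minimal sketch using the standard statsmodels results attributes, continuing from the res object above:

# Pieces of the summary table, available on the fitted results object
res.params      # coefficient estimates
res.bse         # standard errors of the coefficients
res.tvalues     # t statistics (params / bse)
res.pvalues     # two-sided p-values using the residual degrees of freedom
res.df_resid    # residual degrees of freedom (N - k)
res.conf_int()  # 95% confidence intervals for the coefficients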