
My problem is getting a better result in terms of MSE; I would like the R² to be as good as with the linear model. So I used the sklearn.neural_network.MLPRegressor library to compare the two:

import time
import matplotlib.pyplot as plt
from scipy.stats import pearsonr
from sklearn import linear_model
from sklearn.neural_network import MLPRegressor

def testfit(v, k, subset=2, hls=(50, 50, 50, 10), acv='identity'):
    # prep variables (getX/gety build the lagged design matrix and target; defined elsewhere)
    n = len(v)
    n1 = n // subset                  # integer division so it can be used as a slice index
    X = getX(v[0:n1], k)
    y = gety(v[0:n1], k)
    # define models
    nn1 = MLPRegressor(hidden_layer_sizes=hls, activation=acv, solver='adam', alpha=0.01,
                       batch_size='auto', learning_rate='constant', learning_rate_init=0.1,
                       power_t=1, max_iter=50000, shuffle=True, random_state=None, tol=0.00001,
                       verbose=False, warm_start=False, momentum=0.9, nesterovs_momentum=True,
                       early_stopping=False, validation_fraction=0.5, beta_1=0.9, beta_2=0.999,
                       epsilon=1e-10)
    ols = linear_model.LinearRegression()
    # run models
    st = time.time()
    fnnw = nn1.fit(X, y)
    nnw_dur = time.time() - st
    st = time.time()
    flin = ols.fit(X, y)
    ols_dur = time.time() - st
    # goodness of fit on the held-out second half
    X2 = getX(v[n1:n], k)
    y2 = gety(v[n1:n], k)
    # neural network, in-sample
    yn = fnnw.predict(X)
    gin = pearsonr(y, yn)[0] ** 2     # in-sample R^2 (squared Pearson correlation)
    ginse = sum((y - yn) ** 2)        # in-sample sum of squared errors
    # neural network, out-of-sample
    yn2 = fnnw.predict(X2)
    oin = pearsonr(y2, yn2)[0] ** 2
    oinse = sum((y2 - yn2) ** 2)
    # OLS, in-sample
    yl = flin.predict(X)
    gil = pearsonr(y, yl)[0] ** 2
    gilse = sum((y - yl) ** 2)
    # OLS, out-of-sample
    yl2 = flin.predict(X2)
    oil = pearsonr(y2, yl2)[0] ** 2
    oilse = sum((y2 - yl2) ** 2)
    # plot the out-of-sample actuals against the OLS forecast
    plt.subplot(321)
    plt.plot(y2)
    plt.plot(yl2)

[figure: out-of-sample series y2 plotted together with the OLS forecast yl2]
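The quantities above (ginse, oinse, gilse, oilse) are sums of squared errors rather than MSE, and the R² values are squared Pearson correlations. A minimal sketch of the same out-of-sample comparison expressed with sklearn.metrics, assuming fnnw, flin, X2 and y2 are available as in the function above:

from sklearn.metrics import mean_squared_error, r2_score

# out-of-sample comparison on the held-out block (X2, y2)
yn2 = fnnw.predict(X2)   # neural network forecast
yl2 = flin.predict(X2)   # OLS forecast
print('MSE  NNW %.4f  OLS %.4f' % (mean_squared_error(y2, yn2), mean_squared_error(y2, yl2)))
print('R^2  NNW %.4f  OLS %.4f' % (r2_score(y2, yn2), r2_score(y2, yl2)))

(mean_squared_error divides by the number of samples, so it equals oinse / len(y2) rather than the raw sum; r2_score is the coefficient of determination, which in general is not identical to the squared Pearson correlation used above.)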

The best case in this scenario would be for my neural network's (NNW) MSE on the forecast at +1 to be smaller than the OLS MSE on the forecast at +1.

Or is it simply not possible to get a smaller error than the linear model this way?
