I am building a simple 3-layer neural network with the Lasagne package and testing it on a very small dataset (only 4 examples):
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])
y = np.array([[0, 0], [1, 0], [1, 1], [0, 1]])
However, the network fails to learn this mapping, and the predictions come out as:
pred = theano.function([input_var], [prediction])
np.round(pred(X), 2)

array([[[ 0.5 ,  0.5 ],
        [ 0.98,  0.02],
        [ 0.25,  0.75],
        [ 0.25,  0.75]]])
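In case it matters, this is how I check the output against the targets (using pred, X and y from above; pred returns a one-element list, so I take its first entry):

probs = pred(X)[0]                          # shape (4, 2) array of softmax outputs
print(np.round(probs))                      # hard 0/1 predictions
print(y)                                    # targets, for side-by-side comparison
print((np.round(probs) == y).all(axis=1))   # which of the 4 examples match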
The full code:
import numpy as np
import theano
import theano.tensor as T
import lasagne

def build_mlp(input_var=None):
    # 3 inputs -> two hidden layers of 4 ReLU units -> 2 softmax outputs
    l_in = lasagne.layers.InputLayer(shape=(None, 3), input_var=input_var)
    l_hid1 = lasagne.layers.DenseLayer(
            l_in, num_units=4,
            nonlinearity=lasagne.nonlinearities.rectify,
            W=lasagne.init.GlorotUniform())
    l_hid2 = lasagne.layers.DenseLayer(
            l_hid1, num_units=4,
            nonlinearity=lasagne.nonlinearities.rectify,
            W=lasagne.init.GlorotUniform())
    l_out = lasagne.layers.DenseLayer(
            l_hid2, num_units=2,
            nonlinearity=lasagne.nonlinearities.softmax)
    return l_out
input_var = T.lmatrix('inputs')
target_var = T.lmatrix('targets')
network = build_mlp(input_var)
prediction = lasagne.layers.get_output(network, deterministic=True)
loss = lasagne.objectives.squared_error(prediction, target_var)
loss = loss.mean()
params = lasagne.layers.get_all_params(network, trainable=True)
updates = lasagne.updates.nesterov_momentum(
        loss, params, learning_rate=0.01, momentum=0.9)
train_fn = theano.function([input_var, target_var], loss, updates=updates)
val_fn = theano.function([input_var, target_var], [loss])
Training:
num_epochs = 1000
for epoch in range(num_epochs):
    inputs, targets = (X, y)
    train_fn(inputs, targets)
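For completeness, here is the same loop with loss printing added (the interval is arbitrary); since train_fn returns the mean squared error, this shows whether the loss is actually decreasing:

num_epochs = 1000
for epoch in range(num_epochs):
    loss_value = train_fn(X, y)
    if epoch % 100 == 0:
        # print the mean squared-error loss every 100 epochs
        print("epoch {:4d}: loss {:.4f}".format(epoch, float(loss_value)))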
My guess is that the problem lies either in the nonlinearity used in the hidden layers or in the learning method.
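For example (only to illustrate the kind of change I have in mind, not something I know to be the fix), swapping the hidden nonlinearity for a sigmoid and the update rule for Adam would look like the snippet below, reusing l_in, loss and params from the code above:

# hypothetical variant, not a confirmed fix: sigmoid hidden units and Adam updates
l_hid1 = lasagne.layers.DenseLayer(
        l_in, num_units=4,
        nonlinearity=lasagne.nonlinearities.sigmoid,
        W=lasagne.init.GlorotUniform())

updates = lasagne.updates.adam(loss, params, learning_rate=0.01)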