I am trying to learn neural networks / Lasagne by working through the code in Robert Layton's Learning Data Mining with Python. I believe I am following the code as written, but I get the error message below. Any hints or intuition about what I am doing wrong would be much appreciated:
Traceback (most recent call last):
File "<ipython-input-78-3ff2950373de>", line 3, in <module>
updates=lasagne.updates.sgd(loss,all_params,learning_rate=0.01)
File "C:\Users\WouterD\Anaconda\lib\site-packages\lasagne\updates.py", line 134, in sgd
grads = get_or_compute_grads(loss_or_grads, params)
File "C:\Users\WouterD\Anaconda\lib\site-packages\lasagne\updates.py", line 110, in get_or_compute_grads
return theano.grad(loss_or_grads, params)
File "C:\Users\WouterD\Anaconda\lib\site-packages\theano-0.7.0-py2.7.egg\theano\gradient.py", line 551, in grad
handle_disconnected(elem)
File "C:\Users\WouterD\Anaconda\lib\site-packages\theano-0.7.0-py2.7.egg\theano\gradient.py", line 538, in handle_disconnected
raise DisconnectedInputError(message)
DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: W
Backtrace when the node is created:
File "C:\Users\WouterD\Anaconda\lib\site-packages\theano-0.7.0-py2.7.egg\theano\compile\sharedvalue.py", line 248, in shared
utils.add_tag_trace(var)
Here is the code:
import numpy as np
from sklearn.datasets import load_iris
iris = load_iris()
X = iris.data.astype(np.float32)
y_true = iris.target.astype(np.int32)
from sklearn.cross_validation import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y_true, random_state=14)
import lasagne
input_layer = lasagne.layers.InputLayer(shape=(10, X.shape[1]))
hidden_layer = lasagne.layers.DenseLayer(input_layer, num_units=12, nonlinearity=lasagne.nonlinearities.sigmoid)
output_layer = lasagne.layers.DenseLayer(hidden_layer, num_units=3, nonlinearity=lasagne.nonlinearities.softmax)
import theano.tensor as T
net_input = T.matrix('net_input')
net_output = output_layer.get_output_for(net_input)
true_output = T.ivector("true_output")
loss = T.mean(T.nnet.categorical_crossentropy(net_output, true_output))
all_params = lasagne.layers.get_all_params(output_layer)
updates = lasagne.updates.sgd(loss, all_params, learning_rate=0.01)
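For reference, this is a plain-NumPy sketch of the computation the network above is meant to represent: a 4 → 12 (sigmoid) → 3 (softmax) forward pass followed by a categorical cross-entropy loss. The weights, biases, and batch here are random placeholders I made up for illustration; this is not Lasagne's API or its actual initialisation.

```python
import numpy as np

rng = np.random.RandomState(14)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # subtract the per-row max before exponentiating, for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# hypothetical parameters, matching the shapes of the two DenseLayers above
W_hidden = rng.normal(size=(4, 12)).astype(np.float32)
b_hidden = np.zeros(12, dtype=np.float32)
W_out = rng.normal(size=(12, 3)).astype(np.float32)
b_out = np.zeros(3, dtype=np.float32)

X_batch = rng.normal(size=(10, 4)).astype(np.float32)  # one batch of 10 rows
y_batch = rng.randint(0, 3, size=10)                   # integer class labels

hidden = sigmoid(X_batch.dot(W_hidden) + b_hidden)     # hidden_layer output
probs = softmax(hidden.dot(W_out) + b_out)             # output_layer output

# categorical cross-entropy: mean negative log-probability of the true class
loss = -np.mean(np.log(probs[np.arange(10), y_batch]))
print(probs.shape, loss)
```

The point of writing it out this way is that the loss depends on *every* parameter through the chained matrix products; the symbolic graph Theano differentiates has to have that same chained structure for `theano.grad` to reach each weight matrix.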