I want to do the following:
import theano, numpy, theano.tensor as T
a = T.fvector('a')
w = theano.shared(numpy.array([1, 2, 3, 4], dtype=theano.config.floatX))
w_sub = w[1]
b = T.sum(a * w)
grad = T.grad(b, w_sub)  # raises DisconnectedInputError (traceback below)
Here, w_sub is, for example, w[1], but I don't want to have to write b explicitly as a function of w_sub. I have gone through this and other related questions, but I can't solve it.
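For this toy cost the gradient can be checked by hand: with b = sum(a * w), the gradient of b with respect to w is simply a, so the gradient with respect to the single entry w[1] is a[1]. A minimal NumPy sketch verifying that arithmetic (this only checks the math, it is not the Theano fix):

```python
import numpy as np

a = np.array([2.0, 3.0, 4.0, 5.0], dtype=np.float32)
w = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)

b = np.sum(a * w)       # the cost from the question
grad_w = a              # analytic gradient of b w.r.t. the full w
grad_w_sub = grad_w[1]  # gradient w.r.t. the single entry w[1]

# finite-difference check on w[1]
eps = 1e-3
w_plus = w.copy()
w_plus[1] += eps
fd = (np.sum(a * w_plus) - b) / eps
print(grad_w_sub, fd)   # both approximately 3.0
```

The finite-difference value agrees with the analytic one, which is why slicing a gradient taken with respect to the full w gives the same number as a (hypothetical) gradient with respect to w[1] alone.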
This is just to illustrate my problem. What I actually want to do is sparse convolution with Lasagne. The zero entries in the weight matrix do not need to be updated, so the gradient of w does not need to be computed for those entries.
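For the sparse-weight goal, one common pattern (an assumption on my part, not verified against Lasagne) is to take the gradient with respect to the full w and multiply it by the sparsity mask, so zero entries receive zero updates. The NumPy sketch below shows only the masking arithmetic, using the same cost b = sum(a * w):

```python
import numpy as np

a = np.array([2.0, 3.0, 4.0, 5.0], dtype=np.float32)
w = np.array([1.0, 0.0, 3.0, 0.0], dtype=np.float32)  # sparse weights
mask = (w != 0).astype(w.dtype)                       # 1 where a weight is trainable

grad_w = a                   # analytic gradient of sum(a * w) w.r.t. w
masked_grad = grad_w * mask  # zero entries of w get zero gradient

lr = 0.1
w_new = w - lr * masked_grad  # zero entries stay exactly zero
print(w_new)
```

In a Theano/Lasagne setting the same idea would be applied to the symbolic gradient before building the update rule, so the full gradient is still computed once but the frozen entries never change.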
Here is the full error message:
Traceback (most recent call last):
File "D:/Jeroen/Project_Lasagne_General/test_script.py", line 9, in <module>
grad = T.grad(b, w_sub)
File "C:\Anaconda2\lib\site-packages\theano\gradient.py", line 545, in grad
handle_disconnected(elem)
File "C:\Anaconda2\lib\site-packages\theano\gradient.py", line 532, in handle_disconnected
raise DisconnectedInputError(message)
theano.gradient.DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: Subtensor{int64}.0
Backtrace when the node is created:
File "D:/Jeroen/Project_Lasagne_General/test_script.py", line 6, in <module>
w_sub = w[1]