For learning purposes, my task is to implement linear and sigmoid operations in TensorFlow. I managed to get the linear operation working:
import numpy as np
import tensorflow as tf

def linear_op_forward(X, W):
    ''' Linear operation (numpy forward pass) '''
    return np.dot(X, W.T)

def linear_op_backward(op, grads):
    ''' Gradient of the linear operation '''
    X = op.inputs[0]
    W = op.inputs[1]
    dX = tf.multiply(grads, W)
    dW = tf.reduce_sum(tf.multiply(X, grads),
                       axis=0,
                       keep_dims=True)
    return dX, dW
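For context, this is roughly how I connect the numpy forward pass with the TensorFlow gradient (the usual TF 1.x tf.py_func + gradient_override_map trick; py_func_with_grad is just my own helper name, and the placeholder shapes are only an example):

def py_func_with_grad(func, inp, Tout, grad_fn, name=None):
    # Register the Python gradient under a unique name, then tell TensorFlow
    # to use it for the PyFunc node created below.
    grad_name = 'PyFuncGrad' + str(np.random.randint(0, 1 << 30))
    tf.RegisterGradient(grad_name)(grad_fn)
    g = tf.get_default_graph()
    with g.gradient_override_map({'PyFunc': grad_name}):
        return tf.py_func(func, inp, Tout, stateful=True, name=name)

X = tf.placeholder(tf.float64, shape=(None, 3))
W = tf.placeholder(tf.float64, shape=(1, 3))
y = py_func_with_grad(linear_op_forward, [X, W], tf.float64, linear_op_backward)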
But I am stuck on the sigmoid operation. Is this forward pass correct?
def sigmoid_op_forward(X):
    ''' Sigmoid operation (numpy forward pass) '''
    return 1 / (1 + np.exp(-X))
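A quick numpy sanity check of the forward pass (just the standard sigmoid values I expect):

x = np.array([[-1.0, 0.0, 1.0]])
print(sigmoid_op_forward(x))   # expect roughly [[0.269, 0.5, 0.731]]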
And I am having a hard time understanding the sigmoid gradient:
def sigmoid_op_backward(op, grads):
    ???
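My best guess, based on the derivative sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) and mirroring linear_op_backward, is the sketch below; I am not sure whether reusing op.outputs[0] for the forward result is correct, or whether I should recompute it as tf.sigmoid(op.inputs[0]):

def sigmoid_op_backward(op, grads):
    ''' my attempt: d(sigmoid)/dX = sigmoid(X) * (1 - sigmoid(X)) '''
    y = op.outputs[0]                       # forward result, sigmoid(X)
    return tf.multiply(grads, y * (1 - y))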
Can someone help?