I'm a beginner, and I'm trying to replace the sigmoid activation function with ReLU in the simple neural network below. Can I do that? I tried swapping out the sigmoid function, but it doesn't work (I've sketched what I tried after the code below). The output should behave like an AND gate (e.g. input (0, 0) -> output 0).
import numpy as np
# sigmoid function
def nonlin(x, deriv=False):
    # when deriv=True, x is expected to be the sigmoid *output*,
    # so this computes sigma'(z) = sigma(z) * (1 - sigma(z))
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))
# input dataset
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]])
# output dataset
y = np.array([[0, 0, 0, 1]]).T
# seed random numbers to make calculation
# deterministic (just a good practice)
np.random.seed(1)
# initialize weights randomly with mean 0
syn0 = 2*np.random.random((2, 1)) - 1
for _ in range(10000):
    # forward propagation
    l0 = X
    l1 = nonlin(np.dot(l0, syn0))

    # how much did we miss?
    l1_error = y - l1

    # scale the error by the slope of the activation at l1
    l1_delta = l1_error * nonlin(l1, deriv=True)

    # update weights
    syn0 += np.dot(l0.T, l1_delta)
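
For reference, this is roughly the kind of drop-in replacement I attempted (a sketch; my exact code may have differed slightly). Since nonlin(l1, True) is called on the layer *output*, I wrote the derivative in terms of the output too:

# ReLU replacement attempt (sketch of what I tried)
def nonlin(x, deriv=False):
    if deriv:
        # ReLU output is positive exactly where its input was positive,
        # so the derivative can also be computed from the output
        return (x > 0).astype(float)
    return np.maximum(0, x)

With this in place of the sigmoid version and everything else unchanged, the network no longer learns the AND outputs.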