
I'm using this article to implement a neural network with backpropagation, but having trouble calculating errors. In a nutshell, my sigmoid function is squashing all my node outputs to 1.0, which then causes the error calculation to return 0:

error = (expected - actual) * (1 - actual) * actual
                                    ^^ this term is 0 when actual == 1.0, zeroing the product

And so my error is always 0.

I suspect the problem lies in my sigmoid implementation, which is returning exactly 1.0 rather than asymptotically approaching it from below:

# ruby
def sigmoid(x)
  1/(1+Math.exp(-x))
end

Am I correct that sigmoid should never actually reach 1.0, or have I got something else wrong?


1 Answer


In a mathematical context you are correct: the sigmoid should never actually reach 1.0. In a practical programming environment, however, Math.exp(-x) eventually becomes so small that its difference from 0 is negligible, and you will get a result of exactly 1.0. Depending on the range of x, this is not a surprising result.
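To make that concrete (a minimal check of my own, assuming standard IEEE double-precision Floats): once Math.exp(-x) is well below Float::EPSILON, adding it to 1 rounds straight back to 1.0, so the sigmoid above returns exactly 1.0 for even moderately large x:

# ruby
# Math.exp(-40) is about 4.25e-18, well below Float::EPSILON (~2.22e-16),
# so 1 + Math.exp(-40) rounds to exactly 1.0 in double precision.
Math.exp(-40)              # => ~4.25e-18
1 + Math.exp(-40) == 1.0   # => true
sigmoid(40)                # => 1.0, which makes (1 - actual) exactly 0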

To work with the sigmoid approach, you should keep the sum of the incoming weights for each node at roughly 1. That keeps the sigmoid's output in a reasonable range and lets your weights converge faster.
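A rough sketch of that idea (my own illustration, not code from the answer; the array sizes and input values are made up): rescale each node's incoming weights so they sum to roughly 1, which keeps the weighted input to the sigmoid out of the saturated region:

# ruby
# Hypothetical helper: give a node n incoming weights that sum to ~1.
def init_weights(n)
  raw = Array.new(n) { rand }     # n random values in [0, 1)
  total = raw.sum
  raw.map { |w| w / total }       # rescale so the weights sum to 1.0
end

weights = init_weights(5)
inputs  = [0.2, 0.9, 0.5, 0.1, 0.7]
x = inputs.zip(weights).sum { |i, w| i * w }   # stays in roughly [0, 1]
sigmoid(x)                                     # well below 1.0

With inputs in [0, 1] and weights summing to 1, x never exceeds 1, so sigmoid(x) stays at most around 0.73 and the (expected - actual) * (1 - actual) * actual error term never collapses to 0.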

answered 2012-09-30T20:16:03.083