
I'm trying to get started with neural networks by implementing Boolean functions like AND/OR. Instead of using 0 and 1 as the binary inputs, the examples use -1 and +1. Is there a reason we cannot use (0, 1)? As an example: http://www.youtube.com/watch?v=Ih5Mr93E-2c


2 Answers


If you really mean inputs, there is no restriction on using {-1,1}. You can just as easily use {0,1} or any other pair of real numbers (e.g., {6,42}) to define your True/False input values.
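To illustrate (my own sketch, not from the answer above): a single step-activation perceptron can compute AND for either input encoding; the encoding only affects which weights and bias you pick. Here, coincidentally, the same parameters happen to work for both {0,1} and {-1,+1} inputs.

```python
def perceptron(x1, x2, w1, w2, b):
    """Step-activation perceptron: outputs 1 if the weighted sum is positive."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# AND with {0,1} inputs: fires only when both inputs are 1.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", perceptron(x1, x2, w1=1, w2=1, b=-1.5))

# AND with {-1,+1} inputs: the same weights and bias still work,
# since only (+1, +1) pushes the sum above zero.
for x1, x2 in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
    print(x1, x2, "->", perceptron(x1, x2, w1=1, w2=1, b=-1.5))
```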

What may be confusing you in the charts is that {-1,1} are used as the outputs of the neurons. The reason for that, as @Memming stated, is because of the activation function used for the neuron. If tanh is used for the activation function, the neuron's output will be in the range (-1,1), whereas if you use a logistic function, its output will be in the range (0,1). Either will work for a multi-layer perceptron - you just have to define your target value (expected output) accordingly.
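The two output ranges mentioned above can be checked directly. A small sketch (the `targets_*` dictionaries are my own illustration of matching targets to the activation):

```python
import math

def logistic(x):
    """Logistic sigmoid: squashes any real x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# tanh squashes to (-1, 1); the logistic function squashes to (0, 1).
for x in (-5.0, 0.0, 5.0):
    print(f"x={x:+.1f}  tanh={math.tanh(x):+.4f}  logistic={logistic(x):.4f}")

# So the target values for True/False should follow the activation:
targets_tanh = {False: -1.0, True: 1.0}      # for tanh output units
targets_logistic = {False: 0.0, True: 1.0}   # for logistic output units
```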

Answered 2013-10-01T14:02:39.383

In most cases there's no difference; just use a logistic activation function instead of tanh. In some special settings, e.g., the Ising model, it can nontrivially change the parameter space, though.
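One way to see why there's usually no difference: tanh and the logistic sigmoid are related by an affine transformation, tanh(x) = 2·σ(2x) − 1, so a network with tanh units can be rewritten as one with logistic units by rescaling weights, biases, and targets. A quick numerical check of the identity:

```python
import math

def logistic(x):
    """Logistic sigmoid sigma(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

# Verify tanh(x) == 2*sigma(2x) - 1 at several points.
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    lhs = math.tanh(x)
    rhs = 2.0 * logistic(2.0 * x) - 1.0
    assert abs(lhs - rhs) < 1e-12
    print(f"x={x:+.1f}  tanh={lhs:+.6f}  2*sigma(2x)-1={rhs:+.6f}")
```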

Answered 2013-10-01T12:55:38.073