
My goal is to solve the XOR problem using a Neural Network. I’ve read countless articles on the theory, proofs, and mathematics behind a multi-layered neural network. The theory makes sense (the math… not so much), but I have a few simple questions regarding the evaluation and topology of a Neural Network.

I feel I am very close to solving this problem, but I am beginning to question my topology and evaluation techniques. The complexities of back propagation aside, I just want to know if my approach to evaluation is correct. With that in mind, here are my questions:

  1. Assuming we have multiple inputs, does each respective input get its own node? Do we ever feed both values into a single node? Does the order in which we enter this information matter?

  2. While evaluating the graph output, does each node fire as soon as it gets a value? Or do we instead collect all the values from the layer above and only fire once all of the input has been consumed?

  3. Does the order of evaluation matter? For example, if a given node in layer “b” is ready to fire – but other nodes in that same layer are still awaiting input – should the ready node fire anyway? Or should all nodes in the layer be loaded up before firing?

  4. Should each node in a layer be connected to all nodes in the following layer?


I’ve attached a picture which should help explain (some of) my questions.

Thank you for your time!


1 Answer


1) Yes, each input gets its own node, and that node is always the node for that input. The order doesn't matter, you just have to keep it consistent. After all, an untrained neural net can learn to map any given set of linearly separable inputs to outputs, so the nodes don't need to be in any particular order for it to work.
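As a minimal sketch of what "each input gets its own node" means in practice (the variable name and encoding here are just an illustration, not something from the question): each input occupies a fixed slot in the input vector, and that ordering is reused for every training example.

```python
# Hypothetical encoding of the XOR training set: index 0 is always the
# first input, index 1 is always the second. The choice of order is
# arbitrary, but it must stay consistent across all examples.
xor_examples = [
    ([0.0, 0.0], 0.0),
    ([0.0, 1.0], 1.0),
    ([1.0, 0.0], 1.0),
    ([1.0, 1.0], 0.0),
]
```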

2 and 3) You need to collect all of the values from a single layer before any node in the next layer fires. This is important if you're using any activation function other than a step function, because the sum of the inputs affects the value that gets propagated forward, so you need to know what that sum is before you propagate anything.
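Here is a rough sketch of what "collect everything, then fire" looks like in code, assuming a sigmoid activation and NumPy; the function names are mine, not from the question or answer.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_layer(prev_layer_values, weights, biases):
    # Every node in this layer sees ALL of the previous layer's outputs
    # before it fires: first compute each node's weighted sum...
    z = weights @ prev_layer_values + biases
    # ...then apply the activation to that sum and fire the whole layer at once.
    return sigmoid(z)
```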

4) Which nodes connect to which other nodes is up to you. Since your network isn't going to be very large and XOR is a fairly simple problem, connecting every node in one layer to every node in the next layer (i.e., a fully connected neural network) will probably be simplest. There are special cases in other problems where some other topology would be better, but there is no easy way to figure that out (most people either use trial and error or genetic algorithms, as in NEAT), and for the purposes of this problem you don't need to worry about it.
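To make the fully connected case concrete, here is a hand-weighted 2-2-1 network with step activations that computes XOR. The weights are a standard textbook solution (hidden nodes acting roughly as OR and AND), not anything derived from your particular network, so treat this as an illustration of the topology rather than the implementation you should end up with after training.

```python
import numpy as np

def step(z):
    # Threshold activation: a node fires (outputs 1) if its weighted sum is positive.
    return (z > 0).astype(float)

# Fully connected 2-2-1 network with hand-picked weights for XOR:
# hidden node 1 ~ OR, hidden node 2 ~ AND, output ~ "OR and not AND".
W_hidden = np.array([[1.0, 1.0],
                     [1.0, 1.0]])
b_hidden = np.array([-0.5, -1.5])
W_out = np.array([[1.0, -1.0]])
b_out = np.array([-0.5])

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    h = step(W_hidden @ np.array(x, dtype=float) + b_hidden)  # whole hidden layer fires together
    y = step(W_out @ h + b_out)                               # then the output layer fires
    print(x, "->", int(y[0]))                                 # prints 0, 1, 1, 0
```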

Answered 2013-07-02T21:42:38.767