
There is a famous formula (used, e.g., in many MQL-bound FANN derivatives for MetaTrader)

n, n, n/2+1, 1

describing a 4-layer network (two hidden layers). Here n is the number of neurons in the input layer, the first hidden layer has n neurons, the second has n/2+1, and the output is a single boolean.
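
For concreteness, here is a minimal Python sketch (my own illustration, not from the original post) of the layer sizes this rule produces and a random forward pass through such a network; the helper names layer_sizes and forward are mine.

    import numpy as np

    def layer_sizes(n):
        # n inputs, first hidden layer n, second hidden layer n/2 + 1, one output
        return [n, n, n // 2 + 1, 1]

    def forward(x, weights):
        # plain fully connected layers with a sigmoid activation
        for W, b in weights:
            x = 1.0 / (1.0 + np.exp(-(x @ W + b)))
        return x

    n = 10
    sizes = layer_sizes(n)            # [10, 10, 6, 1]
    rng = np.random.default_rng(0)
    weights = [(rng.standard_normal((a, b)), np.zeros(b))
               for a, b in zip(sizes[:-1], sizes[1:])]
    output = forward(rng.standard_normal((1, n)), weights)  # shape (1, 1)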

Does anybody know the origin of this formula?


1 Answer


A 3-layer MLP can approximate continuous functions.

A 4-layer MLP is more general and can handle discontinuous functions (think of the step or Heaviside function).

There is no particular rhyme or reason to that topology, but having four layers broadens its applicability.
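
As a rough illustration of the point about discontinuous targets (my own sketch, not part of the original answer), the following fits a Heaviside-style step with a two-hidden-layer MLP using scikit-learn; the hidden layer sizes here are arbitrary and chosen only to show the extra hidden layer.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    X = np.linspace(-1.0, 1.0, 400).reshape(-1, 1)
    y = (X.ravel() >= 0.0).astype(float)          # step / Heaviside target

    model = MLPRegressor(hidden_layer_sizes=(20, 11),  # two hidden layers
                         activation="tanh",
                         max_iter=5000,
                         random_state=0)
    model.fit(X, y)
    print(model.predict([[-0.5], [0.5]]))          # should come out near 0 and 1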

Answered 2013-02-07T16:36:54.977