I just spent an hour reading a paper on Rectified Linear Units (ReLU), and I'm finding it hard to unpack the math involved.
Is it basically just saying that after you do the convolution and pooling, you change any negative values to 0?
Is that all there is to it?
You're almost right. However, for input values greater than zero, it acts as the ordinary linear (identity) function. So the function is:
f(x) = 0 when x < 0
f(x) = x when x ≥ 0

or equivalently: f(x) = max(0, x)
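For instance, here is a minimal NumPy sketch of applying ReLU elementwise to a small feature map (the `relu` name and the example values are just illustrative, not from any particular paper):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): negative entries become 0,
    # non-negative entries pass through unchanged.
    return np.maximum(0, x)

# A hypothetical 2x2 feature map, e.g. after convolution and pooling
feature_map = np.array([[-1.5,  2.0],
                        [ 0.0, -3.2]])

print(relu(feature_map))
# [[0.  2. ]
#  [0.  0. ]]
```

So yes, the negative values become 0, but the positive values are passed through untouched rather than being transformed in any other way.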