I understand that regularization usually adds a term like k*w^2 to the loss to penalize large weights. But Keras has two regularizer arguments — weight_regularizer and activity_regularizer. What is the difference between them?
The difference is that weight_regularizer penalizes large weights of the layer itself, while activity_regularizer is applied to the layer's output (its activations) and penalizes large outputs.
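Concretely, the two penalties can be sketched by hand with NumPy. This is a minimal illustration, not the Keras implementation: the coefficient k, the layer shapes, and the L2 form of both penalties are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 0.01  # assumed regularization strength

# A toy dense layer: y = relu(x @ W + b)
W = rng.normal(size=(4, 3))
b = np.zeros(3)
x = rng.normal(size=(8, 4))              # a batch of 8 inputs
activations = np.maximum(x @ W + b, 0)   # layer output

# weight_regularizer: penalizes large *weights*.
# This term is a property of the layer alone — it does not
# depend on which batch flows through.
weight_penalty = k * np.sum(W ** 2)

# activity_regularizer: penalizes large *outputs*.
# This term changes with every input batch.
activity_penalty = k * np.sum(activations ** 2)

print(weight_penalty, activity_penalty)
```

In practice, during training both penalties are simply added to the loss; the key distinction is that the activity penalty pushes the layer toward small (often sparse) activations, while the weight penalty pushes toward small weights regardless of the data.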