Besides $L_2$ regularization, another commonly used regularization method is dropout (random deactivation).
Dropout regularizes by going through each layer of the network and setting a probability of eliminating each node. The most common way to implement dropout is inverted dropout.
Below we illustrate with layer $l = 3$ of a three-layer network:
```python
keep_prob = 0.8                                            # probability of keeping a node
d3 = np.random.rand(a3.shape[0], a3.shape[1]) < keep_prob  # boolean dropout mask for layer 3
a3 = np.multiply(a3, d3)                                   # zero out the dropped activations
a3 /= keep_prob                                            # scale up so the expected value of a3 is unchanged
```
Note: dropout is not used at test time.
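The snippet above can be wrapped into a self-contained, runnable sketch. The function name `inverted_dropout` and the dummy activations `a3` are assumptions for illustration; the mask-then-rescale logic follows the steps shown above.

```python
import numpy as np

def inverted_dropout(a, keep_prob=0.8, seed=None):
    """Apply inverted dropout to an activation matrix `a` (training time only)."""
    rng = np.random.default_rng(seed)
    mask = rng.random(a.shape) < keep_prob  # keep each unit with probability keep_prob
    a = a * mask                            # zero out the dropped units
    return a / keep_prob                    # rescale so E[a] stays the same

a3 = np.ones((4, 5))                        # dummy layer-3 activations for illustration
out = inverted_dropout(a3, keep_prob=0.8, seed=0)
# Surviving entries become 1 / 0.8 = 1.25; dropped entries are 0.
```

Because of the final division, no extra scaling is needed at test time: the expected activation magnitude is the same whether or not dropout was applied.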