I am having trouble using batch normalization in Caffe. Here is the code I use in train_val.prototxt:

layer {
  name: "conv1"
  type: "Convolution"
  bottom: "conv0"
  top: "conv1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  convolution_param {
    num_output: 32
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.0589
    }
    bias_filler {
      type: "constant"
      value: 0
    }
    engine: CUDNN
  }
}
layer {
  name: "bnorm1"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"
  batch_norm_param {
    use_global_stats: false
  }
}
layer {
  name: "scale1"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"
  scale_param {
    bias_term: true
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}

layer {
  name: "conv16"
  type: "Convolution"
  bottom: "conv1"
  top: "conv16"
  param {
    lr_mult: 1
    decay_mult: 1
  }

However, training does not converge. If I remove the BN layers (batchnorm + scale), training does converge, so I started comparing the log files with and without the BN layers. Below are the log files with debug_info = true (a solver fragment showing how this flag is enabled is sketched right below):
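
For reference, debug_info is a flag in the solver configuration; a minimal solver.prototxt fragment that turns it on could look like the sketch below (the net path is just a placeholder):

net: "train_val.prototxt"
debug_info: true   # print the per-layer data/diff statistics shown below at every iteration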

With BN:

I0804 10:22:42.074671  8318 net.cpp:638]     [Forward] Layer loadtestdata, top blob data data: 0.368457
I0804 10:22:42.074757  8318 net.cpp:638]     [Forward] Layer loadtestdata, top blob label data: 0.514496
I0804 10:22:42.076117  8318 net.cpp:638]     [Forward] Layer conv0, top blob conv0 data: 0.115678
I0804 10:22:42.076200  8318 net.cpp:650]     [Forward] Layer conv0, param blob 0 data: 0.0455077
I0804 10:22:42.076273  8318 net.cpp:650]     [Forward] Layer conv0, param blob 1 data: 0
I0804 10:22:42.076539  8318 net.cpp:638]     [Forward] Layer relu0, top blob conv0 data: 0.0446758
I0804 10:22:42.078435  8318 net.cpp:638]     [Forward] Layer conv1, top blob conv1 data: 0.0675479
I0804 10:22:42.078516  8318 net.cpp:650]     [Forward] Layer conv1, param blob 0 data: 0.0470226
I0804 10:22:42.078589  8318 net.cpp:650]     [Forward] Layer conv1, param blob 1 data: 0
I0804 10:22:42.079108  8318 net.cpp:638]     [Forward] Layer bnorm1, top blob conv1 data: 0
I0804 10:22:42.079197  8318 net.cpp:650]     [Forward] Layer bnorm1, param blob 0 data: 0
I0804 10:22:42.079270  8318 net.cpp:650]     [Forward] Layer bnorm1, param blob 1 data: 0
I0804 10:22:42.079350  8318 net.cpp:650]     [Forward] Layer bnorm1, param blob 2 data: 0
I0804 10:22:42.079421  8318 net.cpp:650]     [Forward] Layer bnorm1, param blob 3 data: 0
I0804 10:22:42.079505  8318 net.cpp:650]     [Forward] Layer bnorm1, param blob 4 data: 0
I0804 10:22:42.080267  8318 net.cpp:638]     [Forward] Layer scale1, top blob conv1 data: 0
I0804 10:22:42.080345  8318 net.cpp:650]     [Forward] Layer scale1, param blob 0 data: 1
I0804 10:22:42.080418  8318 net.cpp:650]     [Forward] Layer scale1, param blob 1 data: 0
I0804 10:22:42.080651  8318 net.cpp:638]     [Forward] Layer relu1, top blob conv1 data: 0
I0804 10:22:42.082074  8318 net.cpp:638]     [Forward] Layer conv16, top blob conv16 data: 0
I0804 10:22:42.082154  8318 net.cpp:650]     [Forward] Layer conv16, param blob 0 data: 0.0485365
I0804 10:22:42.082226  8318 net.cpp:650]     [Forward] Layer conv16, param blob 1 data: 0
I0804 10:22:42.082675  8318 net.cpp:638]     [Forward] Layer loss, top blob loss data: 42.0327

Without BN:

I0803 17:01:29.700850 30274 net.cpp:638]     [Forward] Layer loadtestdata, top blob data data: 0.320584
I0803 17:01:29.700920 30274 net.cpp:638]     [Forward] Layer loadtestdata, top blob label data: 0.236383
I0803 17:01:29.701556 30274 net.cpp:638]     [Forward] Layer conv0, top blob conv0 data: 0.106141
I0803 17:01:29.701633 30274 net.cpp:650]     [Forward] Layer conv0, param blob 0 data: 0.0467062
I0803 17:01:29.701692 30274 net.cpp:650]     [Forward] Layer conv0, param blob 1 data: 0
I0803 17:01:29.701835 30274 net.cpp:638]     [Forward] Layer relu0, top blob conv0 data: 0.0547961
I0803 17:01:29.702193 30274 net.cpp:638]     [Forward] Layer conv1, top blob conv1 data: 0.0716117
I0803 17:01:29.702267 30274 net.cpp:650]     [Forward] Layer conv1, param blob 0 data: 0.0473551
I0803 17:01:29.702327 30274 net.cpp:650]     [Forward] Layer conv1, param blob 1 data: 0
I0803 17:01:29.702425 30274 net.cpp:638]     [Forward] Layer relu1, top blob conv1 data: 0.0318472
I0803 17:01:29.702781 30274 net.cpp:638]     [Forward] Layer conv16, top blob conv16 data: 0.0403702
I0803 17:01:29.702847 30274 net.cpp:650]     [Forward] Layer conv16, param blob 0 data: 0.0474007
I0803 17:01:29.702908 30274 net.cpp:650]     [Forward] Layer conv16, param blob 1 data: 0
I0803 17:01:29.703228 30274 net.cpp:638]     [Forward] Layer loss, top blob loss data: 11.2245

The strange thing is that in the forward pass, every layer from batchnorm onward gives 0! Another thing worth mentioning is that ReLU (an in-place layer) has only 4 lines, but batchnorm and scale (which should also be in-place layers) have 6 and 3 lines in the log file. Do you know what is wrong?


1 Answer


I don't know what is wrong with your "BatchNorm" layer, but it is very strange that, according to your debug log, your "BatchNorm" layer has 5 (!) internal parameter blobs (0..4). Looking at the source code batch_norm_layer.cpp, there should be only 3 internal parameter blobs:

this->blobs_.resize(3);

I suggest you make sure the "BatchNorm" implementation you are using is not broken.
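
For comparison, against the stock BVLC implementation the BatchNorm/Scale pair is commonly written as in the sketch below; the three explicit lr_mult: 0 blocks correspond to the three statistic blobs (mean, variance, moving-average factor) and are a widespread convention rather than a requirement, since recent BVLC versions force these multipliers to zero anyway:

layer {
  name: "bnorm1"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"
  # blobs 0..2 hold the running mean, running variance and the moving-average
  # factor; they are statistics, not weights learned by the solver
  param { lr_mult: 0 decay_mult: 0 }
  param { lr_mult: 0 decay_mult: 0 }
  param { lr_mult: 0 decay_mult: 0 }
  batch_norm_param {
    # false while training; if omitted, Caffe picks it from the phase
    use_global_stats: false
  }
}
layer {
  name: "scale1"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"
  scale_param {
    bias_term: true
  }
}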


Regarding the debug log, you can read more here about how to interpret it.
As for your question:

"Relu [...] has only 4 lines, but batchnorm and scale [...] have 6 and 3 lines in the log file"

Note that each layer gets one "top blob ... data" line reporting the L2 norm of its output blob.
In addition, there is one extra line for each internal parameter blob of a layer. A "ReLU" layer has no internal parameters, so no "param blob [...] data" line is printed for it. A "Convolution" layer has two internal parameters (the kernel and the bias), so it gets two extra lines, for blob 0 and blob 1.
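
Counting the lines in your "with BN" log with this rule: conv1 prints 1 + 2 = 3 lines (output, kernel, bias), bnorm1 prints 1 + 5 = 6 lines, scale1 prints 1 + 2 = 3 lines (output, scale, bias), and relu1 prints the single output line. That is where the 6 and 3 you noticed come from, and also how the 5 parameter blobs of your "BatchNorm" layer show up.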

Answered on 2017-08-08 at 06:47.