
I have two 60 x 80921 matrices, one filled with data and one with reference values.
I want to store the values as key/value pairs in two different LMDBs, one for training (say, slicing around the 60000-column mark) and one for testing. Below is what I have in mind; will it work?

X_train = X[:,:60000]
Y_train = Y[:,:60000]
X_test = X[:,60000:]
Y_test = Y[:,60000:]

X_train = X_train.astype(int)
X_test = X_test.astype(int)
Y_train = Y_train.astype(int)
Y_test = Y_test.astype(int)

map_size = X_train.nbytes * 10
env = lmdb.open('sensormatrix_train_lmdb', map_size=map_size)
with env.begin(write=True) as txn:  
    for i in range(60):
        for j in range(60000):
            datum = caffe.proto.caffe_pb2.Datum()
            datum.height = X_train.shape[0]
            datum.width = X_train.shape[1]
            datum.data = X_train[i,j].tobytes()
            datum.label= int(Y[i,j])
            str_id= '{:08}'.format(i)

I am really not sure about this code. What does format(i) in the last line refer to?


1 Answer


It is not 100% clear what you are trying to do: are you treating each entry as a separate data sample, or are you trying to train on 60K 1D vectors of dim=60...

Assuming you have 60K training samples of dim 60, you can write the training lmdbs like this:

env_x = lmdb.open('sensormatrix_train_x_lmdb', map_size=map_size) # you may want to make map_size a little bigger
env_y = lmdb.open('sensormatrix_train_y_lmdb', map_size=map_size)
with env_x.begin(write=True) as txn_x, env_y.begin(write=True) as txn_y:
    for i in xrange(X_train.shape[1]):
        x = X_train[:,i]
        y = Y_train[:,i] 

        datum_x = caffe.io.array_to_datum(arr=x.reshape((60,1,1)),label=i)
        datum_y = caffe.io.array_to_datum(arr=y.reshape((60,1,1)),label=i)
        keystr = '{:0>10d}'.format(i) # format an lmdb key for this entry
        txn_x.put( keystr, datum_x.SerializeToString() ) # actual write to lmdb
        txn_y.put( keystr, datum_y.SerializeToString() )
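As for the format(i) question: '{:08}'.format(i) in your snippet (and '{:0>10d}'.format(i) above) simply zero-pads the running index into a fixed-width string that serves as the lmdb key. LMDB keeps keys in lexicographic byte order, so fixed-width keys preserve the order in which the entries were written. A minimal illustration:

for i in [0, 2, 10, 123456]:
    print('{:0>10d}'.format(i))  # prints 0000000000, 0000000002, 0000000010, 0000123456

Without the padding, the key '10' would sort before the key '2'.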

Now that you have two lmdbs for training, your 'prototxt' should have two corresponding "Data" layers:

layer {
  name: "input_x"
  top: "x"
  top: "idx_x"
  type: "Data"
  data_param { source: "sensormatrix_train_x_lmdb" batch_size: 32 }
  include { phase: TRAIN }
}
layer {
  name: "input_y"
  top: "y"
  top: "idx_y"
  type: "Data"
  data_param { source: "sensormatrix_train_y_lmdb" batch_size: 32 }
  include { phase: TRAIN }
}

To make sure you are reading corresponding x-y pairs, you can add a sanity check:

layer {
  name: "sanity"
  type: "EuclideanLoss"
  bottom: "idx_x"
  bottom: "idx_y"
  top: "sanity"
  loss_weight: 0 
  propagate_down: false
  propagate_down: false
}
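If you would rather verify the pairing offline (before relying on the zero-weight loss above), a short read-back script along these lines should do it, assuming the same lmdb names and the caffe/lmdb Python modules used earlier; treat it as a sketch rather than a drop-in tool:

import lmdb
import caffe

env_x = lmdb.open('sensormatrix_train_x_lmdb', readonly=True)
env_y = lmdb.open('sensormatrix_train_y_lmdb', readonly=True)
with env_x.begin() as txn_x, env_y.begin() as txn_y:
    for (key_x, val_x), (key_y, val_y) in zip(txn_x.cursor(), txn_y.cursor()):
        # keys were written identically for x and y, so they must match pairwise
        assert key_x == key_y, 'key mismatch: %s vs %s' % (key_x, key_y)
        datum_x = caffe.proto.caffe_pb2.Datum()
        datum_y = caffe.proto.caffe_pb2.Datum()
        datum_x.ParseFromString(val_x)
        datum_y.ParseFromString(val_y)
        # both datums were labeled with the column index i when written
        assert datum_x.label == datum_y.label, 'label mismatch'
print('x/y lmdbs are aligned')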
answered 2016-04-06T12:11:05.417