
I am trying to update the weights only at the end of each batch. I know this is the default behavior, but I don't understand why X and y need to have the same number of samples. If my X has shape (12, 32, 64), where 12 is the batch size, so there is only one batch, why isn't it enough for y to have shape (1, N)?

I only want to backpropagate after the entire batch has been shown to the network. Why does every item in the batch need its own label?
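To make the check concrete, here is a minimal sketch with dummy zero data (same shapes as my real code below). As far as I can tell, Keras treats the first axis of both arrays as the sample axis and compares them before training:

import numpy as np
from keras.models import Model
from keras.layers import Input, LSTM, Dense

# Same architecture as below, fed with dummy zeros just to show the shape check.
inp = Input(batch_shape=(12, 640, 32))
out = Dense(4, activation='linear')(LSTM(32, stateful=True)(inp))
toy = Model(inputs=inp, outputs=out)
toy.compile(loss='mse', optimizer='rmsprop')

X_batch = np.zeros((12, 640, 32))   # 12 samples in the batch
y_one   = np.zeros((1, 4))          # one target for the whole batch
y_per   = np.zeros((12, 4))         # one target per sample

toy.train_on_batch(X_batch, y_per)    # accepted
# toy.train_on_batch(X_batch, y_one)  # raises the ValueError quoted below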

Example code:

import numpy as np
from keras.models import Model
from keras.layers import Input, LSTM, Dense
from keras.utils import plot_model

def create_model(batch, timesteps, features):
    # Stateful LSTM: the batch size must be fixed in the input shape.
    inputTensor1 = Input(batch_shape=(batch, timesteps, features))
    lstm1 = LSTM(32, stateful=True, dropout=0.2)(inputTensor1)
    x = Dense(4, activation='linear')(lstm1)
    model = Model(inputs=inputTensor1, outputs=x)
    model.compile(loss='mse', optimizer='rmsprop', metrics=['mse'])
    print(model.summary())
    plot_model(model, show_shapes=True, show_layer_names=True)

    return model

X = np.load("").reshape(1280, 12, 640, 32)  # 1280 batches of 12 sequences, 640 timesteps x 32 features each
y = np.load("").reshape(1280, 1, 4)         # one 4-dimensional target per batch

prop_train = 0.8
ntrain = int(X.shape[0]*prop_train)

X_train, X_val = X[:ntrain], X[ntrain:]
y_train, y_val = y[:ntrain], y[ntrain:]

model = create_model(12, 640, 32)

for j in range(1):
    for i in range(X_train.shape[0]):
        print(i)
        model.reset_states()  # reset the LSTM state before each independent batch
        history = model.train_on_batch(X_train[i], y_train[i])  # X_train[i]: (12, 640, 32), y_train[i]: (1, 4)

This is where I get the error:

ValueError: Input arrays should have the same number of samples as target arrays. Found 12 input samples and 1 target samples
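One workaround I can think of, assuming the single label really applies to every sequence in the batch (my reading of my own setup, but I may be wrong), is to repeat the (1, 4) target along the sample axis so the shapes match:

# Possible workaround, assuming the one label is shared by all 12 sequences:
# tile the target along the sample axis so Keras accepts the shapes.
y_batch = np.repeat(y_train[i], 12, axis=0)          # (1, 4) -> (12, 4)
history = model.train_on_batch(X_train[i], y_batch)  # shapes now match

Since the batch loss is then the mean over 12 identical targets, the weights are still updated only once per batch. Is that the intended way to do this, or am I missing something?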
