Layer (type) Output Shape Param #
=================================================================
input_13 (InputLayer) (None, 5511, 101) 0
_________________________________________________________________
conv1d_13 (Conv1D) (None, 1375, 196) 297136
_________________________________________________________________
batch_normalization_27 (Batc (None, 1375, 196) 784
_________________________________________________________________
activation_13 (Activation) (None, 1375, 196) 0
_________________________________________________________________
dropout_34 (Dropout) (None, 1375, 196) 0
_________________________________________________________________
gru_18 (GRU) (None, 1375, 128) 124800
_________________________________________________________________
dropout_35 (Dropout) (None, 1375, 128) 0
_________________________________________________________________
batch_normalization_28 (Batc (None, 1375, 128) 512
_________________________________________________________________
gru_19 (GRU) (None, 1375, 128) 98688
_________________________________________________________________
dropout_36 (Dropout) (None, 1375, 128) 0
_________________________________________________________________
batch_normalization_29 (Batc (None, 1375, 128) 512
_________________________________________________________________
dropout_37 (Dropout) (None, 1375, 128) 0
_________________________________________________________________
time_distributed_11 (TimeDis (None, 1375, 1) 129
=================================================================
Total params: 522,561
Trainable params: 521,657
Non-trainable params: 904
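As a sanity check on the summary above, the parameter counts can be reproduced by hand. This is a sketch assuming the older Keras GRU formulation (`reset_after=False`, one bias per gate), which is the only one consistent with the printed counts:

```python
def conv1d_params(in_channels, filters, kernel_size):
    # weights (in_channels * kernel * filters) + one bias per filter
    return in_channels * kernel_size * filters + filters

def gru_params(input_dim, units):
    # old-style Keras GRU (reset_after=False): 3 gates,
    # each with input weights, recurrent weights, and one bias
    return 3 * ((input_dim + units) * units + units)

def batchnorm_params(channels):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * channels

print(conv1d_params(101, 196, 15))   # 297136, matches conv1d_13
print(gru_params(196, 128))          # 124800, matches gru_18
print(gru_params(128, 128))          # 98688, matches gru_19
print(batchnorm_params(196))         # 784, matches batch_normalization_27
```

The non-trainable count of 904 is exactly the moving statistics of the three BatchNormalization layers: 2·196 + 2·128 + 2·128.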
ValueError: Error when checking target: expected time_distributed_3 to have shape (1375, 1) but got array with shape (5511, 101)
I am feeding a .npy file as input to the CNN layer. The array has shape (5, 5511, 101). Is there a problem with my input array, and how can I fix this ValueError? I am using Keras in a Jupyter notebook and could not find any solution. Any help would be appreciated.
Code snippet @ErselEr ... this is the code I used to build the model:
from keras.models import Model
from keras.layers import (Input, Conv1D, BatchNormalization, Activation,
                          Dropout, GRU, TimeDistributed, Dense)

def model(input_shape):
    X_input = Input(shape=input_shape)

    ### START CODE HERE ###
    # Step 1: CONV layer (≈4 lines)
    X = Conv1D(196, kernel_size=15, strides=4)(X_input)  # CONV1D
    X = BatchNormalization()(X)                          # batch normalization
    X = Activation('relu')(X)                            # ReLU activation
    X = Dropout(0.8)(X)                                  # dropout (use 0.8)

    # Step 2: First GRU layer (≈4 lines)
    X = GRU(units=128, return_sequences=True)(X)  # GRU (use 128 units and return the sequences)
    X = Dropout(0.8)(X)                           # dropout (use 0.8)
    X = BatchNormalization()(X)                   # batch normalization

    # Step 3: Second GRU layer (≈4 lines)
    X = GRU(units=128, return_sequences=True)(X)  # GRU (use 128 units and return the sequences)
    X = Dropout(0.8)(X)                           # dropout (use 0.8)
    X = BatchNormalization()(X)                   # batch normalization
    X = Dropout(0.8)(X)                           # dropout (use 0.8)

    # Step 4: Time-distributed dense layer (≈1 line)
    X = TimeDistributed(Dense(1, activation="sigmoid"))(X)  # time-distributed (sigmoid)
    ### END CODE HERE ###

    model = Model(inputs=X_input, outputs=X)
    return model
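The ValueError itself comes from a length mismatch, not from the code above: with kernel_size=15, strides=4, and the default "valid" padding, Conv1D maps the 5511 input timesteps down to floor((5511 - 15) / 4) + 1 = 1375, so the targets passed to fit() must have shape (m, 1375, 1) to match the model's output, not the input shape (m, 5511, 101). A minimal sketch verifying this, using plain Python and NumPy (no Keras needed); the variable names are illustrative:

```python
import numpy as np

def conv1d_output_length(n, kernel_size, strides):
    # Keras "valid" padding: floor((n - kernel_size) / strides) + 1
    return (n - kernel_size) // strides + 1

Ty = conv1d_output_length(5511, kernel_size=15, strides=4)
print(Ty)  # 1375

# Targets must match the model's output shape, not the input's:
m = 5  # number of training examples, per the question
Y = np.zeros((m, Ty, 1))
print(Y.shape)  # (5, 1375, 1)
```

So the input array of shape (5, 5511, 101) is fine; the array being passed as the target needs to be the (5, 1375, 1) labels instead.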