I am trying to implement CTC loss with Keras for my simplified neural network:
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, backend as K
from tensorflow.keras.layers import LSTM, Lambda
from tensorflow.keras.models import Model

def ctc_lambda_func(args):
    y_pred, y_train, input_length, label_length = args
    return K.ctc_batch_cost(y_train, y_pred, input_length, label_length)
x_train = x_train.reshape(x_train.shape[0], 20, 10).astype('float32')

input_data = layers.Input(shape=(20, 10))
x = layers.Convolution1D(filters=256, kernel_size=3, padding="same", strides=1, use_bias=False, activation='relu')(input_data)
x = layers.BatchNormalization()(x)
x = layers.Dropout(0.2)(x)
x = layers.Bidirectional(LSTM(units=200, return_sequences=True))(x)
x = layers.BatchNormalization()(x)
x = layers.Dropout(0.2)(x)
y_pred = layers.Dense(5, activation='softmax')(x)

fun = Model(input_data, y_pred)
# fun.summary()
label_length = np.zeros((3800, 1))
input_length = np.zeros((3800, 1))
for i in range(3799):
    label_length[i, 0] = 4
    input_length[i, 0] = 5

y_train = np.array(y_train)
x_train = np.array(x_train)
input_length = np.array(input_length)
label_length = np.array(label_length)
loss_out = Lambda(ctc_lambda_func, output_shape=(1,), name='ctc')([y_pred, y_train, input_length, label_length])
model = keras.models.Model(inputs=[input_data, y_train, input_length, label_length], outputs=loss_out)
model.compile(loss={'ctc': lambda y_train, y_pred: y_pred}, optimizer='adam')
model.fit(x=[x_train, y_train, input_length, label_length], epochs=10, batch_size=100)
My y_true (or y_train) has shape (3800, 4), because I set label_length=4 and input_length=5 (blank + 1).
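For reference, here is a minimal standalone sketch (with made-up dummy labels, not my real dataset) of the shapes that K.ctc_batch_cost accepts; it assumes a batch of 2, the same 20 time steps and 5 softmax classes as the network above, and label sequences of length 4:

import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

batch, time_steps, num_classes, max_label_len = 2, 20, 5, 4

# y_true: dense integer label sequences, shape (batch, max_label_len); blank is class num_classes - 1
y_true = np.array([[1, 2, 3, 0],
                   [2, 2, 1, 3]], dtype=np.float32)
# y_pred: per-time-step class probabilities, shape (batch, time_steps, num_classes)
y_pred = tf.nn.softmax(tf.random.normal((batch, time_steps, num_classes)), axis=-1)
# input_length / label_length: shape (batch, 1), one entry per sample
input_length = np.full((batch, 1), time_steps, dtype=np.int64)
label_length = np.full((batch, 1), max_label_len, dtype=np.int64)

loss = K.ctc_batch_cost(y_true, y_pred, input_length, label_length)
print(loss.shape)  # (2, 1): one CTC loss value per sample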
I am facing this error:
ValueError: Input tensors to a Model must come from `tf.keras.Input`. Received: [[0. 1. 0. 0.]
[0. 1. 0. 0.]
[0. 1. 0. 0.]
...
[1. 0. 0. 0.]
[1. 0. 0. 0.]
[1. 0. 0. 0.]] (missing previous layer metadata).
y_true looks like this:
[[0. 1. 0. 0.]
[0. 1. 0. 0.]
...
[1. 0. 0. 0.]
[1. 0. 0. 0.]
[1. 0. 0. 0.]]
What is my problem?