
Given that you have a basic model similar to this:

input_layer = layers.Input(shape=(50, 20))
layer = layers.Dense(123, activation='relu')(input_layer)
layer = layers.LSTM(128, return_sequences=True)(layer)
outputs = layers.Dense(20, activation='softmax')(layer)
model = Model(input_layer, outputs)

How would you implement CTC loss? I tried something from the Keras code tutorial on OCR, as shown below:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Model

class CTCLayer(layers.Layer):
    def __init__(self, name=None):
        super().__init__(name=name)
        self.loss_fn = keras.backend.ctc_batch_cost

    def call(self, y_true, y_pred):
        # Compute the training-time loss value and add it
        # to the layer using `self.add_loss()`.
        batch_len = tf.cast(tf.shape(y_true)[0], dtype="int64")
        input_length = tf.cast(tf.shape(y_pred)[1], dtype="int64")
        label_length = tf.cast(tf.shape(y_true)[1], dtype="int64")

        input_length = input_length * tf.ones(shape=(batch_len, 1), dtype="int64")
        label_length = label_length * tf.ones(shape=(batch_len, 1), dtype="int64")

        loss = self.loss_fn(y_true, y_pred, input_length, label_length)
        self.add_loss(loss)

        # At test time, just return the computed predictions
        return y_pred

labels = layers.Input(shape=(None,), dtype="float32")
input_layer = layers.Input(shape=(50, 20))
layer = layers.Dense(123, activation='relu')(input_layer)
layer = layers.LSTM(128, return_sequences=True)(layer)
outputs = layers.Dense(20, activation='softmax')(layer)
output = CTCLayer()(labels, outputs)
model = Model(input_layer, outputs)

However, it falls apart when it comes to the model.fit part, because I don't know how to feed the model anything for the "labels" input layer. I don't find the approach in that tutorial very clear, so what would be a better, more efficient way to implement CTC loss?


1 Answer


The only thing you're doing wrong is the model creation: model = Model(input_layer, outputs) should be model = Model([input_layer, labels], output). Since the CTCLayer already attaches the loss via add_loss, you then compile with just an optimizer and pass the labels as a second input, e.g. model.fit([x_train, y_train], ...).
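Spelled out end to end, that two-input setup might look like the following sketch. It is untested and makes some assumptions: TF 2.x with the Keras 2 backend API (where keras.backend.ctc_batch_cost still exists), and x_train / y_train are made-up dummy arrays just to show the shapes.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Model

# CTCLayer as defined in the question
class CTCLayer(layers.Layer):
    def __init__(self, name=None):
        super().__init__(name=name)
        self.loss_fn = keras.backend.ctc_batch_cost

    def call(self, y_true, y_pred):
        batch_len = tf.cast(tf.shape(y_true)[0], dtype="int64")
        input_length = tf.cast(tf.shape(y_pred)[1], dtype="int64") * tf.ones((batch_len, 1), dtype="int64")
        label_length = tf.cast(tf.shape(y_true)[1], dtype="int64") * tf.ones((batch_len, 1), dtype="int64")
        self.add_loss(self.loss_fn(y_true, y_pred, input_length, label_length))
        return y_pred

labels = layers.Input(shape=(None,), dtype="float32")
input_layer = layers.Input(shape=(50, 20))
layer = layers.Dense(123, activation="relu")(input_layer)
layer = layers.LSTM(128, return_sequences=True)(layer)
outputs = layers.Dense(20, activation="softmax")(layer)
output = CTCLayer()(labels, outputs)

# Both the features and the labels are model inputs
model = Model([input_layer, labels], output)
model.compile(optimizer="adam")  # no loss argument: CTCLayer adds it

# Made-up dummy data: labels use classes 0..18; class 19 is the CTC blank
x_train = np.random.random((8, 50, 20)).astype("float32")
y_train = np.random.randint(0, 19, size=(8, 10)).astype("float32")
model.fit([x_train, y_train], batch_size=4, epochs=1, verbose=0)
```

For inference you would build a separate Model(input_layer, outputs) that skips the labels input, as the OCR tutorial does.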

If you don't want two inputs, you can instead compile the model with tf.nn.ctc_loss as the loss:

def my_loss_fn(y_true, y_pred):
    batch_len = tf.shape(y_true)[0]
    # Full padded lengths per sample, derived as in the CTC layer
    y_pred_length = tf.fill([batch_len], tf.shape(y_pred)[1])
    y_true_length = tf.fill([batch_len], tf.shape(y_true)[1])
    loss_value = tf.nn.ctc_loss(tf.cast(y_true, tf.int32), y_pred,
                                y_true_length, y_pred_length,
                                logits_time_major=False,
                                blank_index=-1)  # last class is the blank
    return tf.reduce_mean(loss_value)

model.compile(optimizer='adam', loss=my_loss_fn)

Note that this code is untested. You need the y_pred and y_true lengths, but you can obtain them the same way as in the CTC layer.
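Wiring that into the model from the question gives a complete single-input version, repeated here for completeness as another untested sketch. One assumption worth flagging: the final Dense layer below outputs raw scores, because tf.nn.ctc_loss expects logits rather than softmax probabilities, and x_train / y_train are again made-up dummy arrays.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.models import Model

def my_loss_fn(y_true, y_pred):
    batch_len = tf.shape(y_true)[0]
    # Full padded lengths per sample, derived as in the CTC layer
    y_pred_length = tf.fill([batch_len], tf.shape(y_pred)[1])
    y_true_length = tf.fill([batch_len], tf.shape(y_true)[1])
    loss_value = tf.nn.ctc_loss(tf.cast(y_true, tf.int32), y_pred,
                                y_true_length, y_pred_length,
                                logits_time_major=False,
                                blank_index=-1)  # last class is the blank
    return tf.reduce_mean(loss_value)

input_layer = layers.Input(shape=(50, 20))
layer = layers.Dense(123, activation="relu")(input_layer)
layer = layers.LSTM(128, return_sequences=True)(layer)
outputs = layers.Dense(20)(layer)  # logits, no softmax
model = Model(input_layer, outputs)
model.compile(optimizer="adam", loss=my_loss_fn)

# Made-up dummy data: labels use classes 0..18; class 19 is the CTC blank
x_train = np.random.random((8, 50, 20)).astype("float32")
y_train = np.random.randint(0, 19, size=(8, 10)).astype("float32")
model.fit(x_train, y_train, batch_size=4, epochs=1, verbose=0)
```

With a single input, fitting is the ordinary model.fit(x_train, y_train) call, which sidesteps the "labels input" problem from the question entirely.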

answered 2021-09-28 at 12:44