
Consider the following code, taken from here:

import tensorflow
import transformers
import tensorflow_datasets
from tensorflow.keras import callbacks, losses, metrics, optimizers

data = tensorflow_datasets.load("glue/mrpc")
train_dataset = data["train"]
validation_dataset = data["validation"]

bert_model = transformers.TFBertModel.from_pretrained("bert-base-cased")
bert_tokenizer = transformers.BertTokenizer.from_pretrained("bert-base-cased")

tensorboard_callback = callbacks.TensorBoard(write_images=True, embeddings_freq=1)
bert_model.compile(
    optimizer=optimizers.Adam(learning_rate=0.01),
    loss=losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=[metrics.SparseCategoricalAccuracy("accuracy")],
)
bert_history = bert_model.fit(
    train_dataset,
    epochs=10,
    validation_data=validation_dataset,
    callbacks=[tensorboard_callback],
)

When I run it, it gives the following error:

ValueError: in converted code:

    C:\tools\miniconda3\lib\site-packages\tensorflow_core\python\keras\engine\training_v2.py:677 map_fn
        batch_size=None)
    C:\tools\miniconda3\lib\site-packages\tensorflow_core\python\keras\engine\training.py:2469 _standardize_tensors
        exception_prefix='target')
    C:\tools\miniconda3\lib\site-packages\tensorflow_core\python\keras\engine\training_utils.py:539 standardize_input_data
        str(data)[:200] + '...')

    ValueError: Error when checking model target: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 2 array(s), for inputs ['output_1', 'output_2'] but instead got the following list of 1 arrays: [<tf.Tensor 'args_3:0' shape=() dtype=int64>]...

I don't understand why, since it seems like it should work. Any suggestions?
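For context, the error indicates a mismatch between the model's outputs and the dataset's labels: TFBertModel (the bare encoder) returns two outputs (the sequence output and the pooled output, which Keras names output_1 and output_2), while each GLUE/MRPC example provides a single integer label, and the raw tensorflow_datasets examples are not yet tokenized into model inputs. Below is a minimal sketch of how this kind of fine-tuning is usually wired up, assuming the transformers 2.x TensorFlow API (TFBertForSequenceClassification and glue_convert_examples_to_features); the learning rate, batch sizes, and epoch count are illustrative choices, not taken from the original post.

import tensorflow as tf
import tensorflow_datasets
from transformers import (
    BertTokenizer,
    TFBertForSequenceClassification,
    glue_convert_examples_to_features,
)

# Load the raw GLUE/MRPC splits.
data = tensorflow_datasets.load("glue/mrpc")

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
# A model with a classification head on top, so it produces a single
# logits output matching the single integer label per example.
model = TFBertForSequenceClassification.from_pretrained("bert-base-cased")

# Tokenize the raw examples into (features, label) pairs the model expects.
train_dataset = glue_convert_examples_to_features(
    data["train"], tokenizer, max_length=128, task="mrpc"
)
validation_dataset = glue_convert_examples_to_features(
    data["validation"], tokenizer, max_length=128, task="mrpc"
)
train_dataset = train_dataset.shuffle(100).batch(32)
validation_dataset = validation_dataset.batch(64)

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=[tf.keras.metrics.SparseCategoricalAccuracy("accuracy")],
)
history = model.fit(train_dataset, epochs=2, validation_data=validation_dataset)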

