
I am using Keras with TensorFlow as the backend and get an incompatibility error:

model = Sequential()
model.add(LSTM(64, input_dim = 1))
model.add(Dropout(0.2))
model.add(LSTM(16))

The following error is shown:

Traceback (most recent call last):
  File "train_lstm_model.py", line 36, in <module>
    model.add(LSTM(16))
  File "/home/***/anaconda2/lib/python2.7/site-packages/keras/models.py", line 332, in add
    output_tensor = layer(self.outputs[0])
  File "/home/***/anaconda2/lib/python2.7/site-packages/keras/engine/topology.py", line 529, in __call__
    self.assert_input_compatibility(x)
  File "/home/***/anaconda2/lib/python2.7/site-packages/keras/engine/topology.py", line 469, in assert_input_compatibility
    str(K.ndim(x)))
ValueError: Input 0 is incompatible with layer lstm_2: expected ndim=3, found ndim=2

How can I solve this problem?

Keras version: 1.2.2, TensorFlow version: 0.12


1 Answer


An LSTM layer expects input of shape (len_of_sequences, nb_of_features). The input shape you provided is only 1-dimensional, which is the source of the error. The exact wording of the error message comes from the fact that the actual shape of the data includes batch_size: the data fed to the layer really has shape (batch_size, len_of_sequences, nb_of_features), while yours has shape (batch_size, 1). That is the reason behind the "expected ndim=3, found ndim=2" in the message.
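As a concrete illustration of the shape mismatch (a minimal NumPy sketch; the array names and sizes here are hypothetical), 2-D data with one feature per timestep needs an explicit feature axis before it matches what the layer expects:

```python
import numpy as np

# Hypothetical toy dimensions: 32 sequences, 10 timesteps, 1 feature.
batch_size, len_of_sequences, nb_of_features = 32, 10, 1

# A flat 2-D array of shape (batch_size, len_of_sequences) ...
flat = np.random.rand(batch_size, len_of_sequences)
print(flat.ndim)   # 2 -- this is the "found ndim=2" from the error

# ... must gain an explicit feature axis to become 3-D:
x = flat.reshape(batch_size, len_of_sequences, nb_of_features)
print(x.shape)     # (32, 10, 1) -- the "expected ndim=3" layout
```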

Moreover, you will likely hit a similar problem with the second layer: by default an LSTM returns only its last output (2-D), but the next LSTM needs a full sequence (3-D). To make your first LSTM layer return a sequence, change its definition to:

model.add(LSTM(64, input_shape = (len_of_seq, nb_of_features), return_sequences=True))

Or:

model.add(LSTM(64, input_dim = nb_of_features, input_length = len_of_sequence, return_sequences=True))
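Putting both fixes together, a corrected version of the model from the question might look like the sketch below (the values of `len_of_seq` and `nb_of_features` are placeholders for your data's actual dimensions):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dropout

len_of_seq, nb_of_features = 10, 1  # placeholders -- use your data's dimensions

model = Sequential()
# return_sequences=True makes this layer emit a 3-D tensor of shape
# (batch, len_of_seq, 64), which the next LSTM layer can consume.
model.add(LSTM(64, input_shape=(len_of_seq, nb_of_features), return_sequences=True))
model.add(Dropout(0.2))
# The final LSTM keeps the default return_sequences=False and
# outputs a 2-D tensor of shape (batch, 16).
model.add(LSTM(16))
```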
Answered 2017-02-20T10:19:41.920