
I am trying to write code for this architecture (a question-answering model from the paper https://www.hindawi.com/journals/cin/2019/9543490/) and need help on how to obtain the hidden state matrices Hq and Ha from the stacked BiLSTM layers. Can someone please advise?

[architecture diagrams from the paper]

from tensorflow.keras import layers, regularizers
from tensorflow.keras.layers import Input, Bidirectional, LSTM, Flatten

# Creating Embedding Layer for Query
# A fixed length of 40 is used for both question and answer, as per the research paper
embedding_layer1 = layers.Embedding(vocab_size_query, 300, weights=[embedding_matrix_query], input_length=40, trainable=False)
input_text1 = Input(shape=(40,), name="input_text_query")
x = embedding_layer1(input_text1)

# Creating Bidirectional layers for Query
# Each word in the question and passage should be made aware of the words occurring around it, so we use bidirectional recurrent neural networks (LSTMs) here.
x = Bidirectional(LSTM(128, recurrent_dropout=0.5, kernel_regularizer=regularizers.l2(0.001), return_sequences=True))(x)
x = Bidirectional(LSTM(128, recurrent_dropout=0.5, kernel_regularizer=regularizers.l2(0.001), return_sequences=True))(x)
flatten_1 = Flatten()(x)

# Creating Embedding Layer for Passage
# Note: the input layer needs a name distinct from the query input, otherwise Keras raises a duplicate-name error
embedding_layer2 = layers.Embedding(vocab_size_answer, 300, weights=[embedding_matrix_answer], input_length=40, trainable=False)
input_text2 = Input(shape=(40,), name="input_text_passage")
x2 = embedding_layer2(input_text2)

# Creating Bidirectional layers for Passage
x2 = Bidirectional(LSTM(128, recurrent_dropout=0.5, kernel_regularizer=regularizers.l2(0.001), return_sequences=True))(x2)
x2 = Bidirectional(LSTM(128, recurrent_dropout=0.5, kernel_regularizer=regularizers.l2(0.001), return_sequences=True))(x2)
flatten_2 = Flatten()(x2)
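
For reference, the shapes coming out of the stack can be checked directly; a minimal sketch against the code above (output shapes assume TF 2.x Keras):

# With return_sequences=True, each Bidirectional(LSTM(128)) emits one
# 256-dimensional vector per time step (128 forward + 128 backward),
# so x and x2 hold the per-word hidden states before flattening.
print(x.shape)          # (None, 40, 256)
print(x2.shape)         # (None, 40, 256)
print(flatten_1.shape)  # (None, 10240)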


1 Answer

Based on the model structure and your source code, you can obtain Hq and Ha by extracting the outputs of the flatten_1 and flatten_2 layers. To extract the output of an intermediate layer, create a new model whose inputs are the original inputs and whose outputs are the desired layers.

from tensorflow.keras.models import Model

model = ...  # create the original model

# Build a sub-model that maps the original inputs to the chosen layer's output
layer_name = 'my_layer'
intermediate_layer_model = Model(inputs=model.input,
                                 outputs=model.get_layer(layer_name).output)
intermediate_output = intermediate_layer_model.predict(data)
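
Applied to your two-input model, a sketch that reuses the tensors from your code directly (q_data and a_data are hypothetical names for tokenized query/answer arrays of shape (n, 40)):

# Sketch: map both original inputs to both Flatten outputs in one sub-model.
extractor = Model(inputs=[input_text1, input_text2],
                  outputs=[flatten_1, flatten_2])
Hq, Ha = extractor.predict([q_data, a_data])  # each has shape (n, 40*256)

If you need the unflattened (n, 40, 256) hidden state matrices instead, point the outputs at the BiLSTM tensors x and x2 rather than at the Flatten layers.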
answered 2020-02-06