New answer, treating layer2 as (50,49)
Here you need a scalar multiplication for each row in layer2. We will treat the "50" as part of the batch and multiply a shape (1,1) by a shape (49,1). To keep the 50 independent inside batch_dot, we reshape things inside the lambda function using the -1 wildcard:
out = Lambda(myMultiplication, output_shape=(50,49))([layer1,layer2])
where
import keras.backend as K

def myMultiplication(x):
    # Inside lambda functions there is an additional axis, the batch axis.
    # Normally we use -1 for this dimension; here we take advantage of it
    # and simply hide the unwanted 50 inside that -1.
    L1 = K.reshape(x[0], (-1, 1, 1))
    L2 = K.reshape(x[1], (-1, 49, 1))

    result = K.batch_dot(L1, L2, axes=[1, 2])

    # Here we bring the 50 out again, keeping the batch dimension
    # as it was originally.
    return K.reshape(result, (-1, 50, 49))
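A minimal sketch of how this could be wired up, assuming layer1 has shape (batch, 50) (one scalar per row) and layer2 has shape (batch, 50, 49); the input names and sizes here are hypothetical:

from keras.layers import Input, Lambda
from keras.models import Model
import numpy as np

in1 = Input(shape=(50,))        # hypothetical stand-in for layer1
in2 = Input(shape=(50, 49))     # hypothetical stand-in for layer2

out = Lambda(myMultiplication, output_shape=(50, 49))([in1, in2])
model = Model([in1, in2], out)

# Each of the 50 rows of the second input is scaled by its matching scalar.
a = np.random.rand(2, 50)
b = np.random.rand(2, 50, 49)
print(model.predict([a, b]).shape)   # (2, 50, 49)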
Old answer, from when I thought layer2 was (49,) instead of (50,49)
You need a Lambda layer with batch_dot.
Batch dot is an actual matrix multiplication, while multiply is element-wise multiplication. For that, you should reshape your vectors into matrices, transposing one of them to achieve the multiplication you want.
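As a quick standalone illustration of that difference (plain NumPy, outside the answer's Keras code):

import numpy as np

v1 = np.random.rand(50)
v2 = np.random.rand(49)

# Element-wise multiply requires matching shapes, so v1 * v2 fails here.
# A matrix multiplication of (50, 1) by (1, 49) instead gives every
# pairwise product, which is the (50, 49) result we want:
outer = v1.reshape(50, 1) @ v2.reshape(1, 49)
print(outer.shape)   # (50, 49)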
So:
layer1 = Reshape((1, 50))(layer1)
layer2 = Reshape((49, 1))(layer2)
out = Lambda(myMultiplication, output_shape=(50, 49))([layer1, layer2])
where
import keras.backend as K

def myMultiplication(x):
    # (batch, 1, 50) batch_dot (batch, 49, 1) -> (batch, 50, 49)
    return K.batch_dot(x[0], x[1], axes=[1, 2])
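For completeness, a sketch of this older pipeline end to end, assuming layer1 is (batch, 50) and layer2 is (batch, 49); the input names are hypothetical:

from keras.layers import Input, Reshape, Lambda
from keras.models import Model
import numpy as np

in1 = Input(shape=(50,))
in2 = Input(shape=(49,))

r1 = Reshape((1, 50))(in1)    # (batch, 1, 50)
r2 = Reshape((49, 1))(in2)    # (batch, 49, 1)
out = Lambda(myMultiplication, output_shape=(50, 49))([r1, r2])

model = Model([in1, in2], out)
x1 = np.random.rand(2, 50)
x2 = np.random.rand(2, 49)
print(model.predict([x1, x2]).shape)   # (2, 50, 49)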