When a layer changes the shape of the input variable, it has to notify the subsequent layers about the change. For this case it needs a custom output_shape property. For example:
import numpy as np
import theano.tensor as T

from neupy import layers
from neupy.utils import as_tuple


class Flatten(layers.BaseLayer):
    """
    Slight modification of the Reshape layer from the neupy library:
    https://github.com/itdxer/neupy/blob/master/neupy/layers/reshape.py
    """
    @property
    def output_shape(self):
        # The number of output features depends on the input shape.
        # When the layer receives input with shape (10, 3, 4)
        # the output will be (10, 12). The first number, 10, defines
        # the number of samples, which you typically don't need to
        # change during propagation.
        n_output_features = np.prod(self.input_shape)
        return (n_output_features,)

    def output(self, input_value):
        n_samples = input_value.shape[0]
        return T.reshape(input_value, as_tuple(n_samples, self.output_shape))
If you run it in the terminal, you will see that it works:
>>> network = layers.Input((3, 4)) > Flatten()
>>> predict = network.compile()
>>> predict(np.random.random((10, 3, 4))).shape
(10, 12)
In your example I can see a few problems:

- The rbf function doesn't return a theano expression. It should fail during the function compilation.
- Functions like np.linalg.norm return a scalar if you don't specify the axis along which to calculate the norm (see the quick check after this list).
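A minimal illustration of the second point; the array x below is just a made-up example, not part of the original code:

import numpy as np

x = np.random.random((10, 3))
print(np.linalg.norm(x))          # scalar: norm over all elements
print(np.linalg.norm(x, axis=1))  # array with 10 values, one per sample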
The following solution should work for you:
import numpy as np
import theano.tensor as T

from neupy import layers, init


def norm(value, axis=None):
    return T.sqrt(T.sum(T.square(value), axis=axis))


class RBF(layers.BaseLayer):
    def initialize(self):
        super(RBF, self).initialize()

        # It's more flexible when the shape of the parameters
        # depends on the input shape
        self.add_parameter(
            name='mean', shape=self.input_shape,
            value=init.Constant(0.), trainable=True)

        self.add_parameter(
            name='std_dev', shape=self.input_shape,
            value=init.Constant(1.), trainable=True)

    def output(self, input_value):
        K = input_value - self.mean
        return T.exp(-norm(K, axis=0) / self.std_dev)
network = layers.Input(1) > RBF()
predict = network.compile()
print(predict(np.random.random((10, 1))))
network = layers.Input(4) > RBF()
predict = network.compile()
print(predict(np.random.random((10, 4))))
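As a quick sanity check (not part of the original answer), the output of the last compiled predict function with the default parameters (mean=0, std_dev=1) can be reproduced directly in NumPy; x below is just an arbitrary batch:

# Assumes the `predict` function compiled above for the Input(4) > RBF() network
x = np.random.random((10, 4))

# Same computation as the RBF layer: exp(-norm(x - mean, axis=0) / std_dev)
manual = np.exp(-np.sqrt(np.sum(np.square(x - 0.), axis=0)) / 1.)

print(np.allclose(predict(x), manual))  # expected: True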