
I would like to be able to put several layers together, but before specifying the input, like this:

# conv is just a layer, no application
conv = Conv2D(64, (3,3), activation='relu', padding='same', name='conv')
# this doesn't work:
bn = BatchNormalization()(conv)

Note that I would rather not specify the input or its shape if it can be avoided; I want to use this later as a shared block applied to multiple inputs.

Is there a way to do this? The above gives the following error:

>>> conv = Conv2D(64, (3,3), activation='relu', padding='same', name='conv')
>>> bn = BatchNormalization()(conv)
Traceback (most recent call last):
  File "/home/mitchus/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/topology.py", line 419, in assert_input_compatibility
    K.is_keras_tensor(x)
  File "/home/mitchus/anaconda3/envs/tf/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 393, in is_keras_tensor
    raise ValueError('Unexpectedly found an instance of type `' + str(type(x)) + '`. '
ValueError: Unexpectedly found an instance of type `<class 'keras.layers.convolutional.Conv2D'>`. Expected a symbolic tensor instance.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/mitchus/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/topology.py", line 552, in __call__
    self.assert_input_compatibility(inputs)
  File "/home/mitchus/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/topology.py", line 425, in assert_input_compatibility
    str(inputs) + '. All inputs to the layer '
ValueError: Layer batch_normalization_4 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.layers.convolutional.Conv2D'>. Full input: [<keras.layers.convolutional.Conv2D object at 0x7f3f6e54b748>]. All inputs to the layer should be tensors.

Grabbing the output of the conv layer does not solve the problem either:

>>> bn = BatchNormalization()(conv.output)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/mitchus/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/topology.py", line 941, in output
    ' has no inbound nodes.')
AttributeError: Layer conv has no inbound nodes.

3 Answers


Try this:

def create_shared_layers():
    # The layers are instantiated once here, so every call to the returned
    # function applies the same layer objects, i.e. the weights are shared.
    layers = [
        Conv2D(64, (3,3), activation='relu', padding='same', name='conv'),
        BatchNormalization()
    ]
    def shared_layers(x):
        for layer in layers:
            x = layer(x)
        return x
    return shared_layers

Later, you can do something like this:

shared_layers = create_shared_layers()
...
h1 = shared_layers(x1)
h2 = shared_layers(x2)
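
For completeness, here is a minimal sketch (not part of the original answer; the input shapes are made up, and it assumes the Conv2D/BatchNormalization imports plus the create_shared_layers definition above) of how this could be wired into a functional model:

from keras.layers import Input
from keras.models import Model

x1 = Input(shape=(32, 32, 3))  # hypothetical input shape, for illustration only
x2 = Input(shape=(32, 32, 3))

shared = create_shared_layers()
h1 = shared(x1)  # both calls reuse the same Conv2D/BatchNormalization instances,
h2 = shared(x2)  # so the two branches share weights

model = Model(inputs=[x1, x2], outputs=[h1, h2])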
answered 2017-08-01T16:44:13.710

What about using a Lambda layer?

import functools
from typing import List

from tensorflow import keras


def compose_layers(layers: List[keras.layers.Layer], **kwargs) -> keras.layers.Layer:
  # Wrap the whole layer list in a single Lambda that folds the input
  # tensor through each layer in order.
  return keras.layers.Lambda(
    lambda x: functools.reduce(lambda tensor, layer: layer(tensor), layers, x),
    **kwargs,
  )

Then you can just call the compose_layers function to get the composition.

from tensorflow.keras.layers import Conv2D, BatchNormalization

layers = [
  Conv2D(64, (3,3), activation='relu', padding='same', name='conv'),
  BatchNormalization()
]

composed_layers = compose_layers(layers, name='composed_layers')
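
As a rough usage sketch (the input shape here is made up for illustration), the composed layer can then be applied to an input tensor like any other layer:

inputs = keras.Input(shape=(32, 32, 3))  # hypothetical shape, for illustration only
outputs = composed_layers(inputs)
model = keras.Model(inputs, outputs)

One thing worth checking in practice: Lambda layers generally do not track the weights of layers called inside them, so the Conv2D/BatchNormalization weights may not show up in model.trainable_weights.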
answered 2019-08-09T16:15:31.743

You can also use tk.Sequential and treat it as a layer

import tensorflow.keras as tk
import tensorflow as tf

_layer1 = tk.layers.Conv2D(
    64, (3,3), activation='relu', 
    padding='same', name='conv'
)
_layer2 = tk.layers.BatchNormalization()

_composed_layer = tk.Sequential(
    [_layer1, _layer2]
)

_some_input = tf.random.normal((100,20,33,2))
_out = _composed_layer(_some_input)
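
Since the original question was about sharing the block across several inputs, here is a small sketch (input shapes assumed to match the random tensor above) of reusing the same Sequential on two inputs:

_in1 = tk.Input(shape=(20, 33, 2))  # hypothetical shapes, for illustration only
_in2 = tk.Input(shape=(20, 33, 2))

_out1 = _composed_layer(_in1)  # the same Sequential (and its weights) is applied to both inputs
_out2 = _composed_layer(_in2)

_model = tk.Model(inputs=[_in1, _in2], outputs=[_out1, _out2])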
answered 2021-08-10T15:08:50.670