
I know this question has already been asked several times on Stack Overflow and GitHub, but none of the existing answers seem to resolve my case. I am trying to build a variational autoencoder using tensorflow_probability modules. When training the model with model.fit, I get a warning during the first epoch saying that gradients do not exist for the decoder's weight and bias variables when minimizing the loss. The puzzling part is that from the second epoch onward everything looks normal and the model produces output. Any help in resolving this would be much appreciated. For completeness, the decoder and the warning are given below.

import tensorflow as tf
import tensorflow_probability as tfp
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, InputLayer

tfd = tfp.distributions
tfpl = tfp.layers

# Decoder: maps a latent code to a diagonal Gaussian over the input space.
dec_klreg = Sequential(
    [InputLayer(input_shape=[latent_size])] +
    [Dense(unit, activation='relu') for unit in units[::-1]] +
    [Dense(2 * input_shape),  # outputs the mean and log-scale parameters
     tfpl.DistributionLambda(
         lambda t: tfd.MultivariateNormalDiag(
             loc=t[..., :input_shape],
             scale_diag=tf.math.exp(t[..., input_shape:])
         )
     )]
)
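For reference (this usage snippet is my own illustration, assuming latent_size, units and input_shape are defined as in the model above): calling the decoder on a batch of latent codes returns a tfd.MultivariateNormalDiag object rather than a plain tensor, so sample() and log_prob() are available on the output:

import numpy as np

z = np.random.randn(5, latent_size).astype('float32')   # a batch of 5 latent codes
rv_x = dec_klreg(z)                                      # a tfd.MultivariateNormalDiag, not a tensor
print(rv_x.sample().shape)                               # (5, input_shape)
print(rv_x.log_prob(tf.zeros([5, input_shape])).shape)   # (5,)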

Here is the summary:

Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_16 (Dense)             (None, 8)                 24        
_________________________________________________________________
dense_17 (Dense)             (None, 16)                144       
_________________________________________________________________
dense_18 (Dense)             (None, 32)                544       
_________________________________________________________________
dense_19 (Dense)             (None, 100)               3300      
_________________________________________________________________
distribution_lambda_3 (Distr multiple                  0         
=================================================================
Total params: 4,012
Trainable params: 4,012
Non-trainable params: 0

Here is the fit call:

    vae_klreg = tf.keras.Model(inputs=enc_klreg.input, outputs=dec_klreg(enc_klreg.output))
    vae_klreg.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
                      loss=loss_mcmc_klreg)
    history = vae_klreg.fit(x_train, epochs=100, batch_size=1000, validation_split=0.1)

**enc_klreg is the encoder.
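loss_mcmc_klreg is not shown in the question. Purely as a point of reference (a hypothetical sketch, not the actual loss used above), a loss for a DistributionLambda head like this decoder typically evaluates the log-probability of the targets under the distribution that the model returns:

# Hypothetical sketch -- the real loss_mcmc_klreg is not shown in the question.
# The model's prediction here is the distribution returned by the DistributionLambda layer,
# so a common reconstruction term is the negative log-likelihood of the targets.
def nll_loss(x_true, rv_x):
    return -rv_x.log_prob(x_true)

# e.g.: vae_klreg.compile(optimizer='adam', loss=nll_loss)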

Here is the warning:

Epoch 1/100
WARNING:tensorflow:Gradients do not exist for variables ['dense_16/kernel:0', 'dense_16/bias:0', 'dense_17/kernel:0', 'dense_17/bias:0', 'dense_18/kernel:0', 'dense_18/bias:0', 'dense_19/kernel:0', 'dense_19/bias:0'] when minimizing the loss.
WARNING:tensorflow:Gradients do not exist for variables ['dense_16/kernel:0', 'dense_16/bias:0', 'dense_17/kernel:0', 'dense_17/bias:0', 'dense_18/kernel:0', 'dense_18/bias:0', 'dense_19/kernel:0', 'dense_19/bias:0'] when minimizing the loss.
9/9 [==============================] - 1s 39ms/step - loss: 1858.8081 - val_loss: 458.4520
Epoch 2/100
9/9 [==============================] - 0s 6ms/step - loss: 411.9823 - val_loss: 273.5398
Epoch 3/100
9/9 [==============================] - 0s 6ms/step - loss: 281.3841 - val_loss: 208.0804