I want to convert a tensor of the form (let's call it logits):
int32 - [batch_size]
into a tensor of the form (let's call it labels):
[batch_size, 10]
For example, for batch_size=3:
logits=[1,6,9]
labels=[[0,1,0,0,0,0,0,0,0,0],
[0,0,0,0,0,0,1,0,0,0],
[0,0,0,0,0,0,0,0,0,1]]
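To make the conversion concrete, this is a minimal sketch of what I am after, assuming tf.one_hot is available in the TensorFlow version being used (older 0.x releases may need a different workaround such as tf.sparse_to_dense):

import tensorflow as tf

# Sketch only: tf.one_hot is assumed to exist in this TF version.
logits = tf.constant([1, 6, 9], dtype=tf.int32)  # shape [batch_size]
labels = tf.one_hot(logits, depth=10)            # shape [batch_size, 10], float32

with tf.Session() as sess:
    print(sess.run(labels))
    # [[0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
    #  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
    #  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]]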
This question comes up because I want to change the cost function to a quadratic one in the TensorFlow MNIST example ( https://github.com/tensorflow/tensorflow/tree/r0.9/tensorflow/examples/tutorials/mnist ). I am using fully_connected_feed.py and mnist.py. In mnist.py I want to change:
cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(logits, labels, name='xentropy')
loss = tf.reduce_mean(cross_entropy, name='xentropy_mean')
to
loss = tf.reduce_sum(tf.squared_difference(logits, labels))
But the problem is that the shapes and dtypes do not match:
Logits tensor, float - [batch_size, 10];
Labels tensor, int64 - [batch_size].
So I need to "vectorize" the labels!? Does anyone know how to do this?
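For context, this is roughly what I imagine the modified loss() in mnist.py would look like if the labels could be one-hot encoded first. It is only a sketch under the assumption that tf.one_hot accepts the int64 labels tensor directly; the name 'quadratic_loss' is just illustrative and not taken from the tutorial:

def loss(logits, labels):
    # Convert int labels [batch_size] to one-hot floats [batch_size, 10],
    # so they match the shape and dtype of logits.
    onehot_labels = tf.one_hot(labels, depth=10, dtype=tf.float32)
    # Quadratic (sum of squared differences) cost instead of cross-entropy.
    return tf.reduce_sum(tf.squared_difference(logits, onehot_labels),
                         name='quadratic_loss')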