Here is my code:
import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([0,0,1])
    x = tf.constant([0,1,0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())
It produces the following error:
ValueError                                Traceback (most recent call last)
<ipython-input-10-28a8854a9457> in <module>()
      4 y = tf.constant([0,0,1])
      5 x = tf.constant([0,1,0])
----> 6 r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
      7 sess.run()
      8 print(r.eval())

~\AppData\Local\conda\conda\envs\tensorflow\lib\site-packages\tensorflow\python\ops\nn_ops.py in sparse_softmax_cross_entropy_with_logits(_sentinel, labels, logits, name)
   1687     raise ValueError("Rank mismatch: Rank of labels (received %s) should "
   1688                      "equal rank of logits minus 1 (received %s)." %
-> 1689                      (labels_static_shape.ndims, logits.get_shape().ndims))
   1690   # Check if no reshapes are required.
   1691   if logits.get_shape().ndims == 2:

ValueError: Rank mismatch: Rank of labels (received 1) should equal rank of logits minus 1 (received 1).
Can someone help me understand this error? Computing the softmax by hand and then the cross entropy from it is fairly straightforward.
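To be concrete, this is roughly the manual calculation I have in mind (a plain NumPy sketch of softmax followed by cross entropy, just to show what I mean, not the TensorFlow call I am asking about):

import numpy as np

# Logits for a single example and its one-hot label (same toy values as above).
logits = np.array([0.0, 1.0, 0.0])
label_onehot = np.array([0.0, 0.0, 1.0])

# Softmax: exponentiate and normalize so the probabilities sum to 1.
probs = np.exp(logits) / np.sum(np.exp(logits))

# Cross entropy: negative log-probability assigned to the true class.
cross_entropy = -np.sum(label_onehot * np.log(probs))
print(cross_entropy)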
Also, how would I use this function when I need to feed batches into it (2-dimensional arrays)?
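My unverified guess is that the batched call would look something like the sketch below, with float logits of shape [batch_size, num_classes] and integer class labels of shape [batch_size], but I am not sure this is right:

import tensorflow as tf

# Guess at the batched usage: one row of logits per example
# and one integer class index per example.
logits = tf.constant([[0.0, 1.0, 0.0],
                      [1.0, 0.0, 0.0]])   # shape [2, 3], float
labels = tf.constant([1, 0])              # shape [2], int

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(loss))   # one loss value per example in the batch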
Update
I have also tried:
import tensorflow as tf

with tf.Session() as sess:
    y = tf.constant([1])
    x = tf.constant([0,1,0])
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)
    sess.run()
    print(r.eval())
It produced the same error.