I'm trying to use tf.losses.sigmoid_cross_entropy on an unbalanced dataset, but I'm a little confused about the weights parameter. Here is what the documentation says about it:
weights: Optional Tensor whose rank is either 0, or the same rank as labels, and must be broadcastable to labels (i.e., all dimensions must be either 1, or the same as the corresponding losses dimension).
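To make the shapes concrete, here is a minimal sketch of the kind of call I have in mind (the batch size, class count, and per-sample weights are just placeholders, not my real data):

```python
import tensorflow as tf

batch_size, num_classes = 4, 3  # placeholder sizes

# Multi-label targets and logits are both rank 2: [batch_size, num_classes].
labels = tf.constant([[1., 0., 1.],
                      [0., 1., 0.],
                      [1., 1., 0.],
                      [0., 0., 1.]])
logits = tf.random_normal([batch_size, num_classes])

# One weight per sample. As I read the docstring, a rank-1 tensor of shape
# [batch_size] does not have the same rank as labels, so I reshape it to
# [batch_size, 1], which is rank 2 and broadcasts across the class axis.
sample_weights = tf.constant([1., 2., 1., 5.])
loss = tf.losses.sigmoid_cross_entropy(
    multi_class_labels=labels,
    logits=logits,
    weights=tf.reshape(sample_weights, [batch_size, 1]))
```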
I know that in tf.losses.softmax_cross_entropy the weights parameter can be a rank-1 tensor with one weight per sample. Why must the weights in tf.losses.sigmoid_cross_entropy have the same rank as labels?
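For comparison, this is roughly what I mean by per-sample weights in the softmax case (again just a sketch with made-up values):

```python
import tensorflow as tf

batch_size, num_classes = 4, 3  # same placeholder sizes as above

logits = tf.random_normal([batch_size, num_classes])
onehot_labels = tf.one_hot([0, 2, 1, 0], depth=num_classes)

# Here a rank-1 weights tensor of shape [batch_size] is accepted directly,
# acting as one weight per sample.
per_sample_weights = tf.constant([1., 2., 1., 5.])
loss = tf.losses.softmax_cross_entropy(
    onehot_labels=onehot_labels,
    logits=logits,
    weights=per_sample_weights)
```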
Can anybody explain this, ideally with an example?