The difference between softmax_cross_entropy_with_logits and sparse_softmax_cross_entropy_with_logits in TensorFlow
http://stackoverflow.com/questions/37312421/tensorflow-whats-the-difference-between-sparse-softmax-cross-entropy-with-logi
Having two different functions is a convenience, as they produce the same result.
The difference is simple:
- For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in the range [0, num_classes - 1].
- For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.
The labels used in softmax_cross_entropy_with_logits are the one-hot version of the labels used in sparse_softmax_cross_entropy_with_logits.
Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label so that the loss is 0 for that label.
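To make the shape and dtype contrast concrete, here is a minimal sketch (assuming TensorFlow 2.x with eager execution; the toy logits and labels below are made up for illustration) showing that the two functions agree once the sparse labels are one-hot encoded:

    import numpy as np
    import tensorflow as tf

    # Toy batch: 3 examples, 4 classes (values are illustrative only).
    logits = tf.constant([[2.0, 1.0, 0.1, 0.5],
                          [0.3, 2.5, 0.2, 0.1],
                          [1.2, 0.7, 3.1, 0.4]])

    # Sparse labels: shape [batch_size], int class indices in [0, num_classes - 1].
    sparse_labels = tf.constant([0, 1, 2])

    # Dense labels: shape [batch_size, num_classes], the one-hot version.
    onehot_labels = tf.one_hot(sparse_labels, depth=4)

    sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=sparse_labels, logits=logits)
    dense_loss = tf.nn.softmax_cross_entropy_with_logits(
        labels=onehot_labels, logits=logits)

    # Both return one cross-entropy value per example, and the values match.
    print(sparse_loss.numpy())
    print(dense_loss.numpy())
    assert np.allclose(sparse_loss.numpy(), dense_loss.numpy())

In practice, the sparse variant saves you the tf.one_hot conversion and some memory when each example has exactly one correct class, while the dense variant additionally accepts soft label distributions that are not one-hot.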
    