tensorflow softmax_cross_entropy_with_logits
Resolving the softmax_cross_entropy_with_logits deprecation warning
WARNING:tensorflow:From /Users/os/woong/tensorflow/DeepLearningZeroToAll/lab-06-2-softmax_zoo_classifier.py:32: softmax_cross_entropy_with_logits (from tensorflow.python.ops.nn_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Future major versions of TensorFlow will allow gradients to flow
into the labels input on backprop by default.
To resolve the warning above, replace the softmax_cross_entropy_with_logits function with softmax_cross_entropy_with_logits_v2 and wrap the labels argument in tf.stop_gradient. The _v2 function allows gradients to flow into the labels input during backprop by default, so tf.stop_gradient is needed to keep the labels constant and match the old behavior.
Original code:

cost_i = tf.nn.softmax_cross_entropy_with_logits(logits=logits,
                                                 labels=Y_one_hot)
Fixed code:

cost_i = tf.nn.softmax_cross_entropy_with_logits_v2(logits=logits,
                                                     labels=tf.stop_gradient([Y_one_hot]))
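For context, here is a minimal runnable sketch of where this line sits in the zoo classifier. The shapes (16 features, 7 classes), the dummy data, and the learning rate are assumptions for illustration, not the exact lab script:

import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

# Assumed zoo-classifier shapes: 16 input features, 7 classes
X = tf.placeholder(tf.float32, [None, 16])
Y = tf.placeholder(tf.int32, [None, 1])            # class index 0..6
Y_one_hot = tf.reshape(tf.one_hot(Y, 7), [-1, 7])  # [N, 1, 7] -> [N, 7]

W = tf.Variable(tf.random_normal([16, 7]), name='weight')
b = tf.Variable(tf.random_normal([7]), name='bias')
logits = tf.matmul(X, W) + b

# _v2 lets gradients flow into labels; tf.stop_gradient keeps the labels
# constant, reproducing the behavior of the deprecated function.
cost_i = tf.nn.softmax_cross_entropy_with_logits_v2(
    logits=logits, labels=tf.stop_gradient([Y_one_hot]))
cost = tf.reduce_mean(cost_i)
train = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x_dummy = np.random.rand(5, 16).astype(np.float32)   # hypothetical data
    y_dummy = np.random.randint(0, 7, size=(5, 1))
    print(sess.run(cost, feed_dict={X: x_dummy, Y: y_dummy}))

Running this builds the graph and prints a scalar loss without emitting the deprecation warning.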