tensorflow softmax_cross_entropy_with_logits

Resolving the softmax_cross_entropy_with_logits issue

WARNING:tensorflow:From /Users/os/woong/tensorflow/DeepLearningZeroToAll/lab-06-2-softmax_zoo_classifier.py:32: softmax_cross_entropy_with_logits (from tensorflow.python.ops.nn_ops) is deprecated and will be removed in a future version.
Instructions for updating:

Future major versions of TensorFlow will allow gradients to flow
into the labels input on backprop by default.

To resolve this warning, change the softmax_cross_entropy_with_logits function to softmax_cross_entropy_with_logits_v2 and wrap the labels argument in tf.stop_gradient.

Original code:

cost_i = tf.nn.softmax_cross_entropy_with_logits(logits=logits,
                                                 labels=Y_one_hot)

Fixed code:

cost_i = tf.nn.softmax_cross_entropy_with_logits_v2(logits=logits,
                                                    labels=tf.stop_gradient([Y_one_hot]))
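
For reference, here is a minimal self-contained sketch of the corrected cost computation in TensorFlow 1.x graph style. The placeholder shapes, variable names, and the 7-class zoo setup are assumptions modeled on lab-06-2; the list brackets around Y_one_hot in the snippet above appear to be unnecessary, so the sketch passes the tensor directly.

import tensorflow as tf  # assumes TensorFlow 1.x (or tf.compat.v1 behavior)

nb_classes = 7  # assumption: 7 animal classes as in the zoo dataset

X = tf.placeholder(tf.float32, [None, 16])   # 16 features per sample
Y = tf.placeholder(tf.int32, [None, 1])      # integer class labels

# One-hot encode the labels: (N, 1) -> (N, 1, 7) -> (N, 7)
Y_one_hot = tf.one_hot(Y, nb_classes)
Y_one_hot = tf.reshape(Y_one_hot, [-1, nb_classes])

W = tf.Variable(tf.random_normal([16, nb_classes]), name='weight')
b = tf.Variable(tf.random_normal([nb_classes]), name='bias')
logits = tf.matmul(X, W) + b

# _v2 lets gradients flow into the labels by default, so tf.stop_gradient
# keeps the labels constant during backprop, matching the old behavior.
cost_i = tf.nn.softmax_cross_entropy_with_logits_v2(
    logits=logits, labels=tf.stop_gradient(Y_one_hot))
cost = tf.reduce_mean(cost_i)

With this change the warning disappears, and training behaves as before, since the one-hot labels contain no trainable parameters and should not receive gradients anyway.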
