Hello~
Thank you very much for sharing the code!
I tried to use my own dataset (with the same shape as MNIST) with this code. After some iterations, the training loss became NaN. After carefully checking the code, I found that the following line may trigger the NaN loss:
In `/python/algorithms/Classification Algorithms/MNIST Digit cassification using CNN.py`, line 245:
If `y` (the output of the softmax) contains an exact 0, `tf.log(y)` evaluates to `-inf`, because log(0) is undefined, and this can make the loss become NaN.
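For illustration, here is a minimal NumPy sketch of the same arithmetic (the array values are made up; this is not the script's actual code):

```python
import numpy as np

y  = np.array([0.0, 1.0])  # softmax output that has saturated to an exact 0
y_ = np.array([0.0, 1.0])  # one-hot label

# log(0) is -inf, and 0 * -inf is NaN, so the summed loss becomes NaN
loss = -np.sum(y_ * np.log(y))
print(loss)  # nan (NumPy also warns about the divide-by-zero in log)
```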
It could be fixed by making the following changes:
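A sketch of the first option, clipping `y` away from zero before the log (the names `y` and `y_` come from the description above; the surrounding reduction calls are my assumption, since the original line isn't quoted here):

```python
# Clip the softmax output into [1e-10, 1.0] so tf.log never sees an exact zero
cross_entropy = tf.reduce_mean(
    -tf.reduce_sum(y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0)), axis=1))
```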
or
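by switching to TensorFlow's built-in op, which fuses the softmax and the log in a numerically stable way (a sketch; `logits` is a placeholder name for whatever pre-softmax tensor the script feeds into the softmax):

```python
# softmax_cross_entropy_with_logits applies softmax and log together,
# so no manual clipping is needed
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
```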
The same problem also occurs at line 645, where the same fix applies.
Hope to hear from you ~
Thanks in advance! : )