Hello, I changed `l` (the lambda weight) so that it corresponds to the training step, letting the angular-softmax term contribute only a very small part at the beginning and roughly a 0.1 ratio at the end (see the sketch below). However, something strange happens: the loss decreases normally at first, then starts increasing at some training step as the A-Softmax weight grows, and the accuracy drops as well. Has anyone run into this problem? Does anyone have a trick for the ratio decay? PS: I have already fixed the code to avoid the gradient-explosion problem.
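For reference, here is a minimal sketch of the kind of lambda-annealing schedule I mean, assuming the usual A-Softmax formulation where the target logit is `(lambda * cos(theta) + phi(theta)) / (1 + lambda)`, so the angular term's share of the logit is `1 / (1 + lambda)`. The function name and the specific constants (`lambda_max`, `lambda_min`, `decay`) are only illustrative, not taken from this repo:

```python
def asoftmax_lambda(step, lambda_max=1500.0, lambda_min=9.0, decay=1e-3, power=1.0):
    """Illustrative annealing schedule for the A-Softmax lambda.

    With the combined logit (lambda * cos_theta + phi_theta) / (1 + lambda),
    the angular part phi_theta contributes 1 / (1 + lambda) of the logit:
    almost nothing while lambda is large, and about 0.1 once lambda has
    decayed to lambda_min = 9 near the end of training.
    """
    lam = lambda_max / (1.0 + decay * step) ** power
    return max(lam, lambda_min)
```

In my run the angular share ramps up along this kind of curve, and the loss starts climbing once that share becomes non-negligible.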
Hello, I am a bit confused reading the part of this code that calculates the A-Softmax loss.