Dear author,

Thank you for the amazing work!

I have a question regarding the choice of learning rate and learning-rate scheduler. In the optimizer.yaml file, the learning rate is set to lr=0.0001,
and the lr_scheduler is configured as:

```yaml
type: MultiStepLR
milestones: [1000]
gamma: 0.1
```
Is there a reason for setting such a small learning rate? Furthermore, the scheduler reduces the learning rate by a factor of 0.1 at step 1000, which seems quite early in the overall training process. Is that intentional?
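For reference, here is a minimal plain-Python sketch of what I understand this config to mean, assuming PyTorch's MultiStepLR semantics (the milestone is counted in scheduler steps, and the decay is applied once per milestone passed); the function name is just a placeholder:

```python
def multistep_lr(step, base_lr=1e-4, milestones=(1000,), gamma=0.1):
    """Sketch of MultiStepLR: multiply base_lr by gamma once for each
    milestone that has been reached by the given scheduler step."""
    decays = sum(1 for m in milestones if step >= m)
    return base_lr * (gamma ** decays)

# With the config above: lr stays at 1e-4 for steps 0..999,
# then drops to 1e-5 from step 1000 onward.
print(multistep_lr(999))   # 0.0001
print(multistep_lr(1000))  # 1e-05
```

One nuance worth confirming: if `scheduler.step()` is called once per epoch rather than once per iteration, `milestones: [1000]` would mean epoch 1000, which a typical run might never reach, so the decay would effectively never fire.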