
Learning rate and learning rate scheduler #531

Open
dtdo90 opened this issue Jan 4, 2025 · 1 comment

dtdo90 commented Jan 4, 2025

Dear author,

Thank you for the amazing work!!!

I have a question regarding the choice of learning rate and learning-rate scheduler. In the optimizer.yaml file, the learning rate is set to lr=0.0001 and the lr_scheduler is configured as

type: MultiStepLR
milestones: [1000]
gamma: 0.1

Is there a reason for setting such a small learning rate? Furthermore, the scheduler reduces the learning rate at step 1000 (by a factor of 0.1), which seems quite early in the overall training process?
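For reference, the YAML above should roughly correspond to the following PyTorch construction (AdamW and the placeholder model are my own assumptions for illustration, not necessarily what the repository actually instantiates):

```python
import torch
from torch.optim import AdamW
from torch.optim.lr_scheduler import MultiStepLR

model = torch.nn.Linear(256, 80)   # placeholder model, not the real one

# lr=0.0001 as in optimizer.yaml; AdamW is an assumption on my part
optimizer = AdamW(model.parameters(), lr=0.0001)

# MultiStepLR multiplies the learning rate by gamma each time the step
# counter passes a milestone, i.e. 0.0001 -> 0.00001 after 1000 scheduler steps
lr_scheduler = MultiStepLR(optimizer, milestones=[1000], gamma=0.1)
```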

lyuwenyu (Owner) commented

Yes. It means we keep a fixed learning rate throughout the training process.
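In other words (my reading of the answer): if the scheduler is stepped fewer than 1000 times over the whole run, the milestone is never reached, the decay never triggers, and the learning rate effectively stays at 0.0001 from start to finish. A minimal sketch, with the epoch count chosen purely for illustration:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR

param = torch.zeros(1, requires_grad=True)
optimizer = SGD([param], lr=0.0001)
scheduler = MultiStepLR(optimizer, milestones=[1000], gamma=0.1)

for epoch in range(72):   # 72 is an illustrative epoch count, not the repo's setting
    # ... one epoch of training would happen here ...
    scheduler.step()

# The milestone at 1000 was never reached, so the lr is unchanged
print(optimizer.param_groups[0]["lr"])   # 0.0001
```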
