Commit

Add more scheduler configurations
1pha committed Oct 16, 2023
1 parent 5de4c11 commit d878844
Showing 3 changed files with 19 additions and 1 deletion.
12 changes: 12 additions & 0 deletions config/scheduler/cosine_anneal_warmup_long.yaml
@@ -0,0 +1,12 @@
+scheduler:
+  _target_: sage.trainer.scheduler.CosineAnnealingWarmupRestarts
+  first_cycle_steps: 6000
+  cycle_mult: 2
+  max_lr: 4e-3
+  min_lr: 1e-7
+  warmup_steps: 4000
+  gamma: 0.5
+  last_epoch: -1
+  interval: step
+  frequency: 1
+  strict: False
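
This config appears to be a Hydra node (the `_target_` key names the class to instantiate). Below is a hedged sketch, not code from this repository, of what the config resolves to. It assumes the custom scheduler follows the widely used CosineAnnealingWarmupRestarts signature (optimizer first, then these keywords), and that `interval`, `frequency`, and `strict` are PyTorch Lightning scheduler options consumed by the trainer rather than constructor arguments.

import torch
from sage.trainer.scheduler import CosineAnnealingWarmupRestarts

model = torch.nn.Linear(4, 2)  # placeholder model for illustration
optimizer = torch.optim.AdamW(model.parameters(), lr=4e-3)
scheduler = CosineAnnealingWarmupRestarts(
    optimizer,                # assumed first positional argument
    first_cycle_steps=6000,   # length of the first cosine cycle, in steps
    cycle_mult=2,             # each restart cycle is twice as long as the last
    max_lr=4e-3,              # peak learning rate reached after warmup
    min_lr=1e-7,              # floor of the cosine decay
    warmup_steps=4000,        # linear warmup at the start of each cycle
    gamma=0.5,                # peak lr is halved after every restart
    last_epoch=-1,
)

With `interval: step`, the trainer would call `scheduler.step()` once per optimizer step rather than once per epoch, which matches the step-denominated cycle and warmup lengths above.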
7 changes: 7 additions & 0 deletions config/scheduler/steplr.yaml
@@ -0,0 +1,7 @@
+scheduler:
+  _target_: torch.optim.lr_scheduler.StepLR
+  step_size: 30
+  gamma: 0.3
+  interval: epoch
+  frequency: 1
+  strict: False
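
StepLR is stock PyTorch, so no custom code is involved; here is a minimal, self-contained sketch of the schedule this config describes (`interval: epoch` means the trainer steps it once per epoch).

import torch

model = torch.nn.Linear(4, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.3)

for epoch in range(90):
    # ... one epoch of training (backward pass omitted for brevity) ...
    optimizer.step()
    scheduler.step()  # lr is multiplied by 0.3 after every 30 epochs

Assuming the repository uses Hydra config groups (suggested by the config/scheduler/ layout), a run would presumably select this file with a command-line override like `scheduler=steplr`.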
1 change: 0 additions & 1 deletion sage/trainer/trainer.py
@@ -196,7 +196,6 @@ def training_step(self, batch, batch_idx, optimizer_idx=None):
         if self.log_train_metrics:
             output: dict = self.train_metric(result["pred"], result["target"])
             self.log_result(output=output, unit="step", prog_bar=False)
-
         self.training_step_outputs.append(result)
 
         if self.log_lr:
