fix(solver): fix gpu fused adamw condition (#196)
SolenoidWGT authored Apr 11, 2024
1 parent 7a1dcfd commit 616f4db
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion — internlm/model/ops/fusion_ops_import_helper.py

@@ -138,7 +138,8 @@ def try_import_FusedAdamW():
     backend = internlm_accelerator.get_accelerator_backend()
     try:
         if backend is AcceleratorType.GPU:
-            adam_extra_kwargs["fused"] = True
+            if torch.__version__ >= "2.1.0":
+                adam_extra_kwargs["fused"] = True

             if gpc.is_rank_for_log():
                 logger.warning(
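The added guard only sets `fused=True` when the installed PyTorch is new enough to support fused AdamW on GPU. Note that the diff's plain string comparison `torch.__version__ >= "2.1.0"` happens to work for current release tags but would misorder versions like `"2.10.0"`. A minimal sketch of a numerically correct version gate, written as a standalone helper (the function name and the idea of passing the version string in as a parameter are illustrative assumptions, not part of the commit):

```python
# Sketch of the version-gated fused-AdamW condition from this commit.
# torch is not imported here; the version string is passed in so the
# logic can be exercised without a GPU or a torch install.

def supports_fused_adamw(torch_version: str) -> bool:
    """Return True when the torch version is at least 2.1.0.

    Compares numeric components as integers, so "2.10.0" correctly
    sorts after "2.1.0" (a plain string comparison would not).
    """
    numeric = torch_version.split("+")[0]  # drop local tags like "+cu121"
    parts = []
    for piece in numeric.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break  # stop at suffixes like "0rc1"
        parts.append(int(digits or "0"))
    return tuple(parts) >= (2, 1, 0)
```

In the patched code the equivalent check wraps the assignment, so `adam_extra_kwargs["fused"]` is simply left unset on older PyTorch builds and the optimizer falls back to the non-fused path.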
