Hi, thank you for your great work!

I was running the code with torch 2.0.0 and accelerate 0.12.0 via `bash_scripts/run.sh`, and I have encountered the following two errors.

First, it seems that the `--f 8` argument makes the option ambiguous, giving an error like this:

```
accelerate <command> [<args>] launch: error: ambiguous option: --f could match --fsdp_offload_params, --fsdp_min_num_params, --fsdp_sharding_strategy, --fsdp_auto_wrap_policy, --fsdp_transformer_layer_cls_to_wrap, --fsdp_backward_prefetch_policy, --fsdp_state_dict_type, --fp16
```
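For context, this looks like standard `argparse` behavior rather than anything specific to the script: by default `argparse` accepts any unambiguous prefix of an option name, and the `accelerate launch` parser defines many options starting with `f` (`--fsdp_*`, `--fp16`), so the prefix `--f` matches several of them. A minimal sketch, where the two flags below stand in for the launcher's real option set:

```python
import argparse

# argparse allows unambiguous prefix abbreviations by default, so "--f"
# only works while exactly one registered option starts with "f".
parser = argparse.ArgumentParser()
parser.add_argument("--fp16", action="store_true")
parser.add_argument("--fsdp_offload_params", action="store_true")

try:
    parser.parse_args(["--f"])
except SystemExit:
    # argparse prints "ambiguous option: --f could match ..." to stderr
    # and exits, which is the error shown above.
    print("ambiguous")

# With abbreviation disabled, an exact "--f" flag parses cleanly.
# (The "--f" flag here is hypothetical, mimicking the script's argument.)
parser2 = argparse.ArgumentParser(allow_abbrev=False)
parser2.add_argument("--f", type=int)
print(parser2.parse_args(["--f", "8"]).f)  # -> 8
```

If `--f 8` is intended for the training script rather than the launcher, placing it after the script path (e.g. `accelerate launch train_flow_latent.py --f 8 ...`) should keep the `accelerate` parser from trying to consume it.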
Second, when adding the option `--use_ema`, there is another error:
```
Traceback (most recent call last):
  File "/srv/username/LFM/train_flow_latent.py", line 342, in <module>
    train(args)
  File "/srv/username/LFM/train_flow_latent.py", line 93, in train
    data_loader, model, optimizer, scheduler = accelerator.prepare(data_loader, model, optimizer, scheduler)
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/accelerate/accelerator.py", line 621, in prepare
    result = tuple(self._prepare_one(obj, first_pass=True) for obj in args)
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/accelerate/accelerator.py", line 621, in <genexpr>
    result = tuple(self._prepare_one(obj, first_pass=True) for obj in args)
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/accelerate/accelerator.py", line 520, in _prepare_one
    optimizer = self.prepare_optimizer(obj)
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/accelerate/accelerator.py", line 854, in prepare_optimizer
    optimizer = AcceleratedOptimizer(optimizer, device_placement=self.device_placement, scaler=self.scaler)
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/accelerate/optimizer.py", line 70, in __init__
    self.optimizer.load_state_dict(state_dict)
  File "/srv/username/LFM/EMA.py", line 66, in load_state_dict
    super(EMA, self).load_state_dict(state_dict)
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/torch/optim/optimizer.py", line 433, in load_state_dict
    self.__setstate__({'state': state, 'param_groups': param_groups})
  File "/srv/username/miniconda3/envs/LFMorig/lib/python3.10/site-packages/torch/optim/optimizer.py", line 214, in __setstate__
    self.defaults.setdefault('differentiable', False)
AttributeError: 'EMA' object has no attribute 'defaults'
```
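The `AttributeError` suggests the `EMA` wrapper never runs `torch.optim.Optimizer.__init__`, so `self.defaults` is never set; torch >= 1.13 added the `self.defaults.setdefault('differentiable', False)` line to `Optimizer.__setstate__`, which older torch versions (where such wrappers ran fine) did not have. Below is a hedged sketch of one possible fix; the `EMA` class here is a guess at the shape of the repo's wrapper, not its actual code:

```python
import torch

# Hypothetical reconstruction of the failing wrapper: an EMA class that
# mirrors an inner optimizer. The real LFM/EMA.py may differ; this only
# illustrates why `defaults` can go missing and one way to restore it.
class EMA(torch.optim.Optimizer):
    def __init__(self, opt, ema_decay=0.999):
        self.ema_decay = ema_decay
        self.optimizer = opt
        # Wrappers that only copy `state`/`param_groups` from `opt` and
        # skip Optimizer.__init__ never get a `defaults` attribute, so
        # torch >= 1.13's __setstate__ raises the AttributeError above.
        # Running the base __init__ sets `defaults` (and the internal
        # bookkeeping newer torch versions also expect):
        super().__init__(opt.param_groups, defaults=dict(ema_decay=ema_decay))

params = [torch.nn.Parameter(torch.zeros(2))]
inner = torch.optim.Adam(params, lr=1e-3)
ema = EMA(inner)
ema.load_state_dict(inner.state_dict())  # round-trips without the AttributeError
```

On torch 2.0 specifically, simply assigning `self.defaults = opt.defaults` inside `EMA.__init__` may also be enough; alternatively, pinning torch below 1.13 sidesteps the `__setstate__` change entirely.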
Could you please kindly share the torch and accelerate versions you used for running the scripts? Thank you!