Fix _free_full_params()
I've been getting `AttributeError: 'FlatParameter' object has no attribute '_full_param_padded'`, triggered by `p._full_param_padded.record_stream(current_stream)`.

This adds a check so that `_free_full_params()` returns early instead of trying to free full params when none have been added.
hadasah authored Nov 21, 2023
1 parent 164cc0f commit c5f8df9
Showing 1 changed file with 2 additions and 0 deletions.
Showing 1 changed file: fairscale/nn/data_parallel/fully_sharded_data_parallel.py (2 additions, 0 deletions)

```diff
@@ -2096,6 +2096,8 @@ def _free_full_params(self, params: Optional[List[Parameter]] = None) -> None:
         """Free up storage for full parameters."""
         if params is None:
             params = self.params
+        if not self.has_full_params:
+            return
         self.has_full_params = False
         current_stream = torch.cuda.current_stream()
         for p in params:
```
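The guard works because `has_full_params` tracks whether a full-parameter gather has populated `_full_param_padded`; freeing before any gather would touch an attribute that was never created. The following is a minimal, self-contained sketch of that pattern — `FlatParam` and `Freer` are hypothetical stand-ins for illustration, not the fairscale classes, and the CUDA stream bookkeeping is omitted.

```python
class FlatParam:
    """Stand-in for FlatParameter: _full_param_padded only exists
    after a gather has populated it."""
    pass


class Freer:
    """Stand-in for the FSDP wrapper, showing only the guard logic."""

    def __init__(self, params):
        self.params = params
        self.has_full_params = False  # no gather has happened yet

    def gather_full_params(self):
        # Placeholder for the all-gather that creates the padded storage.
        for p in self.params:
            p._full_param_padded = object()
        self.has_full_params = True

    def free_full_params(self, params=None):
        if params is None:
            params = self.params
        if not self.has_full_params:
            # The added check: nothing was gathered, so there is
            # nothing to free and no _full_param_padded to touch.
            return
        self.has_full_params = False
        for p in params:
            # Without the early return above, this line would raise
            # AttributeError when no gather had ever happened.
            del p._full_param_padded


m = Freer([FlatParam()])
m.free_full_params()        # safe: early return, no AttributeError
m.gather_full_params()
m.free_full_params()        # frees the gathered storage
```

Calling `free_full_params()` twice in a row is also safe under this pattern, since the flag is cleared on the first successful free and the second call returns early.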
