
Add length and padding keywords to DistributedSampler #28841

Closed
Fix sampler __len__ method
Thiago Crepaldi committed Oct 29, 2019
commit e2ea2938cee4159e9ccda5e255df1da5590b8451
4 changes: 3 additions & 1 deletion torch/utils/data/distributed.py
@@ -73,7 +73,9 @@ def __iter__(self):
         return iter(indices)

     def __len__(self):
-        return self.num_samples
+        if self.padding:
+            return self.num_samples
+        return self._dataset_length
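A minimal sketch of the behavior this diff introduces, using a hypothetical stand-in class (the `padding` flag, `num_samples`, and `_dataset_length` names mirror the diff; the rounding logic is an assumption, not the PR's exact implementation):

```python
import math

class PaddedSamplerSketch:
    """Hypothetical sampler illustrating the __len__ change in this diff."""

    def __init__(self, dataset_len, num_replicas, rank, padding=True):
        self.num_replicas = num_replicas
        self.rank = rank
        self.padding = padding
        self._dataset_length = dataset_len
        # With padding, each replica reports the same rounded-up share,
        # so all ranks see an equal number of samples per epoch.
        self.num_samples = math.ceil(dataset_len / num_replicas)

    def __len__(self):
        # Mirrors the patched __len__: padded length when padding is on,
        # otherwise the raw dataset length as stored in _dataset_length.
        if self.padding:
            return self.num_samples
        return self._dataset_length
```

With 10 samples split across 4 replicas, the padded length is `ceil(10 / 4) == 3` per rank, while disabling padding reports the unpadded `_dataset_length` instead.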

     def set_rank(self, rank):
         assert rank >= 0, 'rank must be >= 0'