
Fix Large-BS Dataloader Bug #835

Merged
merged 6 commits on Mar 16, 2023
Conversation

Quentin-Anthony (Member) commented Mar 14, 2023

Currently, if train_iters * seqlen * gradient_accumulation_steps * micro_batch_size * world_size > 2147483647 (the int32 maximum), our dataloader's sample_idx overflows, leading to this cryptic error:

  File "torch/utils/data/_utils/collate.py", line 141, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: stack expects each tensor to be equal size, but got [2049] at entry 0 and [198705720] at entry 16

This happens once the torch dataloader reaches the overflowed sample_idx. Simply storing sample_idx as np.int64 instead of np.int32 would do the job, but would waste memory and disk on every run. Instead, I added a simple switch between the default int32 dataset builders and new int64 ones, and tested that it works.
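To make the failure mode concrete, here is a small sketch (with made-up but realistic run sizes, not taken from the PR) showing how the cumulative token index from such a run exceeds the int32 range, so an int32 sample_idx entry silently wraps to a garbage value:

```python
import numpy as np

# Hypothetical run configuration, chosen only to illustrate the overflow.
train_iters = 300_000
seqlen = 2048
gradient_accumulation_steps = 4
micro_batch_size = 4
world_size = 64

# Total tokens consumed over the run; sample_idx entries grow toward this.
total = (train_iters * seqlen * gradient_accumulation_steps
         * micro_batch_size * world_size)

int32_max = np.iinfo(np.int32).max  # 2147483647
print(total > int32_max)            # True: an int32 index would overflow

# Casting the out-of-range value down to int32 wraps it, producing the
# kind of nonsense offset that shows up as a bogus tensor size in collate.
wrapped = np.int64(total).astype(np.int32)
print(wrapped != total)             # True: the stored index is garbage
```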

@Quentin-Anthony Quentin-Anthony requested a review from a team as a code owner March 14, 2023 18:49
@Quentin-Anthony Quentin-Anthony marked this pull request as draft March 14, 2023 18:49
@Quentin-Anthony Quentin-Anthony marked this pull request as ready for review March 15, 2023 04:56
@Quentin-Anthony (Member, Author)

I went the easy route and just created two separate dataset-building functions (build_sample_idx_int32 and build_sample_idx_int64), and switch between them at build time depending on whether the number of samples would overflow int32.

Now we won't waste memory with int64 for the majority of runs, which don't overflow int32. Tested with both the overflowing and non-overflowing cases without issue.
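The selection logic described above can be sketched as follows. Note this is an illustrative stand-in, not the actual GPT-NeoX helpers: the function name, signature, and the exact bound computed here are assumptions; the PR's real builders are build_sample_idx_int32 and build_sample_idx_int64.

```python
import numpy as np

def build_sample_idx(num_samples, seq_length):
    """Illustrative sketch of the dtype switch: use int32 when every
    index fits, and fall back to int64 only for runs that would
    overflow int32."""
    # Upper bound on the values stored in sample_idx: roughly the total
    # number of tokens consumed across all samples (+1 for the shifted
    # target token per sample).
    max_index = num_samples * (seq_length + 1)
    dtype = np.int64 if max_index > np.iinfo(np.int32).max else np.int32
    sample_idx = np.zeros((num_samples + 1, 2), dtype=dtype)
    # ... fill sample_idx with (document index, offset) pairs, as the
    # real int32/int64 builders do ...
    return sample_idx

small = build_sample_idx(1_000, 2048)       # fits in int32
large = build_sample_idx(2_000_000, 2048)   # would overflow int32
print(small.dtype, large.dtype)
```

This keeps the common case at half the memory and disk footprint, paying the int64 cost only when the run size actually requires it.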

StellaAthena previously approved these changes Mar 16, 2023
2 participants