
Add data checkpoint #29

Closed
wants to merge 12 commits into from

Conversation

floatingbigcat
Collaborator

@floatingbigcat floatingbigcat commented Jun 20, 2023

Add a data checkpoint feature within one epoch, and also add support for openclipb32.
We split the total train shards into small sub-shard sets and train on one sub-shard set during each interval between checkpoints.
This imposes the following constraints (see the sketch after this list):

  • The number of shards in each sub-shard set must be larger than the total number of workers (number of workers times the data-parallel world size).
  • Checkpoints must be saved at a fixed, linear interval so that the shards can be split evenly.
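A minimal sketch of the even split under that worker constraint; `split_shards` and its parameters are hypothetical names for illustration, not code from this PR:

```python
# Hypothetical helper illustrating the shard-splitting constraint described above.
def split_shards(shard_paths, num_intervals, num_workers, dp_world_size):
    total_workers = num_workers * dp_world_size
    per_set = len(shard_paths) // num_intervals
    # Each sub-shard set must hold more shards than the total number of workers,
    # otherwise some workers would receive no data during an interval.
    assert per_set > total_workers, (
        f"sub-shard set size {per_set} is not larger than total workers {total_workers}"
    )
    # Linear (even) split: interval i trains on shards [i*per_set, (i+1)*per_set).
    return [shard_paths[i * per_set:(i + 1) * per_set] for i in range(num_intervals)]
```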

To regenerate the data iterator every time we switch to a new sub-shard set, I feed the dataloader into our train function and rebuild the data iterator from it whenever we hit the save iteration (see the sketch below).
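A minimal sketch of that rebuild inside the train loop, assuming a `build_dataloader` factory, a `model.train_step` method, and a `save_checkpoint` helper; these names are hypothetical stand-ins, not the code in this PR:

```python
def train(model, build_dataloader, sub_shard_sets, save_interval, total_iters):
    set_idx = 0
    data_iter = iter(build_dataloader(sub_shard_sets[set_idx]))
    for step in range(total_iters):
        batch = next(data_iter)
        model.train_step(batch)  # placeholder for the real forward/backward/update
        if (step + 1) % save_interval == 0:
            save_checkpoint(model, step)  # hypothetical checkpoint helper
            # Switch to the next sub-shard set and rebuild the data iterator
            # on top of the dataloader, as described above.
            set_idx = (set_idx + 1) % len(sub_shard_sets)
            data_iter = iter(build_dataloader(sub_shard_sets[set_idx]))
```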

@floatingbigcat floatingbigcat marked this pull request as draft June 21, 2023 08:00
@floatingbigcat floatingbigcat self-assigned this Jun 21, 2023
@floatingbigcat floatingbigcat marked this pull request as ready for review June 21, 2023 09:25
@floatingbigcat
Collaborator Author

We use a streaming dataset to support interleaved data input.
The webdataset-based solution will be abandoned.
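For illustration only, interleaved streaming input could look like the following, here using the Hugging Face `datasets` library; this may not be the library used in this PR, and the dataset names are placeholders:

```python
from datasets import load_dataset, interleave_datasets

# Placeholder dataset names; streaming=True yields examples lazily without
# downloading the full shards first.
ds_a = load_dataset("dataset_a", split="train", streaming=True)
ds_b = load_dataset("dataset_b", split="train", streaming=True)

# Mix the two streams with fixed sampling probabilities.
mixed = interleave_datasets([ds_a, ds_b], probabilities=[0.7, 0.3], seed=42)

for example in mixed.take(5):
    print(example)
```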
