3d Composability #129290
Conversation
🔗 Helpful Links: see artifacts and rendered test results at hud.pytorch.org/pr/129290
Note: links to docs will display an error until the docs builds have completed. ✅ No failures as of commit 96e413e with merge base 7b1988f. This comment was automatically generated by Dr. CI and updates every 15 minutes.
Review thread on test/distributed/_composable/test_composability/test_continuous.py (outdated, resolved)
time python test/run_test.py --verbose -i distributed/pipelining/test_composability.py
# 3D composability tests
time python test/run_test.py --verbose -i distributed/_composable/test_composability/test_noncontinuous.py
time python test/run_test.py --verbose -i distributed/_composable/test_composability/test_continuous.py
Could I ask why the new tests are called test_noncontinuous and test_continuous?
The tests in test_continuous are built on MultiProcContinousTest, hence the file name test_continuous. The other tests are based on the old-style test class. Since test_noncontinuous is not a great name for them, what do you think of test_composability_fsdp_tp?
It might be more actionable if the PR is broken down into 4 PRs, so each POC can comfortably review and accept the change. Are you interested in learning ghstack for this purpose? If it's too much trouble, we can discuss whether we need to collect stamps from each POC before landing.
Could you possibly help me with this 3D face app?
pytorch (fsdp, tp, pp) -> pytorch (composable)

Move the (fsdp, tp, pp) tests under pytorch into a composable folder:

TP:
test/distributed/tensor/parallel/test_ddp_2d_parallel.py
test/distributed/tensor/parallel/test_fsdp_2d_parallel.py

PP:
test/distributed/pipelining/test_composability.py

FSDP:
test/distributed/_composable/fsdp/test_fully_shard_training.py
- TestFullyShard2DTraining
- TestFullyShardHSDPTraining

=>
distributed/_composable/test_composability/test_noncontinuous.py
distributed/_composable/test_composability/test_continuous.py
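As a rough illustration of what "3D" means in these composability tests, here is a self-contained sketch (plain Python, not the actual torch.distributed DeviceMesh API) that maps a flat rank onto (pp, dp, tp) coordinates the way a 3D device mesh composes pipeline-, data-, and tensor-parallel groups. The dimension names and their ordering (tp fastest-varying) are assumptions for illustration only:

```python
def mesh_coords(rank: int, pp: int, dp: int, tp: int) -> tuple[int, int, int]:
    """Map a flat rank onto (pp, dp, tp) coordinates of a pp x dp x tp mesh.

    tp is the fastest-varying (innermost) dimension, pp the slowest.
    """
    assert 0 <= rank < pp * dp * tp, "rank out of range for this mesh"
    tp_idx = rank % tp
    dp_idx = (rank // tp) % dp
    pp_idx = rank // (tp * dp)
    return pp_idx, dp_idx, tp_idx


# 8 ranks arranged as a 2 x 2 x 2 mesh: each rank belongs to one
# pipeline stage, one data-parallel replica group, and one TP shard group.
for r in range(8):
    print(r, mesh_coords(r, pp=2, dp=2, tp=2))
```

Ranks that share a pp coordinate form a pipeline stage, ranks sharing (pp, dp) form a tensor-parallel group, and so on; the tests being moved here exercise FSDP, TP, and PP stacked along exactly such mesh dimensions.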
cc @mrshenli @pritamdamania87 @zhaojuanmao @satgera @gqchen @aazzolini @osalpekar @jiayisuse @H-Huang @kwen2501 @awgu @penguinwu @fegin @XilunWu @wanchaol @fduwjj @wz337 @tianyu-l @wconstab @yf225 @chauhang @d4l3k