
Does this work on Stable Diffusion 2.1 from the diffusers library? #1

Open
johnrachwan123 opened this issue May 21, 2023 · 3 comments

Comments

@johnrachwan123

No description provided.

@VainF
Owner

VainF commented May 21, 2023

Not yet, but we plan to support Stable Diffusion in the coming months. This repo currently works for Latent Diffusion Models (pruning only) and DDPMs (pruning + training).

@Shan-handsome

Shan-handsome commented Jul 23, 2023

Thanks to the author for this fantastic work.
I ran the code for LDM and hit a runtime error:
"RuntimeError: CUDA error: device-side assert triggered"
It occurred in the group-norm computation (local_norm = local_norm[idxs]).
Has anyone else run into this issue?

BTW, the command I ran is:

python ldm_prune.py \
    --model_path CompVis/ldm-celebahq-256 \
    --save_path run/pruned/ldm_celeba_pruned \
    --pruning_ratio 0.05 \
    --pruner magnitude \
    --device cuda:0 \
    --batch_size 4
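A common cause of a device-side assert at an indexing line like local_norm = local_norm[idxs] is an out-of-range index. Because CUDA errors are reported asynchronously, rerunning with CUDA_LAUNCH_BLOCKING=1 usually pins the assert to the exact line. As a sketch for narrowing it down (the helper name and bounds convention below are illustrative, not code from this repo), the offending indices can be validated against the indexed dimension's size before the gather:

```python
# Hypothetical helper to localize a device-side assert caused by
# out-of-range indices. Not part of the Torch-Pruning / ldm_prune.py code.

def check_gather_indices(idxs, dim_size):
    """Return the entries of idxs that fall outside the valid range
    for a dimension of size dim_size (negative indexing allowed,
    as in PyTorch: valid range is [-dim_size, dim_size))."""
    return [i for i in idxs if not (-dim_size <= i < dim_size)]

# Example: channel indices for a 64-channel norm layer.
print(check_gather_indices([0, 5, 63], 64))     # all valid -> []
print(check_gather_indices([0, 64, -65], 64))   # out of range -> [64, -65]
```

If the returned list is non-empty just before the failing line, the pruning index bookkeeping (e.g. channel indices computed for the group-norm layer) is the likely culprit rather than the kernel itself.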

@VainF

VainF commented Jul 23, 2023

Hi @Shan-handsome, thanks for your feedback. We will check it ASAP.
