InternalTorchDynamoError on KL Divergences #120497
Comments
@yanboliang I think this is something that probably can be fixed by modifying the skipfiles?
No, this is because we don't support arbitrary user-defined classes as dict keys. The complicated part is how to generate guards for them (see `torch/_dynamo/guards.py`, line 116 at commit `a358b23`).
I can send a fix for this.
Seems to work on main.
🐛 Describe the bug
PyTorch throws an `InternalTorchDynamoError` exception when compiling the computation of a KL divergence, regardless of the devices of the inputs. I am working with VAEs, so support for KL divergences would be a nice addition to PyTorch compilation. Thanks!

Error logs
trace.txt
Minified repro
Versions
env.txt
cc @ezyang @msaroufim @bdhirsh @anijain2305 @zou3519 @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @aakhundov @kadeng