
Is log_softmax needed for activations on GPU to use your Transducer Loss? cf https://github.com/HawkAaron/E2E-ASR/blob/04d416b1c32a8cbe55aa7527cfce25739339cbd5/model.py#L83 #7

Open
cweng6 opened this issue Jun 9, 2019 · 2 comments

cweng6 commented Jun 9, 2019

No description provided.

@cweng6 cweng6 changed the title Is log_softmax needed for activations on GPU to use your Transducer Loss? Is log_softmax needed for activations on GPU to use your Transducer Loss? cf https://github.com/HawkAaron/E2E-ASR/blob/04d416b1c32a8cbe55aa7527cfce25739339cbd5/model.py#L83 Jun 9, 2019
HawkAaron (Owner) commented

No need.

cweng6 commented Jun 10, 2019

Hmm, but in the latest version of the PyTorch bindings, it seems log_softmax is applied only on the CPU path; see:

https://github.com/HawkAaron/warp-transducer/blob/05c524ef3d8601ed48813b417c820cbdeb023b45/pytorch_binding/warprnnt_pytorch/__init__.py#L67
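
For reference, the behavior the linked line implements: the binding calls F.log_softmax on the activations only when they live on CPU, while the GPU kernel computes log_softmax internally. So the caller passes raw logits in both cases. A minimal usage sketch follows; the tensor shapes, dtypes, the default-constructed RNNTLoss, and moving the label/length tensors to CUDA are assumptions about the binding at that commit, not verbatim from this thread:

```python
import torch
from warprnnt_pytorch import RNNTLoss

# Default-constructed loss; constructor arguments (blank index, reduction)
# vary between versions of the binding, so this is a sketch only.
rnnt_loss = RNNTLoss()

B, T, U, V = 2, 50, 10, 29              # batch, frames, label length, vocab size (incl. blank)
acts = torch.randn(B, T, U + 1, V)      # raw logits: the caller does NOT apply log_softmax
labels = torch.randint(1, V, (B, U), dtype=torch.int32)    # blank (0) never appears in targets
act_lens = torch.full((B,), T, dtype=torch.int32)
label_lens = torch.full((B,), U, dtype=torch.int32)

if torch.cuda.is_available():
    # GPU path: log_softmax is fused into the CUDA kernel,
    # so the activations stay as plain logits.
    acts = acts.cuda()
    labels, act_lens, label_lens = labels.cuda(), act_lens.cuda(), label_lens.cuda()
# CPU path: the binding itself applies F.log_softmax to the activations,
# so the caller still passes plain logits either way.

loss = rnnt_loss(acts, labels, act_lens, label_lens)
```

This also explains the owner's "No need.": since log_softmax is idempotent (log_softmax of log-probabilities returns them unchanged), an explicit log_softmax before the GPU loss, as in the linked E2E-ASR code, is redundant rather than harmful.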
