
Prediction time is too slow #32

Open
GitVHV opened this issue Feb 10, 2023 · 2 comments

Comments

GitVHV commented Feb 10, 2023

I see an issue: text2mel uses only one CPU core, and mel2wave also runs entirely on the CPU. Do you have any suggestions for optimizing the processing time? Thank you so much!

@Approximetal

For the reason, see google/jax#2521.
I use the PyTorch version of HiFi-GAN instead of haiku+jax; it is much faster than before.
I'm trying to fix the text2mel part to solve the recompilation issue, but it isn't finished yet.
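The recompilation issue referenced above happens because `jax.jit` specializes the compiled function on the input's shape, so every new utterance length triggers a fresh compile. A minimal sketch of the usual workaround (all names here are hypothetical, not this repo's API): pad variable-length mel inputs to a small set of fixed bucket lengths, so only a bounded number of compilations ever occur.

```python
# Sketch only: pad inputs to fixed bucket lengths so jax.jit compiles
# at most once per bucket instead of once per input length.
import jax
import jax.numpy as jnp

BUCKETS = (64, 128, 256)  # hypothetical padded time lengths


def pad_to_bucket(x):
    # Round the time dimension up to the nearest bucket length.
    n = x.shape[0]
    target = next(b for b in BUCKETS if b >= n)
    return jnp.pad(x, ((0, target - n), (0, 0))), n


@jax.jit
def mel2wave_step(mel):
    # Stand-in for the real vocoder forward pass.
    return jnp.tanh(mel).sum(axis=-1)


mel = jnp.ones((100, 80))                # variable-length mel spectrogram
padded, true_len = pad_to_bucket(mel)    # padded to (128, 80)
wav = mel2wave_step(padded)[:true_len]   # drop the padded frames afterward
```

Any input whose length falls in the same bucket reuses the cached compilation, which removes the per-utterance compile cost at the price of some wasted compute on padding.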


GitVHV commented Apr 20, 2023

> For the reason, see google/jax#2521. I use the PyTorch version of HiFi-GAN instead of haiku+jax; it is much faster than before. I'm trying to fix the text2mel part to solve the recompilation issue, but it isn't finished yet.

I also tried running with PyTorch, but it seems the input of text2mel doesn't match. Please let me know when you get a handle on it. Looking forward to your reply.
