
Optimize inference with deep models #110

Open

ayushbaid opened this issue Mar 10, 2021 · 1 comment
Labels: longterm (Low priority issue to be revisited in the long term), objective (High level tasks to be accomplished)

Comments

@ayushbaid (Contributor)

Currently, models are stored as private variables of a class in __init__ and reused across functions. This causes trouble: the scheduler frequently locks up, and the approach fails even for moderately sized datasets.

The current hack is to read the model from disk every single time, but it's inefficient and slows the program down.

Dask's recommended strategy is to wrap the model in delayed and pass it to the function each time, so the model is serialized once rather than re-read per task. We will need to try that and benchmark compute properly.
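A minimal sketch of that strategy, assuming a Dask setup: the loader and the toy "model" (a plain dict) below are hypothetical stand-ins for the project's real model I/O, not its actual code.

```python
import dask
from dask import delayed

# Hypothetical loader: in the real project this would deserialize a
# deep model from disk; a dict stands in for it here.
def load_model():
    return {"weights": [1, 2, 3]}

@delayed
def run_inference(model, x):
    # Toy "inference": sum of weights plus the input value.
    return sum(model["weights"]) + x

# Wrap the model load in delayed ONCE. Dask then serializes the loaded
# model a single time and reuses it across tasks, instead of either
# (a) capturing it in a class instance that gets re-pickled per task,
# or (b) re-reading it from disk inside every function call.
model = delayed(load_model)()

results = dask.compute(*(run_inference(model, x) for x in range(3)))
print(results)  # (6, 7, 8)
```

The key point is that `model` is a `Delayed` node shared by all downstream tasks, so the scheduler sees one load instead of many redundant ones.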

@johnwlambert (Collaborator)

@akshay-krishnan akshay-krishnan added longterm Low priority issue to be revisited in the long term objective High level tasks to be accomplished labels Sep 1, 2021
@ayushbaid ayushbaid self-assigned this Apr 2, 2023
Projects: None yet
Development: No branches or pull requests
3 participants