Out of memory error using Matrix.get with many observations #465
Comments
Thanks @nikobenho. I've considered swapping to sparse storage for a while now because it handles really large cases better (or at least using sparse for a few of the more critical pinch points).
Maybe a similar issue(?), but I am hitting memory issues when I try to build a parcov using an .unc file with about 150,000 pars (small sample of the file below):
Curious if there is an easy way to build these as a bunch of smaller "blocks" and then combine them into a block-diagonal matrix to get around the memory issue? Or if that is already what is being done and I am just out of luck.
The pyemu helpers already do the group/block-based side-stepping trick to avoid having to form that full matrix for drawing realizations. The problem is that the Cov object holds the matrix as dense in memory, so it's going to be huge and mostly zeros...
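To illustrate the block-diagonal idea from the question above (toy group sizes, not pyemu's actual machinery), `scipy.sparse.block_diag` can assemble per-group covariance blocks without ever forming the full dense matrix:

```python
import numpy as np
import scipy.sparse as sparse

# Hypothetical example: stand-in sizes for a few parameter groups.
group_sizes = [3, 2, 4]

# Build one small dense SPD block per group (deterministic toy values).
blocks = [np.full((n, n), 0.1) + n * np.eye(n) for n in group_sizes]

# Assemble the block-diagonal covariance in sparse form: only the
# block entries are stored, the off-block zeros never materialize.
cov = sparse.block_diag(blocks, format="csr")

print(cov.shape)  # (9, 9)
print(cov.nnz)    # 29 stored entries = 3*3 + 2*2 + 4*4
```

For 150,000 parameters split into groups, the stored entries scale with the sum of squared group sizes rather than the square of the total parameter count.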
I ran into an out-of-memory error using the `Matrix.get` method in a case where I have a lot of zero-weighted observations that I would like to carry through history matching as metadata. It looks like the error occurs when `np.diag` is called. It seems modifying the method to use `scipy.sparse.diags` can offer a workaround.
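A rough sketch of the difference (array sizes here are illustrative, not from the issue): `np.diag` materializes a full dense n-by-n array, while `scipy.sparse.diags` stores only the diagonal.

```python
import numpy as np
import scipy.sparse as sparse

n = 1_000_000  # many observations, mostly zero-weighted

weights = np.zeros(n)
weights[:10] = 1.0  # only a few carry nonzero weight

# np.diag(weights) would allocate an n-by-n float64 array,
# i.e. roughly 8 * n**2 bytes (terabytes here) -- the MemoryError.

# Sparse workaround along the lines suggested in the issue:
w = sparse.diags(weights, format="csr")

print(w.shape)  # (1000000, 1000000) without the dense allocation
```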
Comparing the methods using a smaller number of observations (allowing the use of both `get` and `spget`):

I wanted to bring this to attention because quite a lot of use cases seem to rely on this method, although I'm not sure whether this modification has any unforeseen impact further down the line.
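For a sense of scale, a comparison at a size where the dense version still fits in memory (sizes and weights are illustrative, not the issue's actual case):

```python
import numpy as np
import scipy.sparse as sparse

n = 5_000  # small enough that the dense version still fits

weights = np.zeros(n)
weights[::100] = 2.0  # sparse nonzero weights

dense = np.diag(weights)                  # dense n-by-n diagonal matrix
sp = sparse.diags(weights, format="csr")  # sparse equivalent

# Same values either way.
assert np.allclose(dense, sp.toarray())

# Very different memory footprints.
dense_bytes = dense.nbytes
sparse_bytes = sp.data.nbytes + sp.indices.nbytes + sp.indptr.nbytes
print(f"dense:  {dense_bytes:,} bytes")   # 8 * n**2 = 200,000,000
print(f"sparse: {sparse_bytes:,} bytes")
```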