Selective Sampler memory error #40
Thanks for the feedback, the sampler is calling
I have the most recent version of scipy. I tried a smaller spatial size, and the batch size was already set to 1. Currently it is running a smaller-size run after I changed label_normalisation to True (spatial window size (32, 32, 32)). However, it is terribly slow: it is trying to find locations with a sample_per_volume of 1. Is there anything more I can do to at least speed this up? It's just computing these sampling locations over and over again. I'm guessing this is normal, but it takes ages. I added the output of the computation: `final calculated for value in list_labels`
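Since the candidate locations depend only on the label volume, the window size, and the sampling constraint, one workaround for the repeated computation is to cache the result per subject and reuse it across iterations. The helper below is purely illustrative and not part of the NiftyNet API; `compute_fn` stands in for the expensive candidate-index search:

```python
import numpy as np

# Hypothetical memoisation helper: compute candidate sampling locations
# once per subject and reuse them, instead of recomputing every epoch.
_candidate_cache = {}

def cached_candidates(subject_id, compute_fn):
    # compute_fn() performs the expensive candidate-index search
    # (e.g. the fftconvolve-based counting in sampler_selective.py).
    if subject_id not in _candidate_cache:
        _candidate_cache[subject_id] = compute_fn()
    return _candidate_cache[subject_id]

calls = []
def expensive_search():
    calls.append(1)                  # track how often we actually compute
    return np.array([10, 42, 99])    # dummy candidate indices

first = cached_candidates("sub01", expensive_search)
second = cached_candidates("sub01", expensive_search)  # served from cache
```

This only helps when the labels and constraints are fixed during training; if augmentation changes the label volume per iteration, the cache would have to be invalidated.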
Dear NiftyNet,
I am running a selective sampler with VNet and I get the following error message:
```
INFO:niftynet: Parameters from random initialisations ...
list labels is (0, 1)
262144 0.001
list labels is (0, 1)
262144 0.001
list labels is (0, 1)
262144 0.001
list labels is (0, 1)
262144 0.001
final calculated for value in list_labels
final calculated for value in list_labels
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/niftynet/engine/image_window_buffer.py", line 143, in _push
    for output_dict in self():
  File "/usr/local/lib/python3.5/dist-packages/niftynet/engine/sampler_uniform.py", line 83, in layer_op
    self.window.n_samples)
  File "/usr/local/lib/python3.5/dist-packages/niftynet/contrib/segmentation_selective_sampler/sampler_selective.py", line 81, in spatial_coordinates_function
    win_sizes['label'], data['label'], self.constraint)
  File "/usr/local/lib/python3.5/dist-packages/niftynet/contrib/segmentation_selective_sampler/sampler_selective.py", line 183, in candidate_indices
    counts_window = fftconvolve(seg_label, window_ones, 'same')
  File "/usr/local/lib/python3.5/dist-packages/scipy/signal/signaltools.py", line 393, in fftconvolve
    ret = (np.fft.irfftn(sp1 * sp2, fshape)[fslice].copy())
  File "/home/ubuntu/.local/lib/python3.5/site-packages/numpy/fft/fftpack.py", line 1231, in irfftn
    a = ifft(a, s[ii], axes[ii], norm)
  File "/home/ubuntu/.local/lib/python3.5/site-packages/numpy/fft/fftpack.py", line 288, in ifft
    return output * (1 / (sqrt(n) if unitary else n))
MemoryError
```
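The memory use comes from how `fftconvolve` works: it zero-pads both inputs to the combined shape (rounded up to an FFT-friendly size) and allocates several full-size float and complex intermediates. The rough estimate below, which assumes a hypothetical 512³ label volume (not stated in the issue), shows why even one complex intermediate is already multiple GiB:

```python
import numpy as np
from scipy.fft import next_fast_len

# Rough estimate of the padded FFT shape that scipy.signal.fftconvolve
# allocates internally. The 512^3 volume size is an assumption for
# illustration; substitute your actual label-volume shape.
vol_shape = (512, 512, 512)   # hypothetical input label volume
win_shape = (128, 128, 128)   # window of ones used by the sampler

# fftconvolve pads each axis to at least (len1 + len2 - 1), rounded up
# to a fast FFT length.
fshape = [next_fast_len(v + w - 1) for v, w in zip(vol_shape, win_shape)]
n_voxels = np.prod(fshape, dtype=np.int64)

# A complex128 array of that shape costs 16 bytes per voxel; fftconvolve
# holds several such buffers at once, so multiply this by a small factor.
gib = n_voxels * 16 / 2**30
print(fshape, f"~{gib:.1f} GiB per complex intermediate")
```

A few concurrent buffers of this size, plus the network itself, can plausibly exhaust 50 GB, which is why shrinking only the window (and not the padded volume) changes little.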
I have 50 GB of memory, but for some reason this isn't enough. I use the following relevant hyperparameters:
```
spatial_window_size = (128, 128, 128)   # training and validation
optimiser = adam
sample_per_volume = 1
lr = 0.0001
loss_type = GDSC
image = volume
label = segmentation
output_prob = True
num_classes = 2
label_normalisation = False
rand_samples = 0
min_numb_labels = 1
min_sampling_ratio = 0.001
compulsory_labels = 0, 1
```
This error does not occur only for VNet; I've been trying to get the selective sampler to work on other networks as well. I use the large spatial window size because the amount of foreground is small compared to the background (which is also the reason I use the selective sampler). Decreasing the spatial window size to (32, 32, 32) did not change anything and resulted in the same error.
Can you explain to me why so much memory is used and what I can do to prevent this error?
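Because the kernel here is a box of ones, the same `counts_window` result can be obtained without any FFT padding at all, e.g. with a separable box filter whose working memory stays at the size of the input volume. The sketch below is an illustration of that idea, not a NiftyNet patch; note that for even window sizes the box centring may differ from `fftconvolve(..., 'same')` by one voxel:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def window_counts(seg_label, win_shape):
    """Count foreground voxels per sliding window of ones.

    uniform_filter computes the mean over the box; multiplying by the
    box volume recovers the count. mode='constant' zero-pads, matching
    the behaviour of a 'same'-mode convolution at the borders.
    """
    mean = uniform_filter(seg_label.astype(np.float64),
                          size=win_shape, mode="constant")
    return np.rint(mean * np.prod(win_shape))

# Tiny example: a single foreground voxel in a 16^3 label map.
seg = np.zeros((16, 16, 16))
seg[8, 8, 8] = 1.0
counts = window_counts(seg, (4, 4, 4))
```

Every 4×4×4 window that covers the lone foreground voxel reports a count of 1, and there are 64 such window centres, so the filter reproduces the convolution-with-ones result while allocating only input-sized buffers.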