This repository has been archived by the owner on Mar 17, 2021. It is now read-only.

Cannot perform evaluation over PROMISE12 dataset #332

Open
mariamhrr opened this issue Mar 20, 2019 · 3 comments
Comments


mariamhrr commented Mar 20, 2019

Hello, I have trained my segmentation network on the PROMISE12 dataset using dense_vnet, but when I try to evaluate my inferred volumes against the PROMISE12 ground truth, I get this error:

File "C:\Users\NiftyNet\niftynet\evaluation\segmentation_evaluations.py", line 160, in metric_from_binarized
conf_mat = np.array([[np.sum(land(lnot(seg), lnot(ref))),
ValueError: operands could not be broadcast together with shapes (320,320,20,1,1) (512,512,54,1,1)

I understand that this is because the PROMISE12 files have shape (512,512,54,1,1) while my inferred volumes have shape (320,320,20,1,1). Shouldn't the evaluation resize them automatically? It is also strange to me that the inferred volumes have shape (320,320,20,1,1) when I specified (64,64,64) in my config.ini file. Thanks for the help!
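As a workaround until the shapes are handled for you, the inferred labels can be resampled onto the reference grid before computing overlap metrics. Below is a minimal sketch, assuming the volumes can be loaded as NumPy arrays (e.g. via nibabel); `resample_to_reference` and `dice` are illustrative helpers, not NiftyNet APIs:

```python
# Workaround sketch (not part of NiftyNet): resample the inferred
# segmentation to the reference grid so the two arrays broadcast.
# Nearest-neighbour interpolation (order=0) preserves integer labels.
import numpy as np
from scipy.ndimage import zoom

def resample_to_reference(seg, ref_shape):
    """Resample a label volume `seg` to `ref_shape` with nearest neighbour."""
    factors = [r / s for r, s in zip(ref_shape, seg.shape)]
    return zoom(seg, factors, order=0)

def dice(seg, ref):
    """Dice coefficient between two binary volumes of equal shape."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    return 2.0 * np.sum(seg & ref) / (np.sum(seg) + np.sum(ref))

# Stand-in for an inferred volume at the network's output resolution:
seg = np.zeros((320, 320, 20), dtype=np.uint8)
seg[100:200, 100:200, 5:15] = 1

# Resample to the PROMISE12 ground-truth shape before evaluating:
seg_resampled = resample_to_reference(seg, (512, 512, 54))
print(seg_resampled.shape)  # (512, 512, 54)
```

Note that resampling in array space ignores the NIfTI affine; if the two volumes also differ in orientation or spacing, a proper spatial resampling tool should be used instead.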

P.S. Here is my config file:

[seg]
path_to_search = C:/Users/INFERENCE/5000
filename_contains = _niftynet_out
filename_not_contains =
spatial_window_size = (64,64,64)
axcodes=(S,L,P)
interp_order = 3
[label]
path_to_search = C:/Users/Promise12_LABELS
filename_contains = Case, _segmentation
filename_not_contains =
spatial_window_size = (64,64,64)
axcodes=(S,L,P)
interp_order = 0

[SYSTEM]
cuda_devices = ""
num_threads = 2
num_gpus = 1
model_dir = C:/Users/models

[NETWORK]
name = vnet
activation_function = prelu
batch_size = 1
volume_padding_size = 0
histogram_ref_file = ../standardisation_models.txt
norm_type = percentile
cutoff = (0.01, 0.99)
normalisation = True
whitening = True
normalise_foreground_only=True
foreground_type = otsu_plus
multimod_foreground_type = and
window_sampling = resize
queue_length = 8

[INFERENCE]
save_seg_dir = ./INFERENCE/5000
output_interp_order = 3
spatial_window_size = (64,64,64)
inference_iter = 5000

[EVALUATION]
save_csv_dir = C:/Users/models
evaluations = dice,jaccard

[SEGMENTATION]
inferred = seg
label = label
output_prob = False
num_classes = 2
label_normalisation = True

@mariamhrr mariamhrr changed the title Cannot perform evaluation on PROMISE12 dataset Cannot perform evaluation over PROMISE12 dataset Mar 20, 2019
@shanpriya3
I am facing the same issue with 3D U-Net for two-class segmentation. Could someone help? Thanks!

@shanpriya3

Here is my configuration file:

############################ input configuration sections
[ct]
path_to_search = ./data/ct/
filename_contains = CT
spatial_window_size = (96, 96, 96)
pixdim = (1.0, 1.0, 1.0)
interp_order = 3
axcodes=(A, R, S)

[seg]
csv_file = ./models/unetH/segmentation_output/inference/inferred.csv
path_to_search = ./models/unetH/segmentation_output/inference
filename_contains = niftynet
spatial_window_size = (96, 96, 96)
interp_order = 0

[label]
path_to_search = ./data/ct/
filename_contains = Label
spatial_window_size = (96, 96, 96)
pixdim = (1.0, 1.0, 1.0)
interp_order = 0
axcodes=(A, R, S)

############################## system configuration sections
[SYSTEM]
cuda_devices = ""
num_threads = 2
num_gpus = 2
model_dir = models/unetH
queue_length = 36

[NETWORK]
name = unet
activation_function = prelu
batch_size = 1
decay = 0
reg_type = L2

############################ volume level preprocessing

volume_padding_size = 44
histogram_ref_file = ./models/unetH/histogram_ref_file.txt
norm_type = percentile
cutoff = (0.01, 0.99)
normalisation=True
whitening = True
normalise_foreground_only=True
foreground_type = otsu_plus
window_sampling = uniform

[TRAINING]
sample_per_volume = 32
rotation_angle = (-10.0, 10.0)
scaling_percentage = (-10.0, 10.0)
random_flipping_axes=0,1
lr = 0.001
loss_type = Dice
starting_iter = 0
save_every_n = 500
max_iter = 1000
validation_every_n = 10
validation_max_iter = 5
exclude_fraction_for_validation = 0.2
exclude_fraction_for_inference = 0.1

[INFERENCE]
border = (44, 44, 44)
inference_iter = 1000
output_interp_order = 0
spatial_window_size = (96, 96, 96)
save_seg_dir = ./segmentation_output/inference
dataset_to_infer = inference

[EVALUATION]
save_csv_dir = ./data/evaluation
evaluations = dice,jaccard,false_positive_rate,positive_predictive_values,n_pos_ref,n_pos_seg
evaluation_units = foreground

############################ custom configuration sections
[SEGMENTATION]
image = ct
inferred = seg
label = label
label_normalisation = True
output_prob = False
num_classes = 2

Here is the error:

File "/anaconda3/envs/NiftyNet/lib/python3.6/site-packages/niftynet/evaluation/segmentation_evaluations.py", line 160, in metric_from_binarized
conf_mat = np.array([[np.sum(land(lnot(seg), lnot(ref))),
ValueError: operands could not be broadcast together with shapes (600,600,138,1,1) (546,546,338,1,1)
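Both tracebacks are the same failure: the inferred and reference volumes have different shapes, so NumPy cannot broadcast them inside the confusion-matrix computation. This is easy to reproduce in isolation; a minimal sketch follows, where `land`/`lnot` are assumed to be `np.logical_and`/`np.logical_not` as the traceback suggests:

```python
# Reproduce the broadcast failure from segmentation_evaluations.py:
# volumes of different shapes cannot be combined voxel-wise.
import numpy as np

land, lnot = np.logical_and, np.logical_not  # names as in the traceback

seg = np.zeros((600, 600, 138, 1, 1), dtype=bool)  # stand-in inferred volume
ref = np.zeros((546, 546, 338, 1, 1), dtype=bool)  # stand-in label volume

try:
    np.sum(land(lnot(seg), lnot(ref)))
except ValueError as err:
    print(err)  # operands could not be broadcast together with shapes ...
```

So the fix is upstream of evaluation: the inferred outputs must be written on (or resampled to) the same grid as the labels before the metrics are computed.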

@shanpriya3
Copy link

Hello, any update on this issue?
