
3 tests failing #292

Closed
romainVala opened this issue Aug 26, 2020 · 5 comments
Labels
bug Something isn't working

Comments

@romainVala
Contributor

🐛 Bug

After an update of my Python packages, I now get errors on the following tests:

FAIL: test_with_zero_intensity (tests.transforms.augmentation.test_random_spike.TestRandomSpike)
FAIL: test_with_zero_spike (tests.transforms.augmentation.test_random_spike.TestRandomSpike)
FAIL: test_no_movement (tests.transforms.augmentation.test_random_motion.TestRandomMotion)

To reproduce

numpy 1.19.1 pypi_0 pypi
numpy-base 1.17.2 py37hde5b4d6_0
numpydoc 1.1.0 py_0

# Your code here
import numpy as np

array = np.random.uniform(0, 40, 100)
transformed = np.fft.fftn(array)
fshift = np.fft.fftshift(transformed)
f_ishift = np.fft.ifftshift(fshift)
img_back = np.fft.ifftn(f_ishift)
np.max(array - np.abs(img_back))
# Your errors here
1.4210854715202004e-14

Expected behavior

It is related to the FFT and inverse FFT, which no longer return exactly the same numbers after a round trip.

A solution would be to replace assert_array_equal with np.allclose (or numpy.testing.assert_allclose).

But maybe it is due to something wrong in my Python environment...

TorchIO version
0.17.34
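The tolerance-based check suggested above can be sketched as follows; this is a minimal example with a synthetic array, not the actual TorchIO test code:

```python
import numpy as np

# Round-trip a random array through FFT and inverse FFT,
# as in the reproduction snippet above.
array = np.random.uniform(0, 40, 100)
spectrum = np.fft.fftshift(np.fft.fftn(array))
img_back = np.abs(np.fft.ifftn(np.fft.ifftshift(spectrum)))

# assert_array_equal would fail on the ~1e-14 round-trip error;
# assert_allclose passes within an explicit tolerance.
np.testing.assert_allclose(array, img_back, rtol=1e-7, atol=1e-10)
```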

romainVala added the bug label Aug 26, 2020
@GReguig
Contributor

GReguig commented Aug 30, 2020

Hello,

If I may add a similar bug that sometimes happens when running the tests with pytest:

FAILED transforms/test_transforms.py::TestTransforms::test_transforms_array

Coming from the bias field generation transform when a division by 0 occurs

@staticmethod
    def generate_bias_field(
            data: TypeData,
            order: int,
            coefficients: TypeData,
            ) -> np.ndarray:
        # Create the bias field map using a linear combination of polynomial
        # functions and the coefficients previously sampled
        shape = np.array(data.shape[1:])  # first axis is channels
        half_shape = shape / 2
    
        ranges = [np.arange(-n, n) for n in half_shape]
    
        bias_field = np.zeros(shape)
        x_mesh, y_mesh, z_mesh = np.asarray(np.meshgrid(*ranges))
    
>       x_mesh /= x_mesh.max()
E       FloatingPointError: divide by zero encountered in true_divide

../torchio/transforms/augmentation/intensity/random_bias_field.py:94: FloatingPointError
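The hypothesis that a very small dimension triggers this error can be sketched as follows. The shape [2, 4, 4] is an assumption for illustration (e.g. after aggressive downsampling), and the mesh construction mirrors the traceback above, not the exact TorchIO source:

```python
import numpy as np

# Hypothetical shape where one spatial axis has been shrunk to 2 voxels.
shape = np.array([2, 4, 4])
half_shape = shape / 2
ranges = [np.arange(-n, n) for n in half_shape]
x_mesh, y_mesh, z_mesh = np.asarray(np.meshgrid(*ranges))

# np.arange(-1, 1) is [-1., 0.], so the maximum along that axis is 0.
print(x_mesh.max())  # 0.0

try:
    with np.errstate(divide='raise'):
        x_mesh = x_mesh / x_mesh.max()  # division by zero
except FloatingPointError as exc:
    print('FloatingPointError:', exc)
```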

@fepegar
Owner

fepegar commented Sep 16, 2020

Thank you both for reporting.


@romainVala

I agree that assert_array_almost_equal should be used for these types of tests. However, I think a more efficient solution for us would be to leave the input as it was for the cases tested in those tests. What do you and @GFabien think?


@GReguig

I think this is unrelated. It looks like RandomDownsample is making some dimensions 1 and RandomBiasField is not happy. I'll open a new issue and investigate. What version of TorchIO are you using?

fepegar added a commit that referenced this issue Sep 16, 2020
@GFabien
Contributor

GFabien commented Sep 17, 2020

I agree that assert_array_almost_equal should be used for these types of tests. However, I think a more efficient solution for us would be to leave the input as it was for the cases tested in those tests. What do you and @GFabien think?

Actually, I'm not sure about that. On the one hand, it's more efficient, and the output will truly equal the input if we leave the input untouched when transforms are supposed to have no effect. On the other hand, the current approach shows that the transform behaves properly by not altering the input (or almost not) when it is supposed to have no effect. What do you think?

@fepegar
Owner

fepegar commented Sep 17, 2020

I think you are right. What's the point of using 0 intensity or 0 spikes anyway? I will just switch to assert_array_almost_equal in the tests.
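For reference, a minimal sketch of how numpy.testing.assert_array_almost_equal tolerates this kind of error (synthetic values, not the actual test code):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = a + 1e-14  # tiny FFT-style round-trip error

# array_equal is exact and reports a mismatch...
print(np.array_equal(a, b))  # False

# ...while assert_array_almost_equal passes: the default decimal=6
# accepts absolute differences below 1.5e-6.
np.testing.assert_array_almost_equal(a, b)
```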

@fepegar
Owner

fepegar commented Sep 17, 2020

Fixed in v0.17.35.
