Seeding in Transform, make the same transform all the time #191

Closed
romainVala opened this issue Jun 15, 2020 · 6 comments · Fixed by #353
Labels
documentation Improvements or additions to documentation

Comments

@romainVala
Contributor

🐛 Bug
I noticed it at least for RandomAffine, RandomElastic and RandomNoise.

To reproduce

# Compose, RandomNoise and ImagesDataset come from torchio;
# `suj` is a list of subjects defined elsewhere.
t = Compose([RandomNoise(seed=10)])
dataset = ImagesDataset(suj, transform=t)

for i in range(10):
    sample = dataset[0]
    h = sample.history
    print(h[0][1]['image']['std'])  # prints the same std on every iteration

Now change seed=10 to seed=None and you get a different number each time.

I think it is due to this line

it should not be called here, just once in the __init__()
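
A schematic of the difference being described, using Python's random module rather than TorchIO's actual code (class names are made up for illustration):

import random

class SeededEveryCall:
    # Re-seeding inside __call__ means every call returns the same value.
    def __init__(self, seed):
        self.seed = seed

    def __call__(self):
        random.seed(self.seed)
        return random.random()

class SeededOnce:
    # Seeding once at construction gives a reproducible *sequence* of values.
    def __init__(self, seed):
        random.seed(seed)

    def __call__(self):
        return random.random()

t = SeededEveryCall(10)
print(t(), t())  # the same number twice
t = SeededOnce(10)
print(t(), t())  # two different numbers, but the same pair on every run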

@romainVala romainVala added the bug Something isn't working label Jun 15, 2020
@romainVala
Contributor Author

Maybe it was intended this way, and it is my understanding of seeding that differs ...

I thought it would produce a different number on each call (but if I ran it again, I would get the same numbers).

@fepegar
Owner

fepegar commented Jun 15, 2020

Hi @romainVala,

The point of setting the seed is to get the same results every time. Maybe a better name for the argument would be random_state: https://scikit-learn.org/stable/glossary.html#term-random-state.

@fepegar
Owner

fepegar commented Jun 15, 2020

Here's some more information about reproducibility in PyTorch: https://pytorch.org/docs/stable/notes/randomness.html

@romainVala
Contributor Author

Hmm, but if you get the same result on every call, it is no longer a random transform ...
My understanding of reproducibility is getting the same sequence of random transforms, no?

@fepegar
Owner

fepegar commented Jun 15, 2020

AFAIK random numbers in the libraries we use are generated using pseudorandom number generators. I suppose they're actually pseudorandom transforms :)

In [1]: import torch

In [3]: torch.rand(1)
Out[3]: tensor([0.0833])

In [4]: torch.rand(1)
Out[4]: tensor([0.7058])

In [5]: torch.rand(1)
Out[5]: tensor([0.8763])

In [6]: torch.manual_seed(42)
Out[6]: <torch._C.Generator at 0x7f01b881f3d0>

In [7]: torch.rand(1)
Out[7]: tensor([0.8823])

In [8]: torch.rand(1)
Out[8]: tensor([0.9150])

In [9]: torch.rand(1)
Out[9]: tensor([0.3829])

In [10]: torch.manual_seed(42)
Out[10]: <torch._C.Generator at 0x7f01b881f3d0>

In [11]: torch.rand(1)
Out[11]: tensor([0.8823])

In [12]: torch.rand(1)
Out[12]: tensor([0.9150])

In [13]: torch.rand(1)
Out[13]: tensor([0.3829])

You can set a global (torch) seed once at the beginning of your training to make your runs reproducible. If you want, for some reason, to generate the same transform with the same parameters twice, you can use the seed kwarg of the transform.
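
A minimal sketch of the two options, reusing Compose, RandomNoise and the suj subject list from the snippet at the top of this issue (seed values are arbitrary):

import torch

# Option 1: seed the global torch RNG once at the start of training.
# Each call to the transform samples new parameters, but the run as a
# whole is reproducible.
torch.manual_seed(42)
training_transform = Compose([RandomNoise()])

# Option 2: pass the seed kwarg to the transform itself. Every call then
# reuses the same parameters, i.e. you get the exact same transform.
fixed_transform = Compose([RandomNoise(seed=10)])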

In this answer there is some more info.

@romainVala
Contributor Author

OK, so it was my understanding of the seed keyword in the transforms that confused me (you should describe this behavior more explicitly in the docs).

I do not really see any case where this may be useful ...
(I had a use case where I wanted a specific transform, like a rotation of 10 degrees, and I used RandomAffine with degrees=(10, 10) for that ... )
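
For reference, a sketch of that workaround, assuming the usual import path for the transform and the example value above:

from torchio.transforms import RandomAffine

# Collapsing the random range to a single value pins the sampled angle,
# so the "random" rotation is always 10 degrees.
fixed_rotation = RandomAffine(degrees=(10, 10))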

Anyway, as you prefer:
I just need to remove the seed argument from all my transforms (and redo the training from this weekend ... )

thanks

@fepegar fepegar added documentation Improvements or additions to documentation and removed bug Something isn't working labels Jun 15, 2020