
update #1

Merged
merged 155 commits on Feb 27, 2018

Conversation


@cedias cedias commented Feb 27, 2018

No description provided.

bmccann and others added 30 commits February 7, 2017 07:56
Related to #72
Is this information available in docs?
* remove unused import statements

* remove unused variable and arguments
With flush, log info can appear immediately when it is directed to a pipe or file.
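The flush behaviour can be sketched in plain Python (a standalone sketch, not the repo's actual logging code; the `log` helper is hypothetical):

```python
import sys

def log(msg, stream=sys.stderr):
    # flush=True forces the buffer out immediately, so log lines appear
    # as soon as they are written, even when the stream is block-buffered
    # because it is redirected to a pipe or file.
    print(msg, file=stream, flush=True)

log("epoch 1: loss=0.42")
```

Without `flush=True`, a redirected stream is typically block-buffered, so output can lag until the buffer fills or the process exits.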
* command-line backwards-compatible fix for models.py

* changes formatting to accommodate a 120-character line width
desimone and others added 29 commits August 8, 2017 00:34
The batch size for the training set was erroneously used for instantiating the DataLoader class for the test set.
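The effect of that bug can be sketched without torch; `make_batches` below is a hypothetical stand-in for DataLoader's batching, and the sizes are MNIST-like illustrations:

```python
def make_batches(dataset, batch_size):
    # Split a sequence into consecutive batches, like a minimal DataLoader.
    return [dataset[i:i + batch_size] for i in range(0, len(dataset), batch_size)]

test_set = list(range(10000))   # e.g. 10k test samples
train_batch_size = 64           # intended for the training loader only
test_batch_size = 1000          # intended for the test loader

buggy = make_batches(test_set, train_batch_size)  # train size reused: 157 small batches
fixed = make_batches(test_set, test_batch_size)   # intended size: 10 large batches
```

The model still sees every test sample either way; the bug only changes how the evaluation is chunked, which matters for speed and for any per-batch bookkeeping.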
As is, the final network output is modulated by a tanh nonlinearity.
This is undesirable: as a simple, realistic fix, we add a final
linear layer.
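Why the trailing tanh is undesirable: tanh squashes every output into (-1, 1), so the network can never emit values outside that range, while a final linear layer is unbounded. A stdlib illustration (the `final_linear` weights are made-up stand-ins for learned parameters):

```python
import math

# tanh saturates: even a huge pre-activation maps into (-1, 1].
h = math.tanh(1000.0)

def final_linear(h, w=2.5, b=0.3):
    # Hypothetical final linear layer: w and b would be learned in the
    # real model; its output is not confined to (-1, 1).
    return w * h + b

y = final_linear(h)  # can land outside tanh's range
```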
* Fix indentation to be self-consistent

Replace 2-space with 4-space indentation

* Fix indentation to be self-consistent

Replace 2-space with 4-space indentation

* Fix indentation

Replace 3-space indentation with 4-space indentation
* Fix UserWarning

This fixes the following warning in mnist/main.py

src/torch_mnist.py:68: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.

Performance is unaffected.

* Fix UserWarning in mnist_hogwild

In this case, dim=1 because the input tensor x has ndim=2.

See _get_softmax_dim in https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py
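What `dim=1` means for a 2-D input can be checked with a small stdlib log-softmax (a sketch of the math, not torch's implementation):

```python
import math

def log_softmax_dim1(rows):
    # Normalizes over dim=1 (the last axis): for an input of shape
    # (batch, num_classes), each row becomes a log-probability vector.
    out = []
    for row in rows:
        m = max(row)  # subtract the row max for numerical stability
        log_z = m + math.log(sum(math.exp(v - m) for v in row))
        out.append([v - log_z for v in row])
    return out

logits = [[1.0, 2.0, 3.0], [0.5, -0.5, 2.0]]
log_probs = log_softmax_dim1(logits)
# Exponentiating any row yields probabilities that sum to 1.
```

Normalizing over `dim=0` instead would mix log-probabilities across examples in the batch, which is why the explicit `dim=1` matters here.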
* smooth_l1_loss now requires shapes to match
 * once scalars are enabled we must torch.stack() instead of
   torch.cat() a list of scalars
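The stack-vs-cat distinction can be sketched directly in torch (a minimal sketch, assuming losses come back as 0-dim scalar tensors):

```python
import torch

# Per-step losses are now 0-dim scalar tensors.
losses = [torch.tensor(0.5), torch.tensor(1.5)]

# torch.stack adds a new leading dimension, giving shape (2,).
total = torch.stack(losses)
mean = total.mean()

# torch.cat(losses) would raise here: cat concatenates along an
# existing dimension, and these scalars have ndim == 0.
```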
@cedias cedias merged commit 85c6197 into cedias:master Feb 27, 2018