
Sourcery refactored master branch #2

Status: Open
sourcery-ai[bot] wants to merge 1 commit into master from sourcery/master
Conversation

@sourcery-ai sourcery-ai bot commented Nov 8, 2022

Branch master refactored by Sourcery.

If you're happy with these changes, merge this Pull Request using the Squash and merge strategy.

See our documentation here.

Run Sourcery locally

Reduce the feedback loop during development by using the Sourcery editor plugin:

Review changes via command line

To manually merge these changes, make sure you're on the master branch, then run:

git fetch origin sourcery/master
git merge --ff-only FETCH_HEAD
git reset HEAD^   # drop the merge commit but keep its changes in the working tree for review

Help us improve this pull request!

@sourcery-ai sourcery-ai bot requested a review from soumickmj November 8, 2022 08:29

@sourcery-ai sourcery-ai bot left a comment


Due to GitHub API limits, only the first 60 comments can be shown.

@@ -5,6 +5,7 @@
LICENSE file in the root directory of this source tree.
"""


Lines 19-20 refactored with the following changes:

Comment on lines -63 to +66

-        logging.info("Using CUDA: {} CUDA AVAIL: {} #DEVICES: {} VERSION: {}".format(
-            args.cuda, torch.cuda.is_available(), torch.cuda.device_count(),
-            torch.version.cuda))
+        logging.info(
+            f"Using CUDA: {args.cuda} CUDA AVAIL: {torch.cuda.is_available()} #DEVICES: {torch.cuda.device_count()} VERSION: {torch.version.cuda}"
+        )


Function BaseTrainer.initial_setup refactored with the following changes:
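The change above swaps a str.format call for an f-string; the two render the same text. A minimal standalone sketch of the equivalence, with placeholder values standing in for the torch.cuda calls from the diff:

```python
# Placeholder values stand in for args.cuda, torch.cuda.is_available(),
# torch.cuda.device_count(), and torch.version.cuda from the diff.
cuda, avail, ndev, version = True, True, 2, "11.7"

old = "Using CUDA: {} CUDA AVAIL: {} #DEVICES: {} VERSION: {}".format(
    cuda, avail, ndev, version)
new = f"Using CUDA: {cuda} CUDA AVAIL: {avail} #DEVICES: {ndev} VERSION: {version}"

# Both formatting styles produce identical output.
assert old == new
```

f-strings evaluate the expressions inline at the call site, which is both faster and easier to read when every placeholder is a plain `{}`.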

Comment on lines -111 to +112

-        indices = range(0, ndev)
+        indices = range(ndev)

Function BaseTrainer.data_setup refactored with the following changes:

Comment on lines -153 to +154

-        logging.info("Train Loader created, batches: {}".format(self.nbatches))
+        logging.info(f"Train Loader created, batches: {self.nbatches}")

Function BaseTrainer.loader_setup refactored with the following changes:

Comment on lines -170 to +180

-        self.runinfo = {}
-        self.runinfo["args"] = args
-        self.runinfo["at_epoch"] = 0
-        self.runinfo["seed"] = args.seed
-        self.runinfo["best_dev_loss"] = 1e9
-        self.runinfo["epoch"] = []
-        self.runinfo["train_losses"] = []
-        self.runinfo["train_fnames"] = []
-        self.runinfo["dev_losses"] = []
+        self.runinfo = {
+            "args": args,
+            "at_epoch": 0,
+            "seed": args.seed,
+            "best_dev_loss": 1000000000.0,
+            "epoch": [],
+            "train_losses": [],
+            "train_fnames": [],
+            "dev_losses": [],
+        }

Function BaseTrainer.runinfo_setup refactored with the following changes:
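The rewrite above merges nine key-by-key assignments into one dict literal; the resulting dict is identical (note that `1e9` and `1000000000.0` are the same float). A small sketch of the equivalence, with a stand-in namespace replacing the real `args` object:

```python
# Stand-in for the real argparse Namespace from the trainer.
from types import SimpleNamespace

args = SimpleNamespace(seed=42)

# Key-by-key construction, as in the removed code.
incremental = {}
incremental["args"] = args
incremental["at_epoch"] = 0
incremental["seed"] = args.seed
incremental["best_dev_loss"] = 1e9
incremental["epoch"] = []

# Single dict literal, as in the added code.
literal = {
    "args": args,
    "at_epoch": 0,
    "seed": args.seed,
    "best_dev_loss": 1000000000.0,  # same float value as 1e9
    "epoch": [],
}

assert incremental == literal
```

A literal makes the full shape of the record visible in one place and avoids repeating the variable name on every line.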

Comment on lines -91 to +99

-        group_idx = 0
-        nlayers = 0
-        for param in model.parameters():
-            group_size = 1
-            for g in param.size():
-                group_size *= g
-            nparams += group_size
-            group_idx += 1
-            if len(param.shape) >= 2:
-                nlayers += 1

Function LoggingMixin.count_parameters refactored with the following changes:
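Only the removed side of this hunk is shown in the excerpt, so Sourcery's exact replacement is not visible here. The loop totals the number of scalar weights and counts parameters with rank ≥ 2 as layers; a hypothetical idiomatic rewrite of that logic, assuming a torch-style model whose `.parameters()` yields tensors exposing `.size()` and `.shape`, might look like:

```python
# Hypothetical sketch only; not the PR's actual replacement code.
import math


def count_parameters(model):
    params = list(model.parameters())
    # Total scalar weights: product of each parameter's dimensions.
    nparams = sum(math.prod(p.size()) for p in params)
    # Count rank >= 2 parameters (weight matrices) as layers.
    nlayers = sum(len(p.shape) >= 2 for p in params)
    return nparams, nlayers
```

The generator expressions remove the manual accumulators (`group_size`, `group_idx`) while computing the same two totals.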

@@ -5,14 +5,15 @@
LICENSE file in the root directory of this source tree.
"""


Lines 15-15 refactored with the following changes:

@@ -5,6 +5,7 @@
LICENSE file in the root directory of this source tree.
"""


Lines 19-19 refactored with the following changes:

Comment on lines -104 to +111

-        print(f"Received USR1 signal in spawn_dist", flush=True)
+        print("Received USR1 signal in spawn_dist", flush=True)
         for i, p in enumerate(processses):
             if p.is_alive():
                 os.kill(p.pid, signal.SIGUSR1)

     def forward_term_signal(signum, frame):
-        print(f"Received SIGTERM signal in spawn_dist", flush=True)
+        print("Received SIGTERM signal in spawn_dist", flush=True)

Function run refactored with the following changes:
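The surrounding context in this hunk shows the signal-forwarding pattern the trainer uses: the parent registers a handler and relays the signal to each live child with `os.kill(p.pid, ...)`. A minimal POSIX-only sketch of that pattern, self-contained by sending the signal to the current process instead of a spawned child (all names here are illustrative, not from the repository):

```python
# POSIX-only sketch of signal forwarding; the "child" is this process
# itself so the example needs no multiprocessing setup.
import os
import signal

received = []


def on_usr1(signum, frame):
    # Record the delivered signal, as the real handler would react to it.
    received.append(signum)


signal.signal(signal.SIGUSR1, on_usr1)
# Stands in for os.kill(p.pid, signal.SIGUSR1) on a live child process.
os.kill(os.getpid(), signal.SIGUSR1)
```

Forwarding only to processes where `p.is_alive()` is true, as the diff context does, avoids raising `ProcessLookupError` for children that have already exited.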

Comment on lines -164 to +165

-                losses['instantaneous_' + name] = loss_dict[name]
-                losses['average_' + name] = avg_losses[name]
+                losses[f'instantaneous_{name}'] = loss_dict[name]
+                losses[f'average_{name}'] = avg_losses[name]

Function TrainingLoopMixin.run refactored with the following changes:

Labels: None yet
Projects: None yet
0 participants