Question about training on a self-made KITTI dataset #150
Hello, I am a beginner in this area. While trying to train on a KITTI dataset I made myself, I ran into the following problem:
/root/miniconda3/envs/a/lib/python3.7/site-packages/torch/optim/lr_scheduler.py:136: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
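As an aside on that warning: since PyTorch 1.1.0 the expected order inside the training loop is `optimizer.step()` first, then `lr_scheduler.step()`. A minimal sketch of that order (toy model and schedule chosen here purely for illustration):

```python
import torch

# Toy model/optimizer/scheduler purely to illustrate the call order.
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    loss = model(torch.randn(4, 2)).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()   # update weights first...
    scheduler.step()   # ...then advance the learning-rate schedule
```

With this order the schedule is not skipped: after three epochs the learning rate is 0.1 × 0.5³ = 0.0125. Note this warning is harmless here; it is unrelated to the ZeroDivisionError below.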
Traceback (most recent call last):
File "main.py", line 69, in
main()
File "main.py", line 65, in main
run_net(args, config, train_writer, val_writer)
File "/root/autodl-tmp/pt/tools/runner.py", line 166, in run_net
train_writer.add_scalar('Loss/Epoch/Sparse', losses.avg(0), epoch)
File "/root/autodl-tmp/pt/utils/AverageMeter.py", line 42, in avg
return self._sum[idx] / self._count[idx]
ZeroDivisionError: division by zero
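The traceback shows `avg()` dividing by `self._count[idx]` when that count is still 0, i.e. the meter was never updated during the epoch. A hypothetical sketch of a meter with a guarded average (field names mirror the traceback, but this is an illustration, not the repo's actual `AverageMeter.py`):

```python
# Hypothetical AverageMeter sketch; _sum/_count mirror the traceback.
class AverageMeter:
    def __init__(self, n=1):
        self._sum = [0.0] * n
        self._count = [0] * n

    def update(self, values):
        for i, v in enumerate(values):
            self._sum[i] += v
            self._count[i] += 1

    def avg(self, idx=0):
        # Guard: with zero updates, the unguarded division raises
        # ZeroDivisionError exactly as in the traceback above.
        if self._count[idx] == 0:
            return 0.0
        return self._sum[idx] / self._count[idx]

m = AverageMeter(2)
print(m.avg(0))        # no updates yet: 0.0 instead of an exception
m.update([2.0, 4.0])
m.update([4.0, 8.0])
print(m.avg(0))        # (2.0 + 4.0) / 2 = 3.0
```

Guarding `avg()` silences the crash, but the real question is why the meter received zero updates in the first place, which points at the data loader.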
I saw that in another issue about the same problem you mentioned changing bs. Does bs here refer to total_bs in the config file? If so, I changed it to 4 and still get the same error. Could you explain in more detail?
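One plausible explanation (an assumption, since the dataset size isn't given here) is that the custom split has fewer samples than one batch: a DataLoader with `drop_last=True` then yields zero batches, the epoch loop never runs, and `avg()` divides by zero. A pure-Python sketch of the batch count:

```python
# Sketch: batches yielded per epoch, assuming drop_last semantics like
# torch.utils.data.DataLoader. dataset_size=3 is a hypothetical example.
def num_batches(dataset_size, batch_size, drop_last=True):
    if drop_last:
        return dataset_size // batch_size
    return -(-dataset_size // batch_size)  # ceiling division

print(num_batches(3, 8))   # 0 batches -> meter never updated
print(num_batches(3, 4))   # still 0: lowering bs to 4 isn't enough here
print(num_batches(3, 2))   # 1 batch -> meter gets at least one update
```

If this is the cause, lowering total_bs below the training-split size (or adding more samples) should make the error go away; a total_bs of 4 would still fail whenever the split holds fewer than 4 samples.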