Distributed training raises an error #230
Thanks for the bug report.
@Arthur151 I did try romp.train. Even with that I get the error. Here is what I get:
What is the reason?
Oh, the command you use is different from the one in my repo.
Your command drops the
Here is another way to achieve this; this is the format of the command if you don't want to use nohup:
The key is to use the absolute path to the train.py file.
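The exact launch command is elided above, but the gist of the advice, resolving train.py to an absolute path so the command does not depend on the current working directory, can be sketched like this (the paths below are placeholders, not the actual command from the repo):

```python
from pathlib import Path

# Assumption: a hypothetical clone location; substitute your own checkout.
repo_root = Path("/path/to/ROMP")
train_script = repo_root / "romp" / "train.py"

# resolve() turns the path into an absolute one, so a launcher invoked
# from any directory will still find the script.
print(train_script.resolve())
```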
When I use
I get the following error: When I run with this command:
I get this error:
It seems that
@Arthur151 Ok, I replaced
This is pretty weird. You don't have this basic Python package?
@Arthur151 Ok, I have
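The name of the missing package is elided in the exchange above, but a quick way to check whether a given package is importable in the active environment is `importlib.util.find_spec` (the name used below is just a stdlib stand-in, not the package from the thread):

```python
import importlib.util

def has_package(name: str) -> bool:
    # find_spec returns None when the module cannot be located
    # on the current import path.
    return importlib.util.find_spec(name) is not None

# "pickle" is a stdlib module used purely to illustrate the check.
print(has_package("pickle"))  # True in any standard CPython install
```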
Hi,
I am trying to run ROMP in distributed mode. I followed this script. Since there is no folder called `core` in the repository, I replaced it with `romp`. However, when I run the code it raises an error that there is no file called `train.py`. How can I avoid this error? Thanks!
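Since the folder name containing train.py evidently differs between what the script expects (`core`) and what the checkout actually ships (`romp`), one defensive workaround is to search the checkout for the file instead of hard-coding its parent directory. A minimal sketch (the clone path is a placeholder):

```python
from pathlib import Path

def find_train_script(repo: Path) -> list[Path]:
    """Locate every train.py under the checkout, whatever folder holds it."""
    # rglob walks the tree recursively; sorting makes the result deterministic.
    return sorted(repo.rglob("train.py"))

# Assumption: a hypothetical clone location; point this at your actual clone.
for match in find_train_script(Path("/path/to/ROMP")):
    print(match.resolve())
```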