
Configuration files and hyperparameter tuning #53

Open
indweller opened this issue Oct 25, 2023 · 1 comment

@indweller

I see that you have used Python classes for config files. Is there any reason you chose Python classes over YAML files?

Also, given that you used Python classes, how did you perform the grid search over the parameters? I found that the nested class structure makes it messy to iterate over the attributes I want to search over. If you have code that does the grid search, could you please share it?
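For context, here is a hypothetical sketch (not from this repo) of one way to enumerate a grid over nested config-class attributes: address each parameter by a dotted path and walk the path with `getattr`/`setattr`. The `PolicyCfg`/`TrainCfg` classes and `set_by_path` helper below are illustrative names, not the repo's code.

```python
import itertools


# Illustrative stand-ins for nested config classes.
class PolicyCfg:
    def __init__(self):
        self.learning_rate = 1e-3
        self.gamma = 0.99


class TrainCfg:
    def __init__(self):
        self.seed = 1
        self.policy = PolicyCfg()


def set_by_path(cfg, dotted, value):
    """Set a nested attribute addressed by a dotted path, e.g. 'policy.gamma'."""
    *parents, leaf = dotted.split(".")
    for name in parents:
        cfg = getattr(cfg, name)
    setattr(cfg, leaf, value)


# Search space keyed by dotted attribute paths.
grid = {
    "policy.learning_rate": [1e-4, 3e-4],
    "policy.gamma": [0.95, 0.99],
}

# itertools.product yields every combination (here 2 x 2 = 4 configs).
for values in itertools.product(*grid.values()):
    cfg = TrainCfg()
    for dotted, value in zip(grid.keys(), values):
        set_by_path(cfg, dotted, value)
    # ... launch one training run with cfg here ...
```

This keeps the nested class structure intact while letting the sweep logic stay flat.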

@nikitardn
Collaborator

You can do a grid search by running the training in a for loop and changing the configs on each iteration. The only caveat is that you need to run each training in a separate process to ensure proper closing/reset of the simulator.
You will need something like the following:
```python
from torch.multiprocessing import Process, set_start_method

try:
    set_start_method("spawn")
except RuntimeError as e:
    print(e)

....

def train(args: argparse.Namespace, env_cfg: BaseConfig, train_cfg: BaseConfig):
    ....
    ppo_runner.learn()

def train_batch(args: argparse.Namespace):
    for i in range(5):
        # hyperparams to run over
        seed = 23 * i + 17
        train_cfg.seed = seed
        env_cfg.seed = seed
        # launch process
        p = Process(target=train, args=(args, env_cfg, train_cfg))
        p.start()
        p.join()
        p.kill()
        print(f">>> Run {i} done!")
```

Alternatively, you can use an external tool with YAML files and convert between them and the Python config classes using the `update_class_from_dict` and `class_to_dict` functions.
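To illustrate that round trip, here is a minimal, self-contained sketch of what such helpers could look like. These are simplified re-implementations written for this example, not the repo's actual `class_to_dict`/`update_class_from_dict` code, and the config classes are made up; the dict could come from `yaml.safe_load` in practice.

```python
# Illustrative nested config classes (not the repo's).
class PolicyCfg:
    def __init__(self):
        self.learning_rate = 1e-3
        self.num_layers = 2


class TrainCfg:
    def __init__(self):
        self.seed = 1
        self.policy = PolicyCfg()


def class_to_dict(obj):
    """Recursively turn a config object's public attributes into a plain dict."""
    result = {}
    for key in dir(obj):
        if key.startswith("_") or callable(getattr(obj, key)):
            continue
        val = getattr(obj, key)
        if isinstance(val, (int, float, str, bool, list, tuple, dict)) or val is None:
            result[key] = val
        else:
            result[key] = class_to_dict(val)
    return result


def update_class_from_dict(obj, d):
    """Recursively write values from a dict (e.g. loaded from YAML) onto a config object."""
    for key, val in d.items():
        if isinstance(val, dict):
            update_class_from_dict(getattr(obj, key), val)
        else:
            setattr(obj, key, val)


cfg = TrainCfg()
update_class_from_dict(cfg, {"seed": 42, "policy": {"learning_rate": 3e-4}})
```

An external sweep tool can then generate the YAML/dict overrides, while the training code keeps consuming the same Python config classes.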
