[RLlib] Make checkpointing test have multiple nodes, make node for dqn test larger (ray-project#37127)

Signed-off-by: Avnish <[email protected]>
avnishn committed Jul 6, 2023
1 parent 8f17b73 commit 509184a
Showing 4 changed files with 22 additions and 5 deletions.
2 changes: 1 addition & 1 deletion release/release_tests.yaml
@@ -4190,7 +4190,7 @@
   team: rllib
   cluster:
     cluster_env: app_config.yaml
-    cluster_compute: 1gpu_16cpus.yaml
+    cluster_compute: 1gpu_32cpus.yaml
 
   run:
     timeout: 18000
17 changes: 17 additions & 0 deletions release/rllib_tests/1gpu_32cpus.yaml
@@ -0,0 +1,17 @@
+cloud_id: {{env["ANYSCALE_CLOUD_ID"]}}
+region: us-west-2
+
+max_workers: 0
+
+head_node_type:
+  name: head_node
+  instance_type: g5.8xlarge
+
+worker_node_types: []
+
+aws:
+  BlockDeviceMappings:
+    - DeviceName: /dev/sda1
+      Ebs:
+        DeleteOnTermination: true
+        VolumeSize: 500
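The new compute config above runs the test entirely on a single g5.8xlarge head node with no worker nodes. As a quick sanity check that the filename matches the hardware, here is a small sketch; the g5.8xlarge specs (32 vCPUs, 1 GPU) are taken from AWS documentation and are an assumption, not something stated in this diff:

```python
# g5.8xlarge specs per AWS documentation (assumption, not part of this diff):
# 32 vCPUs and 1 NVIDIA A10G GPU.
HEAD_SPECS = {"g5.8xlarge": {"cpus": 32, "gpus": 1}}


def config_name(instance_type):
    """Derive a '<gpus>gpu_<cpus>cpus.yaml' style filename from the specs."""
    spec = HEAD_SPECS[instance_type]
    return f"{spec['gpus']}gpu_{spec['cpus']}cpus.yaml"


print(config_name("g5.8xlarge"))  # 1gpu_32cpus.yaml
```

This matches the `1gpu_32cpus.yaml` filename referenced from `release/release_tests.yaml`.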
@@ -9,9 +9,9 @@ head_node_type:
 
 worker_node_types:
 - name: worker_node
-  instance_type: g3.8xlarge
-  min_workers: 1
-  max_workers: 1
+  instance_type: g3s.xlarge
+  min_workers: 2
+  max_workers: 2
   use_spot: false
 
 aws:
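The hunk above swaps one large g3.8xlarge worker for two small g3s.xlarge workers, so the checkpointing test actually spans multiple nodes, as the commit title says. A rough comparison of what the two fleets provide (the per-instance vCPU/GPU counts are taken from AWS documentation and are an assumption, not part of this diff):

```python
# vCPU/GPU counts per AWS documentation (assumption, not part of this diff).
SPECS = {
    "g3.8xlarge": {"cpus": 32, "gpus": 2},  # old: one big worker
    "g3s.xlarge": {"cpus": 4, "gpus": 1},   # new: two small workers
}


def fleet_totals(instance_type, num_workers):
    """Aggregate resources for a homogeneous worker fleet."""
    spec = SPECS[instance_type]
    return {
        "nodes": num_workers,
        "cpus": num_workers * spec["cpus"],
        "gpus": num_workers * spec["gpus"],
    }


old = fleet_totals("g3.8xlarge", 1)  # {'nodes': 1, 'cpus': 32, 'gpus': 2}
new = fleet_totals("g3s.xlarge", 2)  # {'nodes': 2, 'cpus': 8, 'gpus': 2}
```

Under these assumed specs the GPU count stays at 2, but the GPUs now sit on separate nodes, which is what a multi-node checkpointing test needs to exercise.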
2 changes: 1 addition & 1 deletion rllib/core/learner/torch/torch_learner.py
@@ -69,7 +69,7 @@ def __init__(
         # Will be set during build.
         self._device = None
 
-        # Whether to compile the RL Module of this learner. This implies that the
+        # Whether to compile the RL Module of this learner. This implies that the.
         # forward_train method of the RL Module will be compiled. Further more,
         # other forward methods of the RL Module will be compiled on demand.
         # This is assumed to not happen, since other forwrad methods are not expected
