[BugFix] Defaulting passing_devices to None
#495
Merged
Description
Motivation and Context
Fixes the open issue #433.
Currently, if the user does not specify the "passing_device" or "passing_devices" parameter, the collector class assumes by default that the output data will be stored in CPU RAM. This can cause confusion: when the user specifies that the model should run on the GPU via the "policy", "device" or "devices" parameter, they may not be aware that the output data will actually be stored in CPU RAM rather than GPU RAM. This PR resolves the possible confusion by making the default storage location of the output data match the location given by the "policy", "device" or "devices" parameter whenever the user does NOT specify the "passing_device" or "passing_devices" parameter. If the user DOES specify "passing_device" or "passing_devices", the output storage location follows that parameter as before.
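The defaulting rule described above can be sketched as follows. This is a minimal illustration, not the collector's actual code; `resolve_passing_device` is a hypothetical helper, and the real implementation resolves devices inside the collector's constructor.

```python
def resolve_passing_device(device, passing_device=None):
    """Illustrative sketch of the new defaulting rule.

    If the user does not set passing_device (i.e. it is None),
    the output data follows the policy/execution device.
    An explicitly set passing_device always takes precedence.
    """
    if passing_device is None:
        # No explicit storage device: store output where the policy runs.
        return device
    # Explicit storage device: honor the user's choice.
    return passing_device
```

For example, with `device="cuda:0"` and no `passing_device`, the output would now be stored on `"cuda:0"` instead of silently falling back to `"cpu"`, while `passing_device="cpu"` still forces CPU storage.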
Types of changes
What types of changes does your code introduce? Remove all that do not apply:
Bug fix (non-breaking change which fixes an issue)
Checklist
Go over all the following points, and put an x in all the boxes that apply. If you are unsure about any of these, don't hesitate to ask. We are here to help!