
[Feature] Add Dropout to MLP module #988

Merged: 2 commits into pytorch:main on Mar 24, 2023
Conversation

@BY571 (Contributor) commented Mar 24, 2023

Description

Adds the option to use dropout in the MLP module and changes the order in which normalization layers are applied.

Motivation and Context

The MLP module did not yet offer a dropout option. I also changed when normalization layers are applied: normalization is now done directly after the linear layer and before the activation function, as in the batch norm paper.

These changes make it possible to implement the DroQ algorithm, an improvement over REDQ that applies dropout + LayerNorm in the critic networks. This allows a high update-to-data ratio without the need for large ensemble sizes. DroQ is therefore as sample-efficient as REDQ and as computationally efficient as SAC, which makes it very interesting for real-world applications, e.g. robotics.
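As a rough illustration of the layer ordering this PR describes (this is a hypothetical sketch, not the actual torchrl implementation), each MLP block would place normalization directly after the linear layer and before the activation; the dropout position shown here is illustrative, since the PR text does not pin it down:

```python
# Hypothetical sketch of per-layer ordering in an MLP block.
# Names are illustrative, not torchrl's API. Per the PR description,
# normalization follows the linear layer and precedes the activation.
def build_layer_sequence(in_features, out_features,
                         norm=True, dropout_p=0.0, activation="ReLU"):
    seq = [f"Linear({in_features}, {out_features})"]
    if norm:
        # norm directly after the layer, before the activation
        seq.append(f"LayerNorm({out_features})")
    if dropout_p > 0.0:
        # dropout placement here is an assumption for illustration
        seq.append(f"Dropout(p={dropout_p})")
    seq.append(activation)
    return seq
```

For example, `build_layer_sequence(64, 256, norm=True, dropout_p=0.01)` yields a DroQ-style critic block: linear, LayerNorm, dropout, then the activation.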


  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of examples)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Mar 24, 2023
@vmoens (Contributor) left a comment

LGTM

# Conflicts:
#	torchrl/modules/models/models.py
@vmoens vmoens merged commit 0d1823c into pytorch:main Mar 24, 2023
3 participants