This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Gradient multiplier (contrib) operator #13632

Merged
merged 14 commits into apache:master from ifeherva:gradient_reversal_operator
Jan 24, 2019

Conversation

ifeherva
Contributor

@ifeherva ifeherva commented Dec 13, 2018

Description

Adds the gradient multiplier operator, which is mostly used in unsupervised adversarial domain adaptation.
In short: on the forward pass it acts as an identity transform; on the backward pass it multiplies the gradient by a scalar constant (lambda).
See the full description here: https://proceedings.mlr.press/v37/ganin15.pdf
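
To make the contract concrete, here is a minimal standalone sketch of the intended semantics (plain C++ with hypothetical function names; this is an illustration, not the actual MXNet kernel):

#include <vector>

// Forward pass: identity; the input passes through unchanged.
std::vector<float> gradmul_forward(const std::vector<float>& x) {
  return x;  // y = x
}

// Backward pass: the incoming gradient is scaled by lambda.
// lambda = -1 recovers the gradient-reversal layer of Ganin & Lempitsky (2015).
std::vector<float> gradmul_backward(const std::vector<float>& grad_y, float lambda) {
  std::vector<float> grad_x(grad_y.size());
  for (std::size_t i = 0; i < grad_y.size(); ++i)
    grad_x[i] = lambda * grad_y[i];
  return grad_x;
}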

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
  • Nightly tests are added for complicated/long-running ones (e.g. changing distributed kvstore)
  • Build tests will be added for build configuration changes (e.g. adding a new build option with NCCL)
  • Code is well-documented:
  • For user-facing API changes, API doc string has been updated.
  • For new C++ functions in header files, their functionalities and arguments are documented.
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Missing test for backwards pass
@ifeherva ifeherva changed the title [WIP] Gradient reversal (contrib) operator Gradient reversal (contrib) operator Dec 13, 2018
@roywei
Member

roywei commented Dec 14, 2018

@mxnet-label-bot add[Operator, pr-awaiting-review]

@marcoabreu marcoabreu added Operator pr-awaiting-review PR is waiting for code review labels Dec 14, 2018
@ThomasDelteil
Contributor

Shouldn't we have a more generic gradient multiplier operator? What do you think?

@ifeherva
Contributor Author

That is certainly possible, shall I rewrite it?

@ThomasDelteil
Contributor

@szha @zheng-da what do you think?

@Roshrini
Member

Roshrini commented Jan 3, 2019

@szha @zheng-da Can you take a look at this PR? Thanks!

Member

@szha szha left a comment


Thanks for contributing the op. The forward and backward logic can utilize existing kernels such as those in identity and broadcast_scalar_mul.
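
A sketch of what that reuse could look like in the operator registration (op names and helper identifiers here are assumptions modeled on MXNet's elemwise headers, not a quote of the merged code):

#include "../tensor/elemwise_unary_op.h"          // UnaryOp::IdentityCompute
#include "../tensor/elemwise_binary_scalar_op.h"  // BinaryScalarOp::Compute

// Forward: reuse the identity kernel (y = x).
NNVM_REGISTER_OP(_contrib_gradientmultiplier)
.set_attr<FCompute>("FCompute<cpu>", UnaryOp::IdentityCompute<cpu>);

// Backward: reuse the generic multiply-by-scalar kernel to scale the
// incoming gradient by the registered scalar.
NNVM_REGISTER_OP(_contrib_backward_gradientmultiplier)
.set_attr<FCompute>("FCompute<cpu>",
                    BinaryScalarOp::Compute<cpu, mshadow_op::mul>);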

@ifeherva
Contributor Author

@szha Thanks for the feedback, good points. However, I have a hard time finding those kernels; to me they seem to be deeply integrated into other operators. Could you please point me to the right functions?

@ifeherva
Contributor Author

@szha Dumped the header file and used forward and backward from identity / scalar_mul.

// Parse the "scalar" attribute (lambda) into the node's parsed field.
.set_attr_parser([](NodeAttrs* attrs) {
    attrs->parsed = std::stod(attrs->dict["scalar"]);
  })
// Storage-type inference for one input / one output.
.set_attr<FInferStorageType>("FInferStorageType",
                             ElemwiseStorageType<1, 1, false, true, true>)
Member


Do you also plan to support sparse inputs/outputs? If not, you don't have to register FInferStorageType and FComputeEx (by default it infers dense storage and uses FCompute).
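
In other words, if only dense tensors are needed, the registration could be as small as this (illustrative sketch reusing the identity kernel named above; dense storage and the FCompute path are then used by default):

// No FInferStorageType / FComputeEx registered: MXNet infers dense
// storage and dispatches to the FCompute kernel automatically.
NNVM_REGISTER_OP(_contrib_gradientmultiplier)
.set_num_inputs(1)
.set_num_outputs(1)
.set_attr<FCompute>("FCompute<cpu>", UnaryOp::IdentityCompute<cpu>);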

Contributor Author


Since the operator is very simple I thought it would be easy to support sparse data as well. What do I need to change to have full support?

@ifeherva
Contributor Author

Thinking of renaming the operator to gradient multiplier. Any thoughts?

                         DispatchMode* dispatch_mode,
                         std::vector<int>* in_attrs,
                         std::vector<int>* out_attrs) {
  CHECK_EQ(in_attrs->size(), 1);
Contributor


This method has no indentation. Is this expected?

Contributor Author


It does; not sure why GitHub shows it wrong.

@ifeherva ifeherva changed the title Gradient reversal (contrib) operator Gradient multiplier (contrib) operator Jan 14, 2019
Retrigger flaky test
  [](const NodeAttrs& attrs) {
    return std::vector<bool>{true};
  })
.add_argument("scalar", "float", "scalar input");
Member


Consider making this description more informative (e.g. X multiplier).
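
For example, the description could spell out the scalar's role, along these lines (illustrative wording only, not the merged text):

.add_argument("scalar", "float",
              "Scalar the gradient is multiplied by during the backward pass (lambda)");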

Contributor Author


Good point, updated.

Improved the description of the scalar multiplier
@lanking520
Member

@szha @ThomasDelteil merge?

@ThomasDelteil ThomasDelteil merged commit 183be8c into apache:master Jan 24, 2019
jessr92 pushed a commit to jessr92/incubator-mxnet that referenced this pull request Jan 27, 2019
* Added the gradient reversal contrib operator

Missing test for backwards pass

* Fixed linting errors

* Fixed forward test

* Added random forward / backward test for gradient reversal

* Update test_contrib_operator.py

* Fixed typo in gradient reversal op description

* Replace forward code with the identity implementation

* Fixed typos in function docs

* Changed default behavior to identity

* Replaced backward code with scalar_mul

* Fixed backward operator and unit test

* Renamed operator to gradient multiplier

* Update test_contrib_operator.py

Retrigger flaky test

* Update gradient_multiplier_op.cc

Improved the description of the scalar multiplier
@ifeherva ifeherva deleted the gradient_reversal_operator branch February 10, 2019 04:44
stephenrawls pushed a commit to stephenrawls/incubator-mxnet that referenced this pull request Feb 16, 2019
haohuanw pushed a commit to haohuanw/incubator-mxnet that referenced this pull request Jun 23, 2019