This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Enable support for dense weight and sparse grad Adagrad updates #11355

Merged: 3 commits merged into apache:master from leezu:DnsRspDnsAdagrad on Jun 25, 2018

Conversation

@leezu (Contributor) commented on Jun 21, 2018

Description

As the title says. The kernel was already in place, but the operator failed via LogUnimplementedOp when called with a dense weight and a sparse gradient.
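For context, a minimal sketch of the storage combination this PR enables. The shapes, hyperparameter values, and in-place `out=weight` call are illustrative assumptions, not code from the PR:

```python
import mxnet as mx

shape = (6, 3)
weight = mx.nd.ones(shape)                       # dense ('default' storage) weight
history = mx.nd.zeros(shape)                     # dense accumulated squared gradients
grad = mx.nd.ones(shape).tostype('row_sparse')   # row_sparse gradient

# Before this PR, this storage combination failed via LogUnimplementedOp;
# with it, the existing kernel updates the rows present in `grad`.
mx.nd.sparse.adagrad_update(weight, grad, history,
                            lr=0.1, epsilon=1e-7, out=weight)
```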

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to the relevant JIRA issue created (except PRs with tiny changes)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
  • Nightly tests are added for complicated/long-running ones (e.g. changing distributed kvstore)
  • Build tests will be added for build configuration changes (e.g. adding a new build option with NCCL)
  • Code is well-documented:
  • For user-facing API changes, API doc string has been updated.
  • For new C++ functions in header files, their functionalities and arguments are documented.
  • For new examples, README.md is added to explain what the example does, the source of the dataset, expected performance on the test set, and a reference to the original paper if applicable
  • Check the API doc at http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Changes

  • Enable dense weight, sparse grad updates with Adagrad (see the usage sketch below)
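At the optimizer level, the change means a dense parameter can now receive row_sparse gradients through the stock AdaGrad updater. A hedged sketch using the standard MXNet updater API; the shapes and learning rate are illustrative:

```python
import mxnet as mx

opt = mx.optimizer.AdaGrad(learning_rate=0.1)
updater = mx.optimizer.get_updater(opt)

weight = mx.nd.ones((6, 3))                      # dense parameter
grad = mx.nd.ones((6, 3)).tostype('row_sparse')  # row_sparse gradient

updater(0, grad, weight)   # updates the dense weight in place
print(weight.stype)        # 'default', the weight stays dense
```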

Comments

@leezu leezu requested a review from szha as a code owner June 21, 2018 17:42
@eric-haibin-lin (Member) commented:

Thanks! I was just about to do this. Could you add a test in test_optimizer.py?

@leezu (Contributor, Author) commented on Jun 22, 2018

Added test.
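For reference, one way such a test can check correctness is to run the same row_sparse gradient through the newly enabled dense-weight path and the previously supported row_sparse-weight path and compare the results. The sketch below is illustrative only and assumes both paths apply the same update to the touched rows; it is not the PR's actual test:

```python
import mxnet as mx
import numpy as np

shape = (10, 4)
lr, eps = 0.1, 1e-7

grad_np = np.zeros(shape, dtype='float32')
grad_np[[1, 5]] = 0.5                            # gradient touches rows 1 and 5 only
grad = mx.nd.array(grad_np).tostype('row_sparse')

w_dns = mx.nd.random.uniform(shape=shape)
h_dns = mx.nd.zeros(shape)
w_rsp = w_dns.tostype('row_sparse')              # reference: already-supported path
h_rsp = h_dns.tostype('row_sparse')

# dense weight/state + row_sparse grad (enabled by this PR)
mx.nd.sparse.adagrad_update(w_dns, grad, h_dns, lr=lr, epsilon=eps, out=w_dns)
# row_sparse weight/state + row_sparse grad (pre-existing path)
mx.nd.sparse.adagrad_update(w_rsp, grad, h_rsp, lr=lr, epsilon=eps, out=w_rsp)

# rows without gradient are unchanged in both paths, so all rows should agree
assert np.allclose(w_dns.asnumpy(), w_rsp.asnumpy(), atol=1e-6)
```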

@leezu (Contributor, Author) commented on Jun 22, 2018

Rebased to rerun tests

@szha (Member) commented on Jun 25, 2018

@eric-haibin-lin

@eric-haibin-lin eric-haibin-lin merged commit 9b27262 into apache:master Jun 25, 2018
@leezu leezu deleted the DnsRspDnsAdagrad branch June 25, 2018 16:46
XinYao1994 pushed a commit to XinYao1994/incubator-mxnet that referenced this pull request Aug 29, 2018
Enable support for dense weight and sparse grad Adagrad updates (apache#11355)

* Support dense weight and sparse grad AdagradUpdate

* Simplify AdagradStorageType

* Add test