This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Implement remaining nn_activation ops in opperf #17475

Merged
merged 7 commits into apache:master on Feb 12, 2020

Conversation


@connorgoggins (Contributor) commented on Jan 29, 2020:

Description

This PR implements the remaining operators from the nn_activation category in opperf. To achieve this, I refactored the pre-existing individual run_performance_test calls (for the four operators that had already been implemented) into a single generalized call to run_op_benchmarks. I also implemented the Softmax, SoftmaxActivation, softmin, and Activation ops, which are likewise dispatched through run_op_benchmarks.
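
For context, a minimal sketch of what the refactored entry point might look like; the helper name get_all_nn_activation_operators, the import paths, and the exact run_op_benchmarks signature are assumptions inferred from this description and the changed files, not confirmed from the diff:

import mxnet as mx

from benchmark.opperf.utils.benchmark_utils import run_op_benchmarks
from benchmark.opperf.utils.op_registry_utils import get_all_nn_activation_operators

def run_activation_operators_benchmarks(ctx=mx.cpu(), dtype='float32', warmup=25, runs=100):
    # Fetch all NN activation operators registered in MXNet (assumed helper)
    mx_activation_ops = get_all_nn_activation_operators()
    # One generalized call benchmarks every fetched op on default input data,
    # replacing the four individual run_performance_test calls
    return run_op_benchmarks(mx_activation_ops, dtype, ctx, warmup, runs)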

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Changes

  • M benchmark/opperf/nd_operations/nn_activation_operators.py
  • M benchmark/opperf/rules/default_params.py
  • M benchmark/opperf/utils/op_registry_utils.py

Comments

Tested on c5.18xlarge and p2.16xlarge instances (Ubuntu 16.04) and on macOS with:

  1. Check out the branch and call run_activation_operators_benchmarks, which runs all activation ops on default data (see the usage sketch after this list)
  2. Check out the branch and run opperf.py (a full run of all ops)
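
For reference, step 1 might look like this from a Python shell; the import path follows the changed files listed above, and the keyword arguments are assumptions:

import mxnet as mx
from benchmark.opperf.nd_operations.nn_activation_operators import run_activation_operators_benchmarks

# Benchmark all activation ops on default data (CPU here; pass mx.gpu(0) for a GPU run)
results = run_activation_operators_benchmarks(ctx=mx.cpu(), dtype='float32')
print(results)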


@ChaiBapchya (Contributor) left a comment:


LGTM, pending the results for:

  • a full opperf run
  • the nn_activation category run, on both CPU & GPU

Thanks!

@connorgoggins (Contributor, Author) commented:

@mxnet-label-bot add [pr-awaiting-review]

@ChaiBapchya (Contributor) commented:

That's an incorrect way to rebase, @connorgoggins.

First, update your fork's master branch:

git remote add upstream https://github.com/apache/incubator-mxnet
git checkout master
git fetch upstream master
git merge upstream/master
git push origin master

Once your fork's master is synced with the upstream master, rebase your branch onto it:

git checkout <branchname>
git rebase master
git push origin <branchname>

Let me know if this works!

@connorgoggins force-pushed the implement_nn_activation_ops branch 3 times, most recently from a21c378 to f11ba22 on February 5, 2020 at 19:58.
@access2rohit (Contributor) left a comment:


LGTM! Retriggered the failing builds

@apeforest apeforest merged commit 0779304 into apache:master Feb 12, 2020
zheyuye pushed a commit to zheyuye/incubator-mxnet that referenced this pull request Feb 19, 2020
* Refactored individual run_performance_test calls into single generalized function, added Softmax, SoftmaxActivation, softmin, and Activation ops

* Refactored individual run_performance_test calls into single generalized function, added Softmax, SoftmaxActivation, softmin, and Activation ops

* Fixed variable names

* Added newline at end of file for consistency

* Addressed NN Basic PR comment

* Dropped unnecessary custom_args

* Removed unique_ops
anirudh2290 pushed a commit to anirudh2290/mxnet that referenced this pull request May 29, 2020
(same seven commits as listed above)
Labels: pr-awaiting-review (PR is waiting for code review)