
[WIP] Fallback mechanism for mx.np operators #16923

Merged

Conversation

reminisce
Contributor

Description

Fallback mechanism for mx.np operators.

@haojin2

@reminisce reminisce requested a review from szha as a code owner November 27, 2019 08:58
# Try to fall back to the official NumPy op of the same name.
onp_op = _get_np_op(name)
# Convert MXNet ndarrays to NumPy arrays; pass other arguments through.
new_inputs = [arg.asnumpy() if isinstance(arg, ndarray) else arg for arg in inputs]
out = onp_op(*new_inputs, **kwargs)
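
For context, a minimal sketch of how this fallback surfaces to a user, assuming `np.linalg.cond` (one of the ops unit-tested in this PR) has no native kernel and is dispatched through this path:

```python
import mxnet as mx
from mxnet import np, npx

npx.set_np()  # enable NumPy-compatible semantics

a = np.array([[1.0, 0.0], [0.0, 2.0]])
# With no native MXNet kernel, the call below is routed through the
# fallback: ndarray inputs are converted with asnumpy() and the official
# NumPy linalg.cond computes the result.
c = np.linalg.cond(a)
print(c)  # 2.0 for this diagonal matrix (ratio of extreme singular values)
```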
Member

It will break the computational graph, so the gradient cannot be computed.

Contributor Author

We are aware of this. A more sophisticated fallback mechanism, leveraging CustomOp, is illustrated in #16698. To reach 100% NumPy op coverage within a month, though, this is the simplest and fastest pathway. In the future, we will gradually replace these fallback ops with native implementations in the backend.
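
(#16698 itself is not reproduced here.) Purely as an illustration of the CustomOp approach mentioned above, a gradient-preserving fallback for a single op might look roughly like the following; `NumpySinFallback` and the registered name `numpy_sin_fallback` are hypothetical:

```python
import numpy as onp
import mxnet as mx

class NumpySinFallback(mx.operator.CustomOp):
    """Hypothetical single-op fallback: forward and backward both
    computed by official NumPy, but wired into MXNet's autograd."""
    def forward(self, is_train, req, in_data, out_data, aux):
        x = in_data[0].asnumpy()
        self.assign(out_data[0], req[0], mx.nd.array(onp.sin(x)))

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        x = in_data[0].asnumpy()
        dy = out_grad[0].asnumpy()
        # d/dx sin(x) = cos(x)
        self.assign(in_grad[0], req[0], mx.nd.array(dy * onp.cos(x)))

@mx.operator.register("numpy_sin_fallback")
class NumpySinFallbackProp(mx.operator.CustomOpProp):
    def __init__(self):
        super(NumpySinFallbackProp, self).__init__(need_top_grad=True)

    def list_arguments(self):
        return ['data']

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        # one input, one same-shaped output, no auxiliary states
        return in_shape, [in_shape[0]], []

    def create_operator(self, ctx, shapes, dtypes):
        return NumpySinFallback()

# usage: y = mx.nd.Custom(x, op_type='numpy_sin_fallback')
```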

Member

It may be better to use mx.autograd.Function to wrap these numpy operators.
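
A minimal sketch of what that suggestion might look like, assuming classic mx.nd arrays (`NumpySin` is a hypothetical example, not code from this PR):

```python
import numpy as onp
import mxnet as mx

class NumpySin(mx.autograd.Function):
    """Wrap onp.sin so the autograd graph stays intact."""
    def forward(self, x):
        self.save_for_backward(x)
        # Leave MXNet, compute with official NumPy, re-enter MXNet.
        return mx.nd.array(onp.sin(x.asnumpy()))

    def backward(self, dy):
        x, = self.saved_tensors
        # d/dx sin(x) = cos(x), also computed via NumPy here.
        return dy * mx.nd.array(onp.cos(x.asnumpy()))

x = mx.nd.array([0.0, 1.0, 2.0])
x.attach_grad()
with mx.autograd.record():
    y = NumpySin()(x)
y.backward()
print(x.grad)  # approximately cos(x)
```

Because backward returns the NumPy-computed gradient, autograd can propagate through the wrapped op instead of the graph being cut at asnumpy().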

Contributor

@wkcn
mx.autograd.Function currently does not support DeepNumpy; some extra infrastructure would be required. Also, I am not sure whether mx.autograd.Function can be integrated into HybridBlock; I cannot find corresponding cases covered in the unit tests.

@reminisce reminisce force-pushed the fallback_to_numpy_for_interoperability branch from fe362cc to 2cd2094 Compare January 16, 2020 18:47
Contributor

@haojin2 haojin2 left a comment

LGTM

@reminisce reminisce merged commit 1c24e74 into apache:master Jan 18, 2020
szhengac pushed a commit to szhengac/mxnet that referenced this pull request Jan 21, 2020
* Add fallback mechanism

Fix lint

Fix

* Add unit tests for linalg.cond and heaviside

* Add spacing

* Fix lint

* Skip python2 for dispatching array function

Co-authored-by: Hao Jin <[email protected]>
@Yiyan66 Yiyan66 mentioned this pull request Feb 17, 2020