Slicing tensors with negative steps #604

It'd be nice to be able to slice tensors using a negative step, such as tensor[::-1] or tensor[:, ::-1, :], like what is possible with NumPy or Theano. It would make implementing things like bidirectional RNNs (without using the built-in RNN modules) simpler.
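To make the request concrete, here is a sketch of the NumPy behavior being asked for, alongside what the equivalent attempt on a torch.Tensor does; the exact error text is an assumption and may vary across PyTorch versions:

```python
import numpy as np
import torch

a = np.arange(6).reshape(2, 3)

a[::-1]       # reverses the rows: a zero-copy view with a negative stride
a[:, ::-1]    # reverses the columns within each row, also a view

# The same slice on a torch.Tensor is rejected, because PyTorch
# tensors do not support negative strides:
t = torch.arange(6).reshape(2, 3)
# t[::-1]  ->  ValueError: step must be greater than zero
```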
Comments
Allowing negative strides is already being tracked in #229
I agree, but our C backends will require a few, possibly nontrivial, changes. I'm closing this since it's a duplicate.
Oops, I didn't look carefully enough, apparently. Apologies.
@apaszke I think this issue might deserve to be reopened.
@el3ment while what you say is absolutely correct, I don't think we are planning to support negative striding anytime in the near/far future (if ever). It requires rethinking our internals and is a huge undertaking, for what we see as a small perceived benefit. cc: @ezyang to double-check that our new C10 Tensor design is not incorporating negative strides.
We were originally thinking that when we looked into changing TH's size/stride semantics, it might be "easy" to also make things work with negative strides, but we ended up using all our engineering budget just on making zero-size dims and scalars work correctly. |
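A practical note, not part of the original thread: since negative-stride views are off the table, later PyTorch versions provide torch.flip, which returns a reversed copy rather than a view. A minimal sketch of the workaround:

```python
import torch

t = torch.arange(6).reshape(2, 3)

# torch.flip copies the data instead of creating a negative-stride view,
# so unlike NumPy's a[::-1] it is not free.
rows_reversed = torch.flip(t, dims=[0])   # like a[::-1] in NumPy
cols_reversed = torch.flip(t, dims=[1])   # like a[:, ::-1] in NumPy
```

The copy is exactly the trade-off the thread describes: supporting a true reversed view would require the negative-stride rethink of the internals that the maintainers decided against.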