That would mimic what ADModeDerivative did previously, but in a separate code path, taking advantage of the flexibility of the Tensor class.

Until this is done, we are stuck with the slow derivative, computed from delta expressions in the main code path. IIRC the speed difference wasn't that big; the same tensor operations are probably performed either way, so the main cost is the delta expressions taking up RAM and cache.
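For illustration, here is a minimal sketch of the trade-off in Python. The names (`Delta`, `mul_with_delta`, `mul_forward_jvp`) are made up for this example and are not this project's API; the point is only that the delta-expression path builds and keeps an extra graph alongside the values, while a separate derivative path would run the same tensor operations without materializing it.

```python
# Hypothetical sketch of the two approaches; not the project's actual classes.
import numpy as np

class Delta:
    """Perturbation expression attached to a value in the main code path.
    Each arithmetic op allocates another node, which is what costs RAM/cache."""
    def __init__(self, fn):
        self.fn = fn            # maps a seed perturbation to a tangent value

    def eval(self, seed):
        return self.fn(seed)

def mul_with_delta(x, dx, y, dy):
    # Main code path: value plus a delta expression for the derivative.
    val = x * y
    delta = Delta(lambda seed: dx.eval(seed) * y + x * dy.eval(seed))
    return val, delta

def mul_forward_jvp(x, tx, y, ty):
    # Separate derivative code path: same tensor operations, but the tangent
    # is computed eagerly, so no delta graph is kept around.
    return x * y, tx * y + x * ty

# Tiny usage example: d/dx (x * y) at x=2, y=3, seeding x with tangent 1.
x, y = np.array(2.0), np.array(3.0)
val, delta = mul_with_delta(x, Delta(lambda s: s), y, Delta(lambda s: 0.0))
print(val, delta.eval(1.0))                         # 6.0 3.0

val2, tan2 = mul_forward_jvp(x, np.array(1.0), y, np.array(0.0))
print(val2, tan2)                                   # 6.0 3.0, no delta graph kept
```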