bhattbhavesh91 / why-is-relu-non-linear (Jupyter Notebook, updated May 28, 2021)
A small walk-through to show why ReLU is non-linear.
Topics: deep-learning, neural-networks, relu, relu-layer, activation-functions, neural-networks-and-deep-learning, activation-function, relu-derivative
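The point of the first repo, that ReLU is non-linear, can be checked directly: a linear map f must satisfy f(a + b) = f(a) + f(b), and ReLU fails this at any sign change. A minimal sketch (the test values are my own, not from the repo):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): the ReLU activation.
    return np.maximum(0.0, x)

a, b = -1.0, 1.0

# Additivity fails: relu(a + b) = relu(0) = 0,
# but relu(a) + relu(b) = 0 + 1 = 1.
print(relu(a + b))        # 0.0
print(relu(a) + relu(b))  # 1.0
```

Since no single matrix multiplication can reproduce this behaviour, stacking ReLU layers genuinely increases expressive power, unlike stacking purely linear layers.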
aex-nirvael / ReLU (Python, updated Jan 11, 2020)
Backward pass of the ReLU activation function for a neural network.
Topics: relu, relu-layer, relu-derivative, relu-activation
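The backward pass this repo implements is standard: ReLU passes the upstream gradient through wherever the forward input was positive and zeroes it elsewhere. A minimal NumPy sketch (function names are mine, not the repo's):

```python
import numpy as np

def relu_forward(x):
    # Return the activation plus a cache of the input;
    # the backward pass needs x to build its mask.
    return np.maximum(0.0, x), x

def relu_backward(grad_output, cache):
    x = cache
    # Gradient flows through where x > 0 and is blocked where x <= 0.
    return grad_output * (x > 0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
out, cache = relu_forward(x)
grad_in = relu_backward(np.ones_like(x), cache)
print(grad_in)  # [0. 0. 0. 1. 1.]
```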
juliusberner / regularity_relu_network (Jupyter Notebook, updated Oct 15, 2019)
Towards a regularity theory for ReLU networks: construction of approximating networks, the ReLU derivative at zero, and theory.
Topics: deep-neural-networks, research, approximation, sobolev-space-norm, relu-derivative, regularity-theory
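The "ReLU derivative at zero" question this repo studies exists because ReLU has a kink at the origin: the derivative is 0 for x < 0 and 1 for x > 0, while at x = 0 any value in [0, 1] is a valid subgradient, so implementations must pick a convention (0 is a common choice). A sketch that makes the convention explicit (this is my illustration, not the repo's code):

```python
import numpy as np

def relu_grad(x, value_at_zero=0.0):
    # Exact derivative away from 0; at x == 0 the subgradient is any
    # value in [0, 1], so the convention is exposed as a parameter.
    g = (x > 0).astype(float)
    g[x == 0] = value_at_zero
    return g

x = np.array([-1.0, 0.0, 1.0])
print(relu_grad(x))                     # [0. 0. 1.]
print(relu_grad(x, value_at_zero=0.5))  # [0.  0.5 1. ]
```

For gradient descent the choice at the single point x = 0 rarely matters in practice, but it does matter for the kind of regularity analysis the repo description points at.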
mansi-k / BackPropagation (Jupyter Notebook, updated Aug 17, 2021)
Implements the back-propagation algorithm on a neural network from scratch, using the tanh and ReLU derivatives, with experiments for learning purposes.
Topics: neural-network, pytorch, derivative, backpropagation, scratch-implementation, relu-derivative, wine-quality-prediction, tanh-derivative
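The two derivatives a from-scratch backpropagation like this needs are the usual closed forms: tanh'(x) = 1 - tanh(x)^2 and relu'(x) = 1 for x > 0, else 0. A sketch checked against numerical differentiation (names and test points are mine):

```python
import numpy as np

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

def relu_grad(x):
    # d/dx relu(x) = 1 for x > 0, else 0 (kink at 0 ignored here)
    return (x > 0).astype(float)

# Finite-difference check, keeping points away from the ReLU kink.
x = np.array([-1.5, -0.3, 0.4, 2.0])
eps = 1e-6
num_tanh = (np.tanh(x + eps) - np.tanh(x - eps)) / (2 * eps)
num_relu = (np.maximum(0.0, x + eps) - np.maximum(0.0, x - eps)) / (2 * eps)
print(np.allclose(tanh_grad(x), num_tanh, atol=1e-6))  # True
print(np.allclose(relu_grad(x), num_relu, atol=1e-6))  # True
```

Agreement with the finite-difference estimate is the standard sanity check for hand-written backward passes like the ones in this repo.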