- Omdena
- Surat, Gujarat
- in/manav-desai-11809621a
- @manav170303
- https://www.kaggle.com/manav1703
Highlights
- Pro
DL_04_what_is_Perceptron Public
Day 4 of my Deep Learning journey: what is a Perceptron?
Jupyter Notebook Updated Aug 24, 2024 -
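The notebook itself isn't reproduced here, but the perceptron learning rule can be sketched in a few lines of NumPy. This is a minimal illustration on a toy AND-gate dataset, not the repository's actual code:

```python
import numpy as np

# Toy perceptron trained on the AND gate (hypothetical data, not the notebook's).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(20):  # a few passes over the data
    for xi, yi in zip(X, y):
        pred = int(w @ xi + b > 0)
        # Perceptron update: nudge weights toward each misclassified point
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

preds = (X @ w + b > 0).astype(int)
print(preds)  # [0 0 0 1] — matches y, since AND is linearly separable
```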
Jupyter Notebook Apache License 2.0 Updated Aug 18, 2024 -
DL_31_Batch_Normalization Public
How Batch Normalization transforms the internal workings of neural networks by normalizing inputs within each mini-batch. By maintaining stable activations throughout the training process, Batch No…
Jupyter Notebook Apache License 2.0 Updated Aug 17, 2024 -
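The normalization the description refers to can be sketched as a forward pass in NumPy. This is a simplified illustration (training mode only, without the running statistics used at inference); the function name is my own, not the notebook's:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))  # a mini-batch of activations
out = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
# Per-feature mean ≈ 0 and std ≈ 1: the stable activations the blurb describes
print(out.mean(axis=0).round(6), out.std(axis=0).round(3))
```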
1. Weight initialization techniques in neural networks 2. Xavier initialization in neural networks 3. He initialization in neural networks
Jupyter Notebook MIT License Updated Aug 15, 2024 -
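The two named schemes differ only in the variance of the sampling distribution. A minimal NumPy sketch with assumed layer sizes (not the notebook's code):

```python
import numpy as np

rng = np.random.default_rng(42)
fan_in, fan_out = 256, 128  # hypothetical layer dimensions

# Xavier/Glorot: variance 2 / (fan_in + fan_out), suited to tanh/sigmoid layers
xavier = rng.normal(0.0, np.sqrt(2.0 / (fan_in + fan_out)), size=(fan_in, fan_out))

# He: variance 2 / fan_in, suited to ReLU layers
he = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Empirical stds land close to the target values the formulas prescribe
print(round(xavier.std(), 3), round(he.std(), 3))
```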
DL_29_weights-initialization Public
I'll guide you through weight initialization techniques in neural networks and highlight what NOT to do. From common mistakes to misconceptions, I'll help you navigate the dos and don'ts in optimiz…
Jupyter Notebook Apache License 2.0 Updated Aug 15, 2024 -
DL_26_Regularization Public
Regularization is a set of techniques that can prevent overfitting in neural networks and thus improve the accuracy of a Deep Learning model when facing completely new data from the problem domain.
Jupyter Notebook Apache License 2.0 Updated Aug 9, 2024 -
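As a concrete illustration of one such technique, an L2 penalty shrinks weights toward zero, which limits overfitting. This is a hypothetical ridge-regression sketch in NumPy; the notebook may cover different regularizers:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
w_true = np.array([2.0, -1.0, 0.5, 0.0, 3.0])
y = X @ w_true + rng.normal(scale=0.1, size=100)

def ridge(X, y, lam):
    """L2-regularized least squares: solve (X^T X + lam*I) w = X^T y."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ y)

w_ols = ridge(X, y, 0.0)   # unregularized fit
w_reg = ridge(X, y, 10.0)  # L2-penalized fit
# The penalty shrinks the weight vector's norm
print(np.linalg.norm(w_reg) < np.linalg.norm(w_ols))  # True
```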
DL_25_Dropout Public
Explore the power of Dropout Layers with code examples for both Regression and Classification tasks.
Jupyter Notebook Apache License 2.0 Updated Aug 9, 2024 -
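The mechanism behind a dropout layer can be sketched in NumPy as "inverted dropout": units are zeroed at random during training and the survivors are rescaled so the expected activation is unchanged. A minimal illustration, not the notebook's code:

```python
import numpy as np

def dropout(x, p_drop, rng, train=True):
    """Inverted dropout: zero units with prob p_drop, rescale survivors."""
    if not train:
        return x  # at inference the layer is the identity
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
x = np.ones(10000)
out = dropout(x, p_drop=0.3, rng=rng)
# Rescaling keeps the mean activation close to the original 1.0
print(round(out.mean(), 3))
```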
DL_23_Data_scaling Public
Data scaling is a recommended pre-processing step when working with deep learning neural networks. It can be achieved by normalizing or standardizing real-valued input and output variables.
Jupyter Notebook Apache License 2.0 Updated Aug 9, 2024 -
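The two approaches named in the blurb can be sketched side by side on a tiny made-up feature vector:

```python
import numpy as np

x = np.array([50.0, 20.0, 80.0, 35.0, 65.0])  # hypothetical raw feature

# Normalization (min-max): rescale to the [0, 1] range
x_norm = (x - x.min()) / (x.max() - x.min())

# Standardization (z-score): zero mean, unit variance
x_std = (x - x.mean()) / x.std()

print(x_norm.min(), x_norm.max())                      # 0.0 1.0
print(round(x_std.mean(), 6), round(x_std.std(), 6))   # 0.0 1.0
```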
DL_20_vanishing_gradient Public
Vanishing and Exploding Gradient Problems in Artificial Neural Networks (ANNs) with practical code examples. Understand the challenges and solutions for training deep networks effectively. Improve …
Jupyter Notebook Apache License 2.0 Updated Aug 9, 2024 -
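The vanishing half of the problem can be shown numerically: backpropagation multiplies one sigmoid derivative (at most 0.25) per layer, so the gradient signal shrinks geometrically with depth. A minimal sketch, not the notebook's code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)), maximized at z = 0
z = 0.0
deriv = sigmoid(z) * (1.0 - sigmoid(z))
# Even in this best case, ten stacked sigmoid layers attenuate the
# gradient by 0.25**10 — the vanishing-gradient effect
grad_10_layers = deriv ** 10
print(deriv, grad_10_layers)  # 0.25, ≈ 9.54e-07
```

ReLU activations, careful weight initialization, and Batch Normalization are the usual remedies (and companion topics in this series).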
DL_22_early_stopping Public
Early stopping is a method in Deep Learning that allows you to specify an arbitrarily large number of training epochs and stop training once the model performance stops improving on the validation …
Jupyter Notebook Apache License 2.0 Updated Aug 9, 2024 -
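The stopping logic described above is usually implemented with a "patience" counter. A self-contained sketch on a made-up validation-loss curve (the notebook likely uses a framework callback instead):

```python
# Early stopping with patience: halt once validation loss fails to
# improve for `patience` consecutive epochs (synthetic losses below).
val_losses = [0.9, 0.7, 0.6, 0.55, 0.56, 0.57, 0.58, 0.59, 0.60]
patience = 3

best = float("inf")
wait = 0
stopped_at = len(val_losses)
for epoch, loss in enumerate(val_losses):
    if loss < best:
        best, wait = loss, 0  # new best: reset the patience counter
    else:
        wait += 1
        if wait >= patience:
            stopped_at = epoch
            break

print(stopped_at, best)  # stops at epoch 6 with best loss 0.55
```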
DL_19_Gradient_Descent_in_NN Public
Batch, Stochastic, and Mini-Batch methods
Jupyter Notebook Apache License 2.0 Updated Aug 6, 2024 -
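The three variants differ only in how many samples feed each update: the full dataset (batch), one sample (stochastic), or a small subset (mini-batch). A hypothetical sketch on simple linear regression, not the notebook's code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=200)  # true w=3, b=2

def gd(batch_size, epochs=200, lr=0.05):
    """Gradient descent on MSE; batch_size=len(X) is full-batch GD,
    batch_size=1 is SGD, anything in between is mini-batch."""
    w, b = 0.0, 0.0
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)  # shuffle each epoch
        for start in range(0, n, batch_size):
            sel = idx[start:start + batch_size]
            xb, yb = X[sel, 0], y[sel]
            err = w * xb + b - yb
            w -= lr * 2 * np.mean(err * xb)
            b -= lr * 2 * np.mean(err)
    return w, b

for bs in (len(X), 1, 32):  # batch, stochastic, mini-batch
    w, b = gd(bs)
    print(bs, round(w, 2), round(b, 2))  # all recover w ≈ 3, b ≈ 2
```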
value_maven Public
A laptop price predictor using ensemble techniques
Jupyter Notebook Apache License 2.0 Updated Aug 1, 2024 -
https://www.kaggle.com/code/manav1703/gre-admission-prediction
Apache License 2.0 Updated Jul 31, 2024 -
Customer Churn Prediction using ANN
Jupyter Notebook Apache License 2.0 Updated Jul 31, 2024 -
Handwritten Digit Classification using ANN
Jupyter Notebook Updated Jul 31, 2024 -
perceptron_hinge_loss_fxn_gradient_descent
Jupyter Notebook Updated Jul 25, 2024 -
perceptron_trick_from_scratch
Jupyter Notebook Updated Jul 25, 2024 -
batch gradient descent from scratch for multiple linear regression
Jupyter Notebook MIT License Updated Jul 3, 2024 -
working-with-SQL-in-Pandas Public
working with SQL in Pandas
Jupyter Notebook Apache License 2.0 Updated Jun 7, 2024 -
pandas-on-IPL-ds Public
IPL analysis using pandas
Jupyter Notebook Apache License 2.0 Updated Jun 5, 2024 -
K-means_clustering Public
K-means clustering is a widely used unsupervised learning algorithm that partitions a set of objects into a predetermined number of clusters. The goal is to minimize the sum of the squared distance…
Jupyter Notebook MIT License Updated May 7, 2024
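The algorithm described above (Lloyd's algorithm) alternates between assigning each point to its nearest centroid and recomputing centroids as cluster means. A minimal NumPy sketch on two synthetic blobs, not the repository's code:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and
    mean updates, reducing within-cluster squared distance."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # random init
    for _ in range(iters):
        # Distance of every point to every centroid, then nearest one
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(1)
# Two well-separated synthetic blobs around (0, 0) and (5, 5)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(sorted(centroids[:, 0].round(1)))  # one centroid near x=0, one near x=5
```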