This is a from-scratch implementation of a logistic regression model in Python. The model uses gradient descent optimization to learn the optimal weights and bias for binary classification tasks, and includes L2 regularization to reduce overfitting, with the regularization strength controlled by the lambda hyperparameter.
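The core of this approach is a single gradient-descent update on the cross-entropy loss plus an L2 penalty. The sketch below illustrates one such step; the function and parameter names (gradient_step, lr, lam) are illustrative and may not match the ones used in this repository. Note that the bias is conventionally left out of the regularization term.

```python
import numpy as np

def sigmoid(z):
    # Logistic function mapping scores to probabilities in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(w, b, X, y, lr=0.1, lam=0.01):
    """One gradient-descent update with L2 regularization.

    lam is the regularization strength (the lambda hyperparameter);
    the bias b is not regularized.
    """
    n = X.shape[0]
    p = sigmoid(X @ w + b)                    # predicted probabilities
    grad_w = (X.T @ (p - y)) / n + lam * w    # L2 penalty applies to weights only
    grad_b = np.mean(p - y)
    return w - lr * grad_w, b - lr * grad_b
```

Repeating this update for a fixed number of iterations (or until the loss stops improving) yields the learned weights and bias.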
The model is trained with the fit() function, which takes the training data X and the corresponding labels y. The predict() function then uses the learned weights and bias to classify new data.
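To show how the fit()/predict() interface described above might fit together, here is a minimal self-contained sketch. The class name LogisticRegressionScratch and the hyperparameter names (lr, n_iters, lam) are hypothetical and may differ from the actual implementation in this repository.

```python
import numpy as np

class LogisticRegressionScratch:
    """Minimal illustrative version of the model this README describes."""

    def __init__(self, lr=0.1, n_iters=1000, lam=0.01):
        self.lr, self.n_iters, self.lam = lr, n_iters, lam
        self.w, self.b = None, 0.0

    def fit(self, X, y):
        # Learn weights and bias by gradient descent on the
        # L2-regularized cross-entropy loss.
        n, d = X.shape
        self.w = np.zeros(d)
        for _ in range(self.n_iters):
            p = 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))
            self.w -= self.lr * ((X.T @ (p - y)) / n + self.lam * self.w)
            self.b -= self.lr * np.mean(p - y)
        return self

    def predict(self, X, threshold=0.5):
        # Threshold the predicted probabilities to get 0/1 labels.
        p = 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))
        return (p >= threshold).astype(int)
```

A typical usage pattern, on a tiny synthetic dataset:

```python
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
model = LogisticRegressionScratch().fit(X, y)
preds = model.predict(X)
```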
This implementation is intended as a learning exercise for those interested in understanding the inner workings of logistic regression and gradient descent optimization. It also provides a basic starting point for those interested in building their own custom models.
Contributions to this project are welcome. If you find a bug or have a suggestion for improvement, please feel free to open an issue or submit a pull request.
This project is licensed under the MIT License. See the LICENSE file for details.