This repo contains implementations of popular machine learning algorithms, along with some of the techniques that commonly accompany them:
- Logistic Regression with regularization (see the sketch below)
- Multilayer neural network with:
  - supported layer activations: ReLU, sigmoid, softmax
  - optimizers: gradient descent, GD with momentum, RMSProp, Adam (the Adam update is sketched below)
  - mini-batch training (see the training-loop sketch below)
  - adjustable number of hidden layers
- Dropout in the neural network (sketched below)
- Batch normalization in the neural network (sketched below)
- Tanh activation
- ConvNet implementation
- Some sequence model implementations
A few IPython notebooks illustrate the results that these implementations can achieve and compare them with similar implementations in the Keras library. More notebooks experiment with models implemented using Keras and TensorFlow.
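As a rough illustration of the list above, here is a minimal sketch of logistic regression with L2 regularization, trained with mini-batch gradient descent. It is NumPy-only; the function and parameter names are illustrative and not taken from this repo's code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.1, lam=0.01, epochs=100, batch_size=32):
    """Illustrative trainer: X is (n_samples, n_features), y has values in {0, 1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        perm = np.random.permutation(n)                # reshuffle each epoch
        for start in range(0, n, batch_size):          # mini-batch loop
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            p = sigmoid(Xb @ w + b)                    # predicted probabilities
            grad_w = Xb.T @ (p - yb) / len(idx) + lam * w  # L2 penalty on weights
            grad_b = np.mean(p - yb)                   # bias is left unregularized
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```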
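The Adam optimizer listed above combines momentum-style first-moment estimates with RMSProp-style second-moment estimates, plus bias correction. A single parameter update, following the standard Adam formulation (the default hyperparameters here are the conventional ones, not necessarily those used in this repo):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Illustrative single Adam step; t is the 1-indexed time step."""
    m = beta1 * m + (1 - beta1) * grad           # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment (RMSProp-like) estimate
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                 # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```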
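Dropout randomly zeroes a fraction of activations during training to reduce overfitting. A minimal sketch of the common inverted-dropout variant, which rescales at train time so the forward pass needs no change at inference (`keep_prob` is an assumed parameter name):

```python
import numpy as np

def dropout_forward(a, keep_prob=0.8, training=True):
    """Illustrative inverted dropout applied to a layer's activations `a`."""
    if not training:
        return a                                    # identity at inference time
    mask = (np.random.rand(*a.shape) < keep_prob) / keep_prob
    return a * mask                                 # zero and rescale activations
```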
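Batch normalization standardizes each feature over the current mini-batch and then applies a learned scale and shift. A minimal training-mode forward pass (omitting the running statistics an inference-ready implementation would also track) could look like:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Illustrative batch-norm forward pass; x is (batch_size, n_features)."""
    mu = x.mean(axis=0)                       # per-feature batch mean
    var = x.var(axis=0)                       # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)     # normalize each feature
    return gamma * x_hat + beta               # learned scale and shift
```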