This is an implementation of deep neural networks using nothing but Python and NumPy. I've taken up this project to complement the Deep Learning Specialization offered on Coursera and taught by Andrew Ng.
The following components are currently supported (a rough usage sketch follows the list):
- Layers:
  - Dense
  - Conv2D
  - DepthwiseConv2D
  - SeparableConv2D
  - Conv2DTranspose
  - MaxPooling2D
  - AveragePooling2D
  - BatchNorm
  - Dropout
  - Flatten
  - Add
  - Concatenate
- Activations:
  - Linear
  - Sigmoid
  - Tanh
  - ReLU
  - LeakyReLU
  - ELU
  - Softmax
- Losses:
  - BinaryCrossEntropy
  - CategoricalCrossEntropy
  - MeanSquaredError
- Optimizers:
  - Vanilla SGD
  - SGD with momentum
  - RMSProp
  - Vanilla Adam
  - Adam with AMSGrad
- Learning Rate Decay:
  - TimeDecay
  - ExponentialDecay
  - CosineDecay
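To give a feel for how these pieces might fit together, here is a rough usage sketch. The module layout, the `Model` class, and the `compile`/`fit` signatures are assumptions for illustration only; consult the source for the real API. Only the layer, activation, loss, and optimizer names come from the list above.

```python
# Hypothetical usage sketch -- module paths, Model, and method signatures
# are assumptions, not the library's actual API.
import numpy as np

from nn.model import Model                      # assumed module layout
from nn.layers import Dense, Flatten
from nn.activations import ReLU, Softmax
from nn.losses import CategoricalCrossEntropy
from nn.optimizers import Adam

x_train = np.random.rand(64, 28, 28)            # dummy data standing in for real inputs
y_train = np.eye(10)[np.random.randint(0, 10, size=64)]

model = Model([
    Flatten(),
    Dense(128), ReLU(),
    Dense(10), Softmax(),
])
model.compile(loss=CategoricalCrossEntropy(), optimizer=Adam())
model.fit(x_train, y_train, epochs=5, batch_size=32)
```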
It is also easy to add your own layers, activations, losses, optimizers, and decay schedules.
Note: there is no automatic differentiation, so when extending the library you need to define the necessary derivatives for backpropagation yourself, as shown in the sketch below.
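For example, a custom activation might look like the following. The `forward`/`backward` method names are assumptions (the actual base-class interface may differ), but the code is self-contained and illustrates the point above: the derivative must be coded by hand, and a quick numerical check helps catch mistakes.

```python
# Sketch of a hand-written activation with its manual derivative.
# The forward/backward protocol is assumed for illustration.
import numpy as np

class Swish:
    """Swish activation: f(x) = x * sigmoid(x)."""

    def forward(self, x):
        self.x = x                                # cache input for backprop
        self.sig = 1.0 / (1.0 + np.exp(-x))
        return x * self.sig

    def backward(self, grad_out):
        # Hand-derived: f'(x) = sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x))
        deriv = self.sig + self.x * self.sig * (1.0 - self.sig)
        return grad_out * deriv

# Sanity-check the hand-coded derivative against a central finite difference.
x = np.random.randn(5)
act = Swish()
act.forward(x)
analytic = act.backward(np.ones_like(x))

eps = 1e-6
f = lambda z: z * (1.0 / (1.0 + np.exp(-z)))
numeric = (f(x + eps) - f(x - eps)) / (2 * eps)
assert np.allclose(analytic, numeric, atol=1e-6)
```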
Hope you like it! Happy learning!