
My Numpy implementation of Popular Algorithms and Architecture of Deep Neural Networks.


PrakarshBhardwaj/Deep-Neural-Networks-in-Numpy



Initialization Methods -

  1. Random initialization
  2. He initialization
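The repository's exact function signatures aren't reproduced here, but a minimal NumPy sketch of the two schemes might look like the following (the names `initialize_parameters` and `layer_dims` are illustrative, not taken from the repo). He initialization scales each weight matrix by sqrt(2 / fan_in), which keeps activation variance roughly constant through ReLU layers:

```python
import numpy as np

def initialize_parameters(layer_dims, method="he", seed=0):
    """Initialize weights and biases for an L-layer network.

    layer_dims: layer sizes, e.g. [n_x, n_h1, ..., n_y]
    method: "random" scales by a small constant; "he" scales by sqrt(2 / fan_in).
    """
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        fan_in = layer_dims[l - 1]
        scale = np.sqrt(2.0 / fan_in) if method == "he" else 0.01
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], fan_in)) * scale
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))  # biases start at zero
    return params
```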

Forward Propagation -

  1. Forward Propagation without Batch Normalization
  2. Forward Propagation with Batch Normalization
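As a sketch of the batch-norm variant (function names assumed, not the repo's): batch normalization sits between the linear step and the activation, normalizing each unit's pre-activations across the mini-batch before a learnable scale (`gamma`) and shift (`beta`) are applied:

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    # Normalize each feature (row) across the mini-batch (columns),
    # then scale and shift with the learnable gamma and beta.
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_hat = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_hat + beta

def forward_layer(A_prev, W, b, gamma=None, beta=None):
    Z = W @ A_prev + b
    if gamma is not None:            # batch-norm variant
        Z = batchnorm_forward(Z, gamma, beta)
    return np.maximum(0, Z)          # ReLU activation
```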

Activation Functions -

  1. Sigmoid
  2. ReLU
  3. Leaky-ReLU
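The three activations listed above have short NumPy definitions; a plain sketch (signatures assumed, and `alpha` for Leaky-ReLU is an illustrative default):

```python
import numpy as np

def sigmoid(z):
    # Squashes inputs into (0, 1); used at the output for binary labels.
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Zero for negative inputs, identity for positive inputs.
    return np.maximum(0, z)

def leaky_relu(z, alpha=0.01):
    # Like ReLU, but with a small slope alpha for negative inputs.
    return np.where(z > 0, z, alpha * z)
```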

Regularization -

  1. L-2 regularization
  2. Dropout regularization
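A rough sketch of both techniques (helper names are illustrative): L2 regularization adds a weight-decay penalty to the cost, and inverted dropout zeroes random units during training while rescaling the survivors so expected activations are unchanged:

```python
import numpy as np

def l2_cost_term(params, lambd, m):
    # L2 penalty added to the cost: (lambda / 2m) * sum of squared weights.
    squared = sum(np.sum(W ** 2) for k, W in params.items() if k.startswith("W"))
    return (lambd / (2 * m)) * squared

def dropout_forward(A, keep_prob, rng):
    # Inverted dropout: zero out units with probability 1 - keep_prob,
    # then divide by keep_prob so the expected activation is unchanged.
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return (A * mask) / keep_prob, mask
```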

Backpropagation -

  1. Backpropagation without Batch Normalization
  2. Backpropagation with Batch Normalization
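The core of the plain (non-batch-norm) backward pass can be sketched as follows (function names assumed): each layer turns the gradient of its pre-activations `dZ` into gradients for its parameters and for the previous layer's activations, with the ReLU derivative gating the gradient:

```python
import numpy as np

def linear_backward(dZ, A_prev, W):
    # Gradients of one linear layer Z = W @ A_prev + b,
    # averaged over the m examples in the mini-batch.
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = dZ.sum(axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ          # gradient passed to the previous layer
    return dA_prev, dW, db

def relu_backward(dA, Z):
    # ReLU derivative is 1 where Z > 0 and 0 elsewhere.
    return dA * (Z > 0)
```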

Cost -

  1. Negative log-likelihood cost function
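For binary labels this cost is the cross-entropy averaged over the batch; a minimal sketch (the name `nll_cost` and the clipping epsilon are assumptions, with clipping added to avoid `log(0)`):

```python
import numpy as np

def nll_cost(AL, Y, eps=1e-12):
    # Negative log-likelihood (binary cross-entropy) averaged over m examples.
    # AL: predicted probabilities, shape (1, m); Y: labels in {0, 1}.
    m = Y.shape[1]
    AL = np.clip(AL, eps, 1 - eps)   # guard against log(0)
    return -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
```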

Debugging -

  1. Gradient Checking
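Gradient checking compares the analytic backprop gradients against centered finite differences of the cost; a generic sketch (signature assumed, not the repo's), where a relative difference around 1e-7 or smaller suggests the analytic gradients are correct:

```python
import numpy as np

def grad_check(f, theta, analytic_grad, eps=1e-7):
    # Compare an analytic gradient against centered finite differences
    # of the scalar cost f at the parameter vector theta.
    num_grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus.flat[i] += eps
        minus.flat[i] -= eps
        num_grad.flat[i] = (f(plus) - f(minus)) / (2 * eps)
    # Normalized difference between the two gradient estimates.
    return np.linalg.norm(num_grad - analytic_grad) / (
        np.linalg.norm(num_grad) + np.linalg.norm(analytic_grad)
    )
```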
