Deep Learning, Chapter 6: Deep Feedforward Networks

A brief outline is below; read the full chapter from the link for details.

  1. Cost Function
    1. Maximum Likelihood (cross-entropy)
    2. Mean Squared Error
    3. Mean Absolute Error
  2. Output Units
    1. Linear
    2. Sigmoid + Maximum Likelihood (binary)
    3. Softmax + Maximum Likelihood (multiclass)
    4. Gaussian Mixture
  3. Hidden Units
    1. Rectified Linear Unit
      1. Absolute Value Rectification
      2. Leaky ReLU
      3. Parametric ReLU (PReLU)
      4. Maxout units
    2. Sigmoid Units
      1. Logistic Sigmoid
      2. Tanh
    3. Others
      1. None (no activation, i.e. linear hidden units)
      2. Softmax
      3. RBF
      4. Softplus
      5. Hard tanh
  4. Architecture Design
    1. Depth vs. Width (deeper networks can represent some functions with exponentially fewer units than shallow ones)
    2. Connections between layers
  5. Back-Propagation
    1. Might need to implement one myself to truly understand this
  6. History
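To make the cost-function and output-unit pairings above concrete, here is a small NumPy sketch of a softmax output layer with the maximum-likelihood (cross-entropy) cost. The function names and shapes are my own, not from the chapter:

```python
import numpy as np

def softmax(z):
    # Subtract the row max for numerical stability; softmax is shift-invariant.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    # Negative log-likelihood of the true class, averaged over the batch.
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels]).mean()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.5]])
labels = np.array([0, 1])
loss = cross_entropy(softmax(logits), labels)
```

A nice consequence of this pairing (noted in the chapter) is that the gradient of the loss with respect to the logits is simply `probs - one_hot(labels)`, so the log in the cost undoes the exp in the softmax.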
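The ReLU family in the hidden-units section differs mainly in how negative inputs are handled. A sketch, with illustrative (not canonical) default slopes:

```python
import numpy as np

def relu(x):
    # g(z) = max(0, z)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small fixed slope for x < 0 keeps a nonzero gradient everywhere.
    # PReLU is the same form but learns alpha as a parameter.
    return np.where(x > 0, x, alpha * x)

def absolute_value_rectification(x):
    # g(z) = |z|, i.e. the alpha = -1 member of the same family.
    return np.abs(x)

def maxout(x, k=2):
    # Maxout takes the max over k affine pieces; here the last axis of x
    # is assumed to already hold the k pre-activations per unit.
    return x.reshape(*x.shape[:-1], -1, k).max(axis=-1)
```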