Deep Learning Chapter 4 Numerical Computation Questions

After reading and digesting Chapter 4 (link), I aggregated the following questions to test my comprehension. I'll post the answers when I review them.

  1. Define the underflow and overflow problems.
  2. How, for example, can you modify softmax to avoid the underflow and overflow problems?
  3. Define condition number.
  4. Define poor conditioning.
  5. Define the function you are trying to optimize in a gradient based optimization.
  6. Define the following:
    1. critical points
    2. stationary points
    3. local maximum
    4. local minimum
    5. saddle points
  7. Define partial derivatives and gradients.
  8. Define directional derivatives.
  9. Define the Jacobian matrix.
  10. Define the Hessian matrix.
  11. Describe the issues a poorly conditioned Hessian matrix causes for optimization.
  12. Define first-order and second-order optimization algorithms.
  13. Define the Lipschitz constant and its significance.
  14. Define convex optimization algorithms.
  15. Define constrained optimization and three approaches to solving it.
  16. Define the Karush-Kuhn-Tucker (KKT) conditions.
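As a concrete illustration of question 2, here is a minimal sketch (my own, not from the book's code) of the standard max-subtraction trick for a numerically stable softmax:

```python
import numpy as np

def stable_softmax(x):
    # Subtracting max(x) shifts the largest exponent to 0, so np.exp
    # cannot overflow; any underflow in the numerator is harmless
    # because the denominator includes exp(0) = 1 from the max entry.
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

# A naive softmax, exp(x) / exp(x).sum(), would overflow to inf here
# and return nan; the stabilized version yields valid probabilities.
x = np.array([1000.0, 1000.5, 1001.0])
print(stable_softmax(x))
```

The subtraction leaves the result mathematically unchanged, since softmax is invariant to adding a constant to every input.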
