After reading and digesting Chapter 4 (link), I put together the following questions to test my comprehension. I’ll post the answers once I’ve reviewed them.

- Define the underflow and overflow problems.
- How can you modify softmax to avoid the underflow and overflow problems?
- Define condition number.
- Define poor conditioning.
- Define the function you are trying to optimize in a gradient based optimization.
- Define the following:
  - critical points
  - stationary points
  - local maximum
  - local minimum
  - saddle points

- Define partial derivatives and gradients.
- Define directional derivatives.
- Define the Jacobian matrix.
- Define the Hessian matrix.
- Describe the issues a poorly conditioned Hessian matrix causes for optimization.
- Define first-order and second-order optimization algorithms.
- Define the Lipschitz constant and its significance.
- Define convex optimization algorithms.
- Define constrained optimization and three approaches to solving it.
- Define the Karush-Kuhn-Tucker (KKT) conditions.
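As a warm-up for the softmax question, here is a minimal NumPy sketch (my own illustration, not from the book) contrasting a naive softmax with the standard max-subtraction trick:

```python
import numpy as np

def softmax_naive(x):
    # np.exp overflows for large inputs (exp(1000) -> inf),
    # so the result degenerates to inf/inf = nan.
    e = np.exp(x)
    return e / e.sum()

def softmax_stable(x):
    # Subtracting max(x) leaves the result unchanged mathematically
    # (the shift cancels in numerator and denominator), but the
    # largest exponent is now 0: no overflow, and the denominator
    # contains at least one term equal to 1, so no division by an
    # underflowed zero.
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

x = np.array([1000.0, 1000.0])
with np.errstate(over="ignore", invalid="ignore"):
    print(softmax_naive(x))   # nan values
print(softmax_stable(x))      # [0.5 0.5]
```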
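For the condition-number and poor-conditioning questions, a small NumPy example (my own, with an arbitrary diagonal matrix) shows how a large condition number amplifies tiny input perturbations:

```python
import numpy as np

# Condition number of a matrix: ratio of its largest to smallest
# singular value. A large value means the output of A x = b is
# very sensitive to small changes in the input (poor conditioning).
A = np.array([[1.0, 0.0],
              [0.0, 1e-8]])
kappa = np.linalg.cond(A)  # on the order of 1e8

# Poor conditioning in action: perturbing b by 1e-6 moves the
# solution of A x = b by roughly 1e2.
b = np.array([1.0, 1.0])
x1 = np.linalg.solve(A, b)
x2 = np.linalg.solve(A, b + np.array([0.0, 1e-6]))
print(kappa, np.linalg.norm(x2 - x1))
```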
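And for the gradient-based (first-order) optimization questions, a minimal gradient-descent sketch (my own, on an arbitrary quadratic objective):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # First-order method: uses only the gradient, repeatedly
    # stepping in the direction of steepest descent.
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
# The minimizer x* = 3 is the only critical (stationary) point.
xmin = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(xmin)  # close to 3.0
```

Second-order methods (e.g. Newton's method) additionally use the Hessian to account for curvature, which is where the conditioning questions above come in.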
