Explain the necessary and sufficient conditions in case of unconstrained optimisation

In unconstrained optimization, the necessary condition for a local minimum or maximum of a differentiable objective function concerns critical points, that is, points at which the first derivative (the gradient) of the objective function is zero.


This is expressed through the first-order condition: at a candidate point x*, the gradient of the objective function must vanish, i.e. ∇f(x*) = 0.
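
A short worked illustration may help; the function below is a hypothetical example chosen only to show the mechanics of the first-order condition, not one taken from the course material:

```latex
% First-order (necessary) condition for the illustrative objective
% f(x, y) = x^2 + 3y^2 - 4x + 2y
\nabla f(x, y) =
\begin{pmatrix} \dfrac{\partial f}{\partial x} \\[4pt] \dfrac{\partial f}{\partial y} \end{pmatrix}
=
\begin{pmatrix} 2x - 4 \\ 6y + 2 \end{pmatrix}
= \mathbf{0}
\quad \Longrightarrow \quad
(x^{*}, y^{*}) = \left(2, -\tfrac{1}{3}\right)
```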

Sufficient conditions involve examining the second derivatives, collected in the Hessian matrix, at the critical points. For a local minimum the Hessian should be positive definite, while for a local maximum it should be negative definite; these second-order conditions guarantee that the critical point is a strict local minimum or maximum, respectively. If the Hessian is indefinite, the critical point is a saddle point, and if it is only semidefinite the second-order test is inconclusive.
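
Continuing with the same hypothetical function, the following is a minimal sketch in Python (assuming the sympy library is available) of how the first- and second-order conditions can be checked symbolically:

```python
import sympy as sp

# Hypothetical objective function, chosen only for illustration
x, y = sp.symbols('x y', real=True)
f = x**2 + 3*y**2 - 4*x + 2*y

# First-order (necessary) condition: set the gradient to zero
grad = [sp.diff(f, v) for v in (x, y)]
critical_points = sp.solve(grad, (x, y), dict=True)
print(critical_points)                   # [{x: 2, y: -1/3}]

# Second-order (sufficient) condition: evaluate the Hessian at the critical point
H = sp.hessian(f, (x, y))
H_at_point = H.subs(critical_points[0])
print(H_at_point)                        # Matrix([[2, 0], [0, 6]])
print(H_at_point.is_positive_definite)   # True -> strict local minimum
```

Because the Hessian here is positive definite, the critical point (2, -1/3) is a strict local minimum of this illustrative function.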

In summary, the first-order necessary condition requires the gradient to be zero at a critical point, and the second-order sufficient condition uses the definiteness of the Hessian matrix at that point to determine its nature (minimum, maximum, or saddle point).