There are four types of convex programming problems −
Type 1 − $min \:f\left ( x \right )$, where $x \in S$, S is a non-empty convex set in $\mathbb{R}^n$ and $f\left ( x \right )$ is a convex function.
Type 2 − $min \: f\left ( x \right ), x \in \mathbb{R}^n$ subject to
$g_i\left ( x \right ) \geq 0, 1 \leq i \leq m_1$ and $g_i\left ( x \right )$ is a concave function.
$g_i\left ( x \right ) \leq 0, m_1+1 \leq i \leq m_2$ and $g_i\left ( x \right )$ is a convex function.
$g_i\left ( x \right ) = 0, m_2+1 \leq i \leq m$ and $g_i\left ( x \right )$ is a linear function.
where $f\left ( x \right )$ is a convex function.
Type 3 − $max \:f\left ( x \right )$, where $x \in S$, S is a non-empty convex set in $\mathbb{R}^n$ and $f\left ( x \right )$ is a concave function.
Type 4 − $max \:f\left ( x \right )$, where $x \in \mathbb{R}^n$ subject to
$g_i\left ( x \right ) \geq 0, 1 \leq i \leq m_1$ and $g_i\left ( x \right )$ is a concave function.
$g_i\left ( x \right ) \leq 0, m_1+1 \leq i \leq m_2$ and $g_i\left ( x \right )$ is a convex function.
$g_i\left ( x \right ) = 0, m_2+1 \leq i \leq m$ and $g_i\left ( x \right )$ is a linear function.
where $f\left ( x \right )$ is a concave function.
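As a numerical illustration of Type 2, the sketch below solves a small convex program with SciPy's SLSQP solver; the objective, the constraint and the starting point are assumptions chosen only for this example (SciPy's "ineq" convention is $g\left ( x \right ) \geq 0$).

```python
# A minimal sketch of a Type 2 convex program: minimize a convex objective
# subject to g(x) >= 0 with g linear (hence concave), so the feasible set
# {x : x1 + x2 <= 1} is convex.  Problem data are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

c = np.array([2.0, 3.0])

def f(x):
    return np.sum((x - c) ** 2)      # convex objective f(x) = ||x - c||^2

def g(x):
    return 1.0 - x[0] - x[1]         # linear constraint, required to satisfy g(x) >= 0

res = minimize(f, x0=np.zeros(2), method="SLSQP",
               constraints=[{"type": "ineq", "fun": g}])
print(res.x)                         # approximate minimizer over the convex set
```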
Let S be a non-empty set in $\mathbb{R}^n$ and let $\hat{x} \in \:Closure\left ( S \right )$, then the cone of feasible directions of S at $\hat{x}$, denoted by D, is defined as $D=\left \{ d:d\neq 0,\hat{x}+\lambda d \in S, \forall \lambda \in \left ( 0, \delta \right )\:\text{for some}\: \delta > 0 \right \}$
Each non-zero vector $d \in D$ is called a feasible direction.
For a given function $f:\mathbb{R}^n \rightarrow \mathbb{R}$, the cone of improving directions at $\hat{x}$ is denoted by F and is given by
$$F=\left \{ d:f\left ( \hat{x}+\lambda d \right )< f\left ( \hat{x} \right ),\forall \lambda \in \left ( 0,\delta \right )\:\text{for some}\: \delta >0 \right \}$$
Each direction $d \in F$ is called an improving direction or a descent direction of f at $\hat{x}$.
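The descent condition can be checked numerically: for differentiable f, $\bigtriangledown f\left ( \hat{x} \right )^T d < 0$ guarantees $f\left ( \hat{x}+\lambda d \right ) < f\left ( \hat{x} \right )$ for small $\lambda > 0$. A minimal sketch, where the function, the point and the direction are illustrative assumptions:

```python
# Numerical check of an improving (descent) direction: d is improving at
# x_hat if f(x_hat + lam*d) < f(x_hat) for all small lam > 0, which for a
# differentiable f is implied by grad_f(x_hat)^T d < 0.
import numpy as np

def f(x):
    return x[0] ** 2 + 2 * x[1] ** 2          # illustrative smooth convex function

def grad_f(x):
    return np.array([2 * x[0], 4 * x[1]])

x_hat = np.array([1.0, 1.0])
d = np.array([-1.0, -1.0])

print(grad_f(x_hat) @ d < 0)                  # gradient test: True
lam = 1e-3
print(f(x_hat + lam * d) < f(x_hat))          # function-value test: True
```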
Necessary Condition
Consider the problem $min \: f\left ( x \right )$ such that $x \in S$, where S is a non-empty set in $\mathbb{R}^n$. Suppose f is differentiable at a point $\hat{x} \in S$. If $\hat{x}$ is a local optimal solution, then $F_0 \cap D= \phi$, where $F_0=\left \{ d:\bigtriangledown f\left ( \hat{x} \right )^T d < 0 \right \}$ and D is the cone of feasible directions.
Sufficient Condition
If $F_0 \cap D= \phi$, f is a pseudoconvex function at $\hat{x}$ and there exists a neighbourhood of $\hat{x}$, $N_\varepsilon \left ( \hat{x} \right ), \varepsilon > 0$, such that $d=x-\hat{x}\in D$ for any $x \in S \cap N_\varepsilon \left ( \hat{x} \right )$, then $\hat{x}$ is a local optimal solution.
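As a concrete check of the condition $F_0 \cap D= \phi$, take $f\left ( x \right )=x_1+x_2$ over $S=\left \{ x:x\geq 0 \right \}$ with $\hat{x}=0$: feasible directions at the origin are the non-zero $d \geq 0$, and $\bigtriangledown f\left ( \hat{x} \right )^T d = d_1+d_2 \geq 0$ for all such d. The sampling sketch below, whose problem data are illustrative assumptions, looks for a counterexample and finds none.

```python
# Sampling check that F_0 and D do not intersect at x_hat = 0 for
# f(x) = x1 + x2 over S = {x : x >= 0}.  Feasible directions at the origin
# are exactly the non-zero d with d >= 0, and grad_f = (1, 1).
import numpy as np

rng = np.random.default_rng(0)
grad = np.array([1.0, 1.0])

found = False
for _ in range(100_000):
    d = rng.normal(size=2)
    in_D = np.all(d >= 0) and np.any(d != 0)   # feasible direction at the origin
    in_F0 = grad @ d < 0                       # improving direction
    if in_D and in_F0:
        found = True
        break

print("F0 and D intersect:", found)            # expected: False
```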
Proof of Necessary Condition
Suppose $F_0 \cap D\neq \phi$, i.e., there exists a $d \in F_0 \cap D$ such that $d \in F_0$ and $d\in D$
Since $d \in D$, therefore there exists $\delta_1 >0$ such that $\hat{x}+\lambda d \in S, \lambda \in \left ( 0,\delta_1 \right ).$
Since $d \in F_0$, therefore $\bigtriangledown f \left ( \hat{x}\right )^T d <0$.
Thus, by the differentiability of f at $\hat{x}$, there exists $\delta_2>0$ such that $f\left ( \hat{x}+\lambda d\right )< f\left ( \hat{x}\right ),\forall \lambda \in \left ( 0,\delta_2 \right )$
Let $\delta=min \left \{\delta_1,\delta_2 \right \}$
Then $\hat{x}+\lambda d \in S$ and $f\left (\hat{x}+\lambda d \right ) < f\left ( \hat{x} \right ),\forall \lambda \in \left ( 0,\delta \right )$
But $\hat{x}$ is a local optimal solution.
Hence, this is a contradiction.
Thus $F_0\cap D=\phi$
Proof of Sufficient Condition
Let $F_0 \cap D = \phi$ and let f be a pseudoconvex function.
Let there exist a neighbourhood of $\hat{x}$, $N_{\varepsilon}\left ( \hat{x} \right )$, such that $d=x-\hat{x} \in D, \forall x \in S \cap N_\varepsilon\left ( \hat{x} \right )$
Suppose $\hat{x}$ is not a local optimal solution.
Thus, there exists $\bar{x} \in S \cap N_\varepsilon \left ( \hat{x} \right )$ such that $f \left ( \bar{x} \right )< f \left ( \hat{x} \right )$
By assumption on $S \cap N_\varepsilon \left ( \hat{x} \right ),d=\left ( \bar{x}-\hat{x} \right )\in D$
By pseudoconvexity of f,
$$f\left ( \hat{x} \right )>f\left ( \bar{x} \right )\Rightarrow \bigtriangledown f\left ( \hat{x} \right )^T\left ( \bar{x}-\hat{x} \right )<0$$
$\Rightarrow \bigtriangledown f\left ( \hat{x} \right) ^T d <0$
$\Rightarrow d \in F_0$
hence $F_0\cap D \neq \phi$
which is a contradiction.
Hence, $\hat{x}$ is a local optimal solution.
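The pseudoconvexity step used in this proof, $f\left ( \bar{x} \right )< f\left ( \hat{x} \right )\Rightarrow \bigtriangledown f\left ( \hat{x} \right )^T\left ( \bar{x}-\hat{x} \right )<0$, can be spot-checked numerically; since every differentiable convex function is pseudoconvex, the sketch below uses a convex quadratic purely as an illustrative stand-in.

```python
# Spot check of the contrapositive form of pseudoconvexity used above:
# f(x_bar) < f(x_hat)  ==>  grad_f(x_hat)^T (x_bar - x_hat) < 0.
# The convex quadratic f(x) = ||x||^2 is an illustrative assumption.
import numpy as np

def f(x):
    return x @ x

def grad_f(x):
    return 2 * x

rng = np.random.default_rng(1)
ok = True
for _ in range(10_000):
    x_hat, x_bar = rng.normal(size=2), rng.normal(size=2)
    if f(x_bar) < f(x_hat):
        ok = ok and (grad_f(x_hat) @ (x_bar - x_hat) < 0)

print(ok)   # expected: True
```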
Consider the following problem: $min \:f\left ( x\right )$, where $x \in X$, such that $g_i\left ( x\right ) \leq 0, i=1,2,...,m$
$f:X \rightarrow \mathbb{R},g_i:X \rightarrow \mathbb{R}$ and X is an open set in $\mathbb{R}^n$
Let $S=\left \{x:g_i\left ( x\right )\leq 0,\forall i \right \}$
Let $\hat{x} \in S$ and let $M=\left \{1,2,...,m \right \}$ be the index set of all constraints.
Let $I=\left \{i:g_i\left ( \hat{x}\right )=0, i \in M\right \}$, where I is called the index set of all active constraints at $\hat{x}$.
Let $J=\left \{i:g_i\left ( \hat{x}\right )<0,i \in M\right \}$, where J is called the index set of all inactive constraints at $\hat{x}$.
Let $F_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown f\left ( \hat{x} \right )^T d <0 \right \}$
Let $G_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown g_I\left ( \hat{x} \right )^T d <0 \right \}$
or, equivalently, $G_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown g_i\left ( \hat{x} \right )^T d <0 ,\forall i \in I \right \}$
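The index sets and cones above are straightforward to compute for a given problem. A minimal sketch, where the objective, the constraints, the point $\hat{x}$ and the direction d are all illustrative assumptions:

```python
# Sketch: compute the active index set I at x_hat and test whether a given
# direction d lies in F_0 and in G_0.  All problem data are illustrative.
import numpy as np

def grad_f(x):
    return np.array([2 * (x[0] - 2), 2 * (x[1] - 2)])   # f(x) = ||x - (2, 2)||^2

# constraints g_i(x) <= 0 and their gradients
g = [lambda x: x[0] + x[1] - 2,      # active at x_hat below
     lambda x: -x[0],                # active at x_hat below
     lambda x: -x[1] - 1]            # inactive at x_hat below
grad_g = [lambda x: np.array([1.0, 1.0]),
          lambda x: np.array([-1.0, 0.0]),
          lambda x: np.array([0.0, -1.0])]

x_hat = np.array([0.0, 2.0])
tol = 1e-9
I = [i for i in range(len(g)) if abs(g[i](x_hat)) <= tol]   # active index set
J = [i for i in range(len(g)) if g[i](x_hat) < -tol]        # inactive index set
print("I =", I, " J =", J)                                  # I = [0, 1], J = [2]

d = np.array([1.0, -1.0])
in_F0 = grad_f(x_hat) @ d < 0
in_G0 = all(grad_g[i](x_hat) @ d < 0 for i in I)
print("d in F0:", in_F0, " d in G0:", in_G0)
```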
Let $S=\left \{ x \in X:g_i\left ( x\right ) \leq 0, \forall i \in M\right \}$, where X is a non-empty open set in $\mathbb{R}^n$. Let $\hat{x}\in S$, let $g_i, i \in I$, be differentiable at $\hat{x}$ and let $g_i, i \in J$, be continuous at $\hat{x}$. Then $G_0 \subseteq D$.
Let $d \in G_0$
Since $\hat{x} \in X$ and X is an open set, thus there exists $\delta_1 >0$ such that $\hat{x}+\lambda d \in X$ for $\lambda \in \left ( 0, \delta_1\right )$
Also, since $g_i\left ( \hat{x}\right )<0$ and $g_i$ are continuous at $\hat{x}, \forall i \in J$, there exists $\delta_2>0$ such that $g_i\left ( \hat{x}+\lambda d\right )<0, \forall i \in J, \lambda \in \left ( 0, \delta_2\right )$
Since $d \in G_0$, therefore $\bigtriangledown g_i\left ( \hat{x}\right )^T d <0, \forall i \in I$; thus there exists $\delta_3 >0$ such that $g_i\left ( \hat{x}+\lambda d\right )< g_i\left ( \hat{x}\right )=0$ for $\lambda \in \left ( 0, \delta_3\right ), \forall i \in I$
Let $\delta=min\left \{ \delta_1, \delta_2, \delta_3 \right \}$
Therefore, $\hat{x}+\lambda d \in X$ and $g_i\left ( \hat{x}+\lambda d\right )< 0, \forall i \in M$, for $\lambda \in \left ( 0, \delta\right )$
$\Rightarrow \hat{x}+\lambda d \in S$
$\Rightarrow d \in D$
$\Rightarrow G_0 \subseteq D$
Hence Proved.
Necessary Condition
Let $f$ and $g_i, i \in I$, be differentiable at $\hat{x} \in S$, and let $g_j, j \in J$, be continuous at $\hat{x}$. If $\hat{x}$ is a local optimal solution, then $F_0 \cap G_0=\phi$.
Sufficient Condition
If $F_0 \cap G_0= \phi$, f is a pseudoconvex function at $\hat{x}$, and $g_i, i \in I$, are strictly pseudoconvex functions over some $\varepsilon$-neighbourhood of $\hat{x}$, then $\hat{x}$ is a local optimal solution.
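At a candidate point, the emptiness of $F_0 \cap G_0$ can be tested exactly with a small linear program: by positive scaling of d, a direction with $\bigtriangledown f\left ( \hat{x} \right )^T d <0$ and $\bigtriangledown g_i\left ( \hat{x} \right )^T d <0, \forall i \in I$, exists if and only if the system $\bigtriangledown f\left ( \hat{x} \right )^T d \leq -1$, $\bigtriangledown g_i\left ( \hat{x} \right )^T d \leq -1$ is feasible. A minimal sketch assuming SciPy, with illustrative gradient data:

```python
# Exact emptiness test for F_0 and G_0 at a candidate point via an LP
# feasibility problem: a d with grad_f^T d < 0 and grad_g_i^T d < 0 (i in I)
# exists iff the scaled system "<= -1" is feasible.  Gradients are illustrative.
import numpy as np
from scipy.optimize import linprog

grad_f_hat = np.array([1.0, 0.0])                   # gradient of f at x_hat
grad_g_active = [np.array([-1.0, 0.0])]             # gradients of active g_i

A_ub = np.vstack([grad_f_hat] + grad_g_active)      # rows: a_k^T d <= -1
b_ub = -np.ones(A_ub.shape[0])
res = linprog(c=np.zeros(2), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * 2, method="highs")

# res.success is True iff F_0 and G_0 intersect; here the two gradients point
# in opposite directions, so no d satisfies both and res.success is False.
print("F0 and G0 intersect:", res.success)
```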
Let $\hat{x}$ be a feasible point such that $\bigtriangledown f\left(\hat{x} \right)=0$, then $F_0 = \phi$. Thus, $F_0 \cap G_0= \phi$, but $\hat{x}$ need not be an optimal solution.
But if $\bigtriangledown g_i\left(\hat{x} \right)=0$ for some $i \in I$, then $G_0=\phi$, and thus $F_0 \cap G_0= \phi$.
Consider the problem : min $f\left(x \right)$ such that $g\left(x \right)=0$.
Since $g\left(x \right)=0$, it can be written as $g_1\left(x \right)=g\left(x \right) \leq 0$ and $g_2\left(x \right)=-g\left(x \right) \leq 0$.
Let $\hat{x} \in S$, then $g_1\left(\hat{x} \right)=0$ and $g_2\left(\hat{x} \right)=0$.
But $\bigtriangledown g_1\left(\hat{x} \right)= - \bigtriangledown g_2\left(\hat{x}\right)$
Thus, no direction d satisfies both $\bigtriangledown g_1\left(\hat{x} \right)^T d<0$ and $\bigtriangledown g_2\left(\hat{x} \right)^T d<0$, so $G_0= \phi$ and $F_0 \cap G_0= \phi$.
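This can also be seen numerically: the gradients of $g_1$ and $g_2$ are negatives of each other at every point, so $\bigtriangledown g_1\left(\hat{x} \right)^T d$ and $\bigtriangledown g_2\left(\hat{x} \right)^T d$ always have opposite signs (or both vanish), and no sampled direction lands in $G_0$. The constraint $g\left(x \right)=x_1+x_2-1$ below is an illustrative assumption.

```python
# The equality constraint g(x) = x1 + x2 - 1 = 0, split into g1 = g <= 0 and
# g2 = -g <= 0, gives grad_g1 = -grad_g2 at every point, so no direction d
# can satisfy both grad_g1^T d < 0 and grad_g2^T d < 0.
import numpy as np

grad_g1 = np.array([1.0, 1.0])        # gradient of g1(x) =  x1 + x2 - 1
grad_g2 = -grad_g1                    # gradient of g2(x) = -(x1 + x2 - 1)

rng = np.random.default_rng(2)
hits = 0
for _ in range(100_000):
    d = rng.normal(size=2)
    if grad_g1 @ d < 0 and grad_g2 @ d < 0:
        hits += 1

print("directions in G0 found:", hits)   # expected: 0
```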