# proof

To prove $\lambda_{\max}(\sum_{j \in I_k \cup \{i\}} a_ja_j^T) \geq \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + (x_k^Ta_i)^2$ we begin by finding a subgradient of the function $\lambda_{\max}(\cdot)$ at a matrix $X \in S_n$, the space of $n \times n$ symmetric matrices. To do this we must first define the subgradient.

Definition: A matrix $V \in R^{n \times n}$ is a subgradient of a convex function $f: R^{n\times n} \rightarrow R$ at $X$ if

$f(Y) \geq f(X) + \left\langle V, Y-X \right\rangle \; \forall \; Y \in \textrm{dom}(f)$

where $\left\langle \cdot, \cdot \right\rangle$ is the inner product.

The function we are interested in is $\lambda_{\max}(\cdot)$. In order to define the subgradient of this function we must first ensure it is convex. If $X \in S_n$ then

$\lambda_{\max}(X) = \max_{Y\succeq 0, \textrm{trace}(Y) = 1} \textrm{trace}(YX).$

Thus $\,\lambda_{\max}(X)$ is the maximum of a collection of linear functions which implies that $\lambda_{\max}(\cdot)$ is a convex function.
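This convexity can be sanity-checked numerically. The sketch below uses $2 \times 2$ symmetric matrices, where $\lambda_{\max}$ has the closed form $\frac{a+c}{2} + \sqrt{\left(\frac{a-c}{2}\right)^2 + b^2}$ for $\begin{pmatrix} a & b \\ b & c \end{pmatrix}$; the test matrices `A` and `B` are arbitrary choices for illustration, not from the proof.

```python
import math

def lam_max(a, b, c):
    """Largest eigenvalue of the symmetric 2x2 matrix [[a, b], [b, c]],
    via the closed form (a+c)/2 + sqrt(((a-c)/2)^2 + b^2)."""
    return (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)

# Two arbitrary symmetric 2x2 matrices, stored as (a, b, c) triples.
A = (3.0, 1.0, -2.0)
B = (0.5, -1.5, 4.0)

# Convexity: lam_max(t*A + (1-t)*B) <= t*lam_max(A) + (1-t)*lam_max(B).
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    M = tuple(t * x + (1 - t) * y for x, y in zip(A, B))
    assert lam_max(*M) <= t * lam_max(*A) + (1 - t) * lam_max(*B) + 1e-12
```

The inequality holds for every mixture, as the maximum-of-linear-functions argument guarantees.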

So, according to our definition of subgradient we are looking for a matrix $\,V$ such that

$\lambda_{\max}(Y) \geq \lambda_{\max}(X) + \left\langle V, Y-X \right\rangle.$

Since the trace defines an inner product on the space of symmetric matrices we can rewrite this problem as

$\lambda_{\max}(Y) \geq \lambda_{\max}(X) + \textrm{trace}(V(Y-X)).$

Suppose $\,q \in R^n$ is a unit eigenvector of $\,X$ associated with its largest eigenvalue, so $\,\|q\|_2 = 1$. Since $qq^T \succeq 0$ and $\textrm{trace}(qq^T) = \|q\|_2^2 = 1$, the matrix $qq^T$ is feasible in the variational characterization above, so $\lambda_{\max}(Y) \geq \textrm{trace}(Yqq^T)$ and we have

\begin{align} \lambda_{\max}(Y) & \geq \textrm{trace}(Yqq^T) = \textrm{trace}(Xqq^T) + \textrm{trace}((Y-X)qq^T) \\ &= \lambda_{\max}(X) + \textrm{trace}({(Y-X)qq^T})\\ &= \lambda_{\max}(X) + q^T(Y-X)q \end{align}

where the second line uses $Xq = \lambda_{\max}(X)\,q$, so that $\textrm{trace}(Xqq^T) = q^TXq = \lambda_{\max}(X)$.

Thus $\,V = qq^T$ is a subgradient of $\lambda_{\max}(\cdot)$ at $X$, where $q \in R^n$ is a unit eigenvector of $\,X$ associated with its largest eigenvalue.
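The subgradient inequality with $V = qq^T$ can also be checked numerically. The $2 \times 2$ matrices `X` and `Y` below are arbitrary illustrative choices; for $b \neq 0$, the vector $(b, \lambda_{\max} - a)$ is an eigenvector of $\begin{pmatrix} a & b \\ b & c \end{pmatrix}$ for $\lambda_{\max}$.

```python
import math

def lam_max(a, b, c):
    """Largest eigenvalue of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    return (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)

def top_unit_eigvec(a, b, c):
    """A unit eigenvector of [[a, b], [b, c]] for its largest eigenvalue."""
    lam = lam_max(a, b, c)
    if abs(b) < 1e-15:                 # diagonal case
        return (1.0, 0.0) if a >= c else (0.0, 1.0)
    v = (b, lam - a)                   # eigenvector when b != 0
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n)

# Two arbitrary symmetric 2x2 matrices as (a, b, c) triples.
X = (2.0, 1.0, -1.0)
Y = (0.0, 3.0, 5.0)
q = top_unit_eigvec(*X)

# trace(qq^T (Y - X)) = q^T (Y - X) q, expanded for the 2x2 difference D.
D = tuple(y - x for x, y in zip(X, Y))
inner = q[0] ** 2 * D[0] + 2 * q[0] * q[1] * D[1] + q[1] ** 2 * D[2]

# Subgradient inequality: lam_max(Y) >= lam_max(X) + <qq^T, Y - X>.
assert lam_max(*Y) >= lam_max(*X) + inner - 1e-12
```

The inequality holds for any such pair, since the linear lower bound $\lambda_{\max}(X) + q^T(Y-X)q$ equals $q^TYq$, which never exceeds $\lambda_{\max}(Y)$.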

Now if we let $X = \sum_{j \in I_k} a_ja_j^T$ and $Y = \sum_{j \in I_k \cup \{i\}} a_ja_j^T$ and let $\,x_k$ be a unit eigenvector of $\sum_{j \in I_k} a_ja_j^T$ associated with the largest eigenvalue then we have

\begin{align} \lambda_{\max}(\sum_{j \in I_k \cup \{i\}} a_ja_j^T) & \geq \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + x_k^T(\sum_{j \in I_k \cup \{i\}} a_ja_j^T-\sum_{j \in I_k} a_ja_j^T)x_k \\ & = \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + x_k^T(a_ia_i^T)x_k \\ & = \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + (x_k^Ta_i)^2 \end{align}

Thus, $\lambda_{\max}(\sum_{j \in I_k \cup \{i\}} a_ja_j^T) \geq \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + (x_k^Ta_i)^2$.
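The final bound can be verified on a small instance. The vectors $a_j \in R^2$, the index set $I_k$, and the candidate index $i$ below are hypothetical example data chosen only to exercise the inequality.

```python
import math

def lam_max(a, b, c):
    """Largest eigenvalue of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    return (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)

def top_unit_eigvec(a, b, c):
    """A unit eigenvector of [[a, b], [b, c]] for its largest eigenvalue."""
    lam = lam_max(a, b, c)
    if abs(b) < 1e-15:
        return (1.0, 0.0) if a >= c else (0.0, 1.0)
    v = (b, lam - a)
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n)

# Hypothetical data: vectors a_j in R^2, current set I_k, candidate index i.
vecs = [(1.0, 2.0), (0.5, -1.0), (2.0, 1.0)]
I_k, i = [0, 1], 2

def rank_one_sum(idx):
    """(a, b, c) triple of sum_{j in idx} a_j a_j^T."""
    a = sum(vecs[j][0] ** 2 for j in idx)
    b = sum(vecs[j][0] * vecs[j][1] for j in idx)
    c = sum(vecs[j][1] ** 2 for j in idx)
    return (a, b, c)

X = rank_one_sum(I_k)             # sum over I_k
Y = rank_one_sum(I_k + [i])       # sum over I_k union {i}
x_k = top_unit_eigvec(*X)
gain = (x_k[0] * vecs[i][0] + x_k[1] * vecs[i][1]) ** 2   # (x_k^T a_i)^2

# The proved inequality: adding a_i raises lam_max by at least (x_k^T a_i)^2.
assert lam_max(*Y) >= lam_max(*X) + gain - 1e-12
```

On this instance $\lambda_{\max}$ increases from roughly $5.53$ to roughly $9.15$, comfortably exceeding the guaranteed gain of $(x_k^Ta_i)^2 \approx 2.58$.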