Proof:
To prove [math]\lambda_{\max}(\sum_{j \in I_k \cup \{i\}} a_ja_j^T) \geq \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + (x_k^Ta_i)^2[/math] we begin by finding a subgradient of the function [math]\lambda_{\max}(\cdot)[/math] at a matrix [math]X \in S_n[/math]. To do this, we must first define the subgradient.
Definition:
A matrix [math]V \in R^{n \times n}[/math] is a subgradient of a convex function [math]f: R^{n\times n} \rightarrow R[/math] at [math]X[/math] if
[math]f(Y) \geq f(X) + \left\langle V, Y - X \right\rangle \quad \textrm{for all } Y \in R^{n \times n},[/math]
where [math]\left\langle \cdot, \cdot \right\rangle[/math] is the inner product.
The function we are interested in is [math]\lambda_{\max}(\cdot)[/math]. In order to define the subgradient of this function we must first ensure it is convex. If [math]X \in S_n[/math] then
[math]\lambda_{\max}(X) = \max_{\|y\|_2 = 1} y^TXy.[/math]
For each fixed [math]\,y[/math] the map [math]X \mapsto y^TXy[/math] is linear in [math]\,X[/math]. Thus [math]\,\lambda_{\max}(X)[/math] is the maximum of a collection of linear functions, which implies that [math]\lambda_{\max}(\cdot)[/math] is a convex function.
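As a quick numerical sanity check (a sketch, not part of the proof), convexity can be tested on random symmetric matrices. The snippet below assumes NumPy; the function names, matrix sizes, and tolerance are all illustrative.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def lam_max(M):
    # Largest eigenvalue of a symmetric matrix (eigvalsh sorts ascending).
    return np.linalg.eigvalsh(M)[-1]

def random_sym(n):
    M = rng.standard_normal((n, n))
    return (M + M.T) / 2

A, B = random_sym(5), random_sym(5)
for t in np.linspace(0.0, 1.0, 11):
    # Convexity: lam_max(t*A + (1-t)*B) <= t*lam_max(A) + (1-t)*lam_max(B).
    assert lam_max(t * A + (1 - t) * B) <= t * lam_max(A) + (1 - t) * lam_max(B) + 1e-10
print("convexity check passed")
</syntaxhighlight>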
So, according to our definition of subgradient, we are looking for a matrix [math]\,V[/math] such that
[math]\lambda_{\max}(Y) \geq \lambda_{\max}(X) + \left\langle V, Y - X \right\rangle \quad \textrm{for all } Y \in S_n.[/math]
Since the trace defines an inner product on the space of symmetric matrices, we can rewrite this problem as finding a matrix [math]\,V[/math] such that
[math]\lambda_{\max}(Y) \geq \lambda_{\max}(X) + \textrm{trace}(V(Y - X)) \quad \textrm{for all } Y \in S_n.[/math]
Suppose [math]\,q \in R^n[/math] is the eigenvector of [math]\,X[/math] associated with the largest eigenvalue, normalized so that [math]\,\|q\|_2 = 1[/math]. Then [math]\textrm{trace}(Xqq^T) = q^TXq = \lambda_{\max}(X)[/math], and since [math]\lambda_{\max}(Y) \geq \textrm{trace}(Yqq^T)[/math] we have
[math]\lambda_{\max}(Y) \geq \textrm{trace}(Yqq^T) = \textrm{trace}(Xqq^T) + \textrm{trace}(qq^T(Y - X)) = \lambda_{\max}(X) + \textrm{trace}(qq^T(Y - X)).[/math]
Thus a subgradient of [math]\lambda_{\max}(\cdot)[/math] at [math]\,X[/math] is [math]\,V = qq^T[/math], where [math]q \in R^n[/math] is the eigenvector of [math]\,X[/math] associated with the largest eigenvalue and [math]\|q\|_2 = 1[/math].
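The subgradient inequality above can also be checked numerically. The sketch below (again assuming NumPy, with illustrative names and tolerance) draws random symmetric matrices [math]\,Y[/math] and verifies the lower bound with [math]\,V = qq^T[/math].
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
n = 5

M = rng.standard_normal((n, n))
X = (M + M.T) / 2
w, Q = np.linalg.eigh(X)       # ascending eigenvalues, orthonormal eigenvectors
q = Q[:, -1]                   # unit eigenvector for the largest eigenvalue
V = np.outer(q, q)             # candidate subgradient V = q q^T

for _ in range(100):
    M = rng.standard_normal((n, n))
    Y = (M + M.T) / 2
    # lam_max(Y) >= lam_max(X) + trace(V (Y - X))
    assert np.linalg.eigvalsh(Y)[-1] >= w[-1] + np.trace(V @ (Y - X)) - 1e-10
print("subgradient inequality check passed")
</syntaxhighlight>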
Now if we let [math]X = \sum_{j \in I_k} a_ja_j^T[/math] and [math]Y = \sum_{j \in I_k \cup \{i\}} a_ja_j^T[/math], and let [math]\,x_k[/math] be the unit eigenvector of [math]\sum_{j \in I_k} a_ja_j^T[/math] associated with the largest eigenvalue, then [math]Y - X = a_ia_i^T[/math] and the subgradient inequality gives
[math]\lambda_{\max}(\sum_{j \in I_k \cup \{i\}} a_ja_j^T) \geq \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + \textrm{trace}(x_kx_k^Ta_ia_i^T) = \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + (x_k^Ta_i)^2.[/math]
Thus, [math]\lambda_{\max}(\sum_{j \in I_k \cup \{i\}} a_ja_j^T) \geq \lambda_{\max}(\sum_{j \in I_k} a_ja_j^T) + (x_k^Ta_i)^2[/math].
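Finally, the concluding inequality itself can be sanity-checked on random data. The sketch below (assuming NumPy; the index-set size and vector names are illustrative stand-ins for [math]\,I_k[/math] and [math]\,a_i[/math]) builds the rank-one sum, adds one term, and compares against the bound.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
n, k = 5, 4

a_list = [rng.standard_normal(n) for _ in range(k)]   # {a_j : j in I_k}
a_i = rng.standard_normal(n)                          # the added vector a_i

X = sum(np.outer(a, a) for a in a_list)               # sum over I_k of a_j a_j^T
Y = X + np.outer(a_i, a_i)                            # sum over I_k union {i}

w, Q = np.linalg.eigh(X)
x_k = Q[:, -1]                                        # top unit eigenvector of X

lhs = np.linalg.eigvalsh(Y)[-1]
rhs = w[-1] + (x_k @ a_i) ** 2
assert lhs >= rhs - 1e-10                             # the claimed lower bound
print(f"lam_max after update: {lhs:.6f} >= bound: {rhs:.6f}")
</syntaxhighlight>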