I have an $ N$ -dimensional vector of data, say $ X_{t}$ , with $ 1 \leq t \leq T$ .

Of this vector $ X_{t}$ , I want to consider sub-vectors, say $ X_{t}^{b}$ , which are $ m$ -dimensional combinations of elements of the original vector $ X_{t}$ . In total, there are $ \binom{N}{m}$ such combinations.

Note that $ N$ can be large; I need to allow for $ N \rightarrow \infty$ . Also, I will choose $ m$ so that it grows more slowly than $ N$ , e.g. $ m = O(N^{1/2})$ .

I want to compute the $ k$ -th eigenvalue ($ k$ is user-defined) of the second moment matrix of each of these $ X_{t}^{b}$ s, i.e.

$ \lambda_{k}^{b} \left( \frac{1}{T} \sum_{t=1}^{T} X_{t}^{b} (X_{t}^{b})' \right).$

Then, finally, I want to compute

$ \max_{1 \leq b \leq B} \lambda_{k}^{b}$ ,

where $ B = \binom{N}{m}$ . In principle, I may also need to compute other measures, such as the average of $ \lambda_{k}^{b}$ over $ b$ .
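For concreteness, here is a minimal brute-force sketch of the quantity I want (in Python/NumPy; the function name and interface are just for illustration). It exploits the fact that the second moment matrix of a sub-vector $ X_{t}^{b}$ is simply an $ m \times m$ principal submatrix of the full second moment matrix $ S = \frac{1}{T}\sum_{t} X_{t} X_{t}'$ , so $ S$ is computed only once; the loop over all $ \binom{N}{m}$ index sets is of course only feasible for small $ N$ :

```python
import itertools
import numpy as np

def max_kth_eigenvalue(X, m, k):
    """Brute-force max_b lambda_k^b over all (N choose m) sub-vectors.

    X has shape (T, N); eigenvalues are sorted in descending order,
    so k = 1 picks the largest. Only feasible for small N.
    """
    T, N = X.shape
    # Full second moment matrix, computed once; the second moment matrix
    # of a sub-vector indexed by b is its principal submatrix S[b, b].
    S = X.T @ X / T
    best = -np.inf
    for b in itertools.combinations(range(N), m):
        idx = np.array(b)
        # eigvalsh returns ascending eigenvalues; reverse to descending
        lam = np.linalg.eigvalsh(S[np.ix_(idx, idx)])[::-1]
        best = max(best, lam[k - 1])
    return best
```

This makes the combinatorial burden explicit: the cost is $ \binom{N}{m}$ eigendecompositions of $ m \times m$ matrices, which is what motivates my second question.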

I have two questions:

- Is this problem NP-hard? Is there a reference to back up either statement (i.e. “it is” or “it is not”)?
- Is there a heuristic that makes the problem less computationally burdensome? And is there any way to prove, for a given heuristic, that the solution it finds is “very close” to $ \max_{1 \leq b \leq B} \lambda_{k}^{b}$ – e.g. by showing that the heuristic's solution and $ \max_{1 \leq b \leq B} \lambda_{k}^{b}$ are equal almost surely, or something similar?

Many many thanks for your help.