Statistics > Machine Learning
[Submitted on 28 Jun 2013]
Title: Memory Limited, Streaming PCA
Abstract: We consider streaming, one-pass principal component analysis (PCA) in the high-dimensional regime, with limited memory. Here, p-dimensional samples are presented sequentially, and the goal is to produce the k-dimensional subspace that best approximates these points. Standard algorithms require O(p^2) memory; meanwhile, no algorithm can do better than O(kp) memory, since this is what the output itself requires. Memory (or storage) complexity is most meaningful when understood in the context of computational and sample complexity. Sample complexity for high-dimensional PCA is typically studied in the setting of the spiked covariance model, where p-dimensional points are generated from a population covariance equal to the identity (white noise) plus a low-dimensional perturbation (the spike), which is the signal to be recovered. It is now well understood that the spike can be recovered when the number of samples, n, scales proportionally with the dimension, p. Yet all algorithms that provably achieve this have memory complexity O(p^2). Meanwhile, algorithms with memory complexity O(kp) do not have provable bounds on sample complexity comparable to p. We present an algorithm that achieves both: it uses O(kp) memory (meaning storage of any kind) and is able to compute the k-dimensional spike with O(p log p) sample complexity -- the first algorithm of its kind. While our theoretical analysis focuses on the spiked covariance model, our simulations show that our algorithm is successful on much more general models for the data.
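The abstract does not spell out the algorithm itself, but the setting it describes can be sketched with a block power-method style update: samples from a spiked covariance model arrive in blocks, and the only state kept across blocks is a p-by-k iterate, i.e. O(kp) memory, never the p-by-p covariance. The dimensions, block sizes, and spike strength below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
p, k = 100, 2            # ambient dimension and spike rank (illustrative)
n_blocks, block_size = 20, 500

# Spiked covariance model: Cov(x) = I + 4 * U_true @ U_true.T
# (identity "white noise" plus a rank-k perturbation, the signal).
U_true, _ = np.linalg.qr(rng.standard_normal((p, k)))

def sample_block(m):
    # x = 2 * U_true @ z + w, with z, w standard Gaussian.
    z = rng.standard_normal((m, k))
    return 2.0 * (z @ U_true.T) + rng.standard_normal((m, p))

# Streaming estimate: a single p-by-k orthonormal iterate (O(kp) storage).
Q, _ = np.linalg.qr(rng.standard_normal((p, k)))
for _ in range(n_blocks):
    X = sample_block(block_size)      # one block of one-pass samples
    S_Q = X.T @ (X @ Q) / block_size  # block covariance applied to Q,
                                      # without ever forming the p-by-p matrix
    Q, _ = np.linalg.qr(S_Q)          # re-orthonormalize the iterate

# Principal angles between the estimate and the true spike subspace:
# singular values of U_true.T @ Q near 1 mean the spike is recovered.
overlap = np.linalg.svd(U_true.T @ Q, compute_uv=False)
print(overlap.min())
```

Note the key memory property: each block touches X only through the products X @ Q and X.T @ (...), so peak storage is the current block plus the p-by-k iterate, not an accumulated p-by-p covariance.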