Low Rank Approximation: Algorithms, Implementation, Applications
Publisher: Springer Publishing Company, Incorporated
ISBN: 978-1-4471-2226-5
Published: 18 November 2011
Pages: 266
Abstract

Data Approximation by Low-complexity Models details the theory, algorithms, and applications of structured low-rank approximation. Efficient local optimization methods and effective suboptimal convex relaxations for Toeplitz, Hankel, and Sylvester structured problems are presented. Much of the text is devoted to describing the applications of the theory, including: system and control theory; signal processing; computer algebra for approximate factorization and common divisor computation; computer vision for image deblurring and segmentation; machine learning for information retrieval and clustering; bioinformatics for microarray data analysis; chemometrics for multivariate calibration; and psychometrics for factor analysis. Software implementation of the methods is given, making the theory directly applicable in practice. All numerical examples are included in demonstration files that give hands-on experience, and exercises and MATLAB examples assist in the assimilation of the theory.
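
To make the abstract's central notions concrete, here is a minimal numpy sketch, not taken from the book's own software, of (i) plain low-rank approximation via the truncated SVD and (ii) one well-known suboptimal heuristic for Hankel-structured low-rank approximation (Cadzow-style alternating projections). The function names, the test signal, and the parameters L, r, and iters are illustrative assumptions; the book itself develops local optimization methods and convex relaxations rather than this particular heuristic.

```python
# Minimal sketch (not the book's SLRA software): unstructured low-rank
# approximation via the truncated SVD, and a Cadzow-style alternating
# projection heuristic for Hankel-structured low-rank approximation.
import numpy as np

def lra_svd(D, r):
    """Best rank-r approximation of D in the Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def hankel(w, L):
    """L x (len(w)-L+1) Hankel matrix built from the sequence w."""
    n = len(w) - L + 1
    return np.array([w[i:i + n] for i in range(L)])

def project_to_hankel(H):
    """Nearest Hankel matrix: average the entries along each antidiagonal."""
    L, n = H.shape
    w = np.zeros(L + n - 1)
    counts = np.zeros(L + n - 1)
    for i in range(L):
        for j in range(n):
            w[i + j] += H[i, j]
            counts[i + j] += 1
    return hankel(w / counts, L)

def cadzow(w, L, r, iters=50):
    """Alternate between rank-r truncation and Hankel projection."""
    H = hankel(np.asarray(w, dtype=float), L)
    for _ in range(iters):
        H = project_to_hankel(lra_svd(H, r))
    return H

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(50)
    w = np.sin(0.3 * t) + 0.1 * rng.standard_normal(50)   # noisy sinusoid
    H = cadzow(w, L=10, r=2)
    # The first two singular values dominate; the rest shrink toward zero.
    print(np.linalg.svd(H, compute_uv=False)[:4])
```

A noise-free sinusoid satisfies a second-order linear recurrence, so its Hankel matrix has rank 2; that is why the heuristic drives the third and later singular values toward zero in this toy example.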

Cited By

  1. Ali R, Bernardi G, van Waterschoot T and Moonen M (2019). Methods of Extending a Generalized Sidelobe Canceller With External Microphones, IEEE/ACM Transactions on Audio, Speech and Language Processing (TASLP), 27:9, (1349-1364), Online publication date: 1-Sep-2019.
  2. Xie K, Wang L, Wang X, Xie G, Wen J, Zhang G, Cao J and Zhang D (2018). Accurate Recovery of Internet Traffic Data, IEEE/ACM Transactions on Networking (TON), 26:2, (793-806), Online publication date: 1-Apr-2018.
  3. Deng G, Manton J and Wang S (2018). Fast Kernel Smoothing by a Low-Rank Approximation of the Kernel Toeplitz Matrix, Journal of Mathematical Imaging and Vision, 60:8, (1181-1195), Online publication date: 1-Oct-2018.
  4. Wen C, Shi G and Xie X (2017). Estimation of directions of arrival of multiple distributed sources for nested array, Signal Processing, 130:C, (315-322), Online publication date: 1-Jan-2017.
  5. Liu X, Li W and Wang H (2017). Rank constrained matrix best approximation problem with respect to (skew) Hermitian matrices, Journal of Computational and Applied Mathematics, 319:C, (77-86), Online publication date: 1-Aug-2017.
  6. Giesbrecht M, Haraldson J and Labahn G Computing the Nearest Rank-Deficient Matrix Polynomial Proceedings of the 2017 ACM on International Symposium on Symbolic and Algebraic Computation, (181-188)
  7. Lykourentzou I, Kraut R and Dow S Team Dating Leads to Better Online Ad Hoc Collaborations Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, (2330-2343)
  8. Kokabifar E, Loghmani G and Karbassi S (2016). Nearest matrix with prescribed eigenvalues and its applications, Journal of Computational and Applied Mathematics, 298:C, (53-63), Online publication date: 15-May-2016.
  9. Xu K, Kim V, Huang Q, Mitra N and Kalogerakis E Data-driven shape analysis and processing SIGGRAPH ASIA 2016 Courses, (1-38)
  10. Markovsky I System Identification in the Behavioral Setting Proceedings of the 12th International Conference on Latent Variable Analysis and Signal Separation - Volume 9237, (235-242)
  11. Hage C and Kleinsteuber M Robust Structured Low-Rank Approximation on the Grassmannian Proceedings of the 12th International Conference on Latent Variable Analysis and Signal Separation - Volume 9237, (295-303)
  12. Zhou X, Yang C, Zhao H and Yu W (2014). Low-Rank Modeling and Its Applications in Image Analysis, ACM Computing Surveys, 47:2, (1-33), Online publication date: 8-Jan-2015.
  13. Henrion D, Naldi S and Safey El Din M Real Root Finding for Rank Defects in Linear Hankel Matrices Proceedings of the 2015 ACM on International Symposium on Symbolic and Algebraic Computation, (221-228)
  14. Sun J, Xiong Y, Zhu Y, Liu J, Guan C and Xiong H Multi-source Information Fusion for Personalized Restaurant Recommendation Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval, (983-986)
  15. Balajewicz M and Farhat C (2014). Reduction of nonlinear embedded boundary models for problems with evolving interfaces, Journal of Computational Physics, 274:C, (489-504), Online publication date: 1-Oct-2014.
Contributors
  • Catalan Institution for Research and Advanced Studies

Reviews

Reviewer: Corrado Mencar

Low rank approximation (LRA), a general approach for discovering linear models of data, applies to a wide range of problems in many disciplines, including computer science (CS) and engineering. It is therefore an instructive approach to teach in undergraduate courses, as well as a useful tool for practitioners and researchers. This book gently takes the reader from the basic ideas of LRA to the most critical concepts, with an adequate number of examples to explain things along the way.

Readers are only asked to have some basic knowledge of linear algebra and some patience with the presence of code chunks, which may sometimes distract them from focusing on the key concepts. In fact, the text is frequently interleaved with MATLAB code in the style of so-called "literate programming." This choice has two opposing effects. On one hand, it is a precious aid for students who want to put into practice what they have learned. On the other hand, it confuses the overall format of the chapters: it is not always clear whether the code chunks can be safely skipped or are required to understand the text.

The first chapter of the book is enlightening; it motivates LRA as a generalization of the classical paradigm of linear modeling, usually ascribable to least squares methods. Moreover, with a rich set of examples in areas such as system identification, signal processing, computer algebra, machine learning, and computer vision, the author clearly shows how immediate and useful the application of LRA is to data modeling. The subsequent chapters are more technical, but the author does try to make the key concepts comprehensible. Furthermore, all the code and a substantial appendix with problems and solutions are freely available online (http://homepages.vub.ac.be/~imarkovs/book.html).

A good deal of the book is focused on system theory and signal processing. This makes the book particularly appealing for readers interested in engineering subjects. Nevertheless, readers more interested in computing-related subjects can focus on the chapters related to missing data filling, data centering, nonlinear and constrained data modeling, and so on. Unfortunately, only two lines in the whole book are devoted to nonnegative matrix factorization, which falls in the realm of LRA and is of significant importance in CS.

Overall, Markovsky has presented LRA in a way that is unifying and cross-disciplinary. The pages abound with code, examples, applications, and problems, from which readers can pick according to their own interests and without the risk of losing the main thread of the book. Even though the book could be expanded to better cover computing-related topics, it is a good reference for students, practitioners, and researchers.

Online Computing Reviews Service
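
The reviewer's point that LRA generalizes classical least-squares modeling can be illustrated with a short, self-contained numpy sketch (my own illustration on synthetic data, not code from the book): ordinary least squares fits y ≈ ax by perturbing y only, while the low-rank (total least squares) view perturbs the whole data matrix [x y] until it has rank 1, so that the corrected points lie exactly on a line through the origin.

```python
# Sketch (not from the book): least-squares line fitting recast as a
# rank-1 approximation of the data matrix [x y]. The data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 30)
y = 2.0 * x + 0.05 * rng.standard_normal(30)   # noisy line y = 2x

# Ordinary least squares: minimize ||y - a*x||_2 over the slope a.
a_ls = (x @ y) / (x @ x)

# Low-rank view: nearest rank-1 matrix to D = [x y] (Eckart-Young), then
# read the slope off its one-dimensional row space.
D = np.column_stack([x, y])
U, s, Vt = np.linalg.svd(D, full_matrices=False)
D1 = s[0] * np.outer(U[:, 0], Vt[0])   # corrected data: rows lie exactly on a line
v = Vt[0]                              # direction spanning the row space
a_tls = v[1] / v[0]                    # slope of the fitted line

print(a_ls, a_tls)   # both close to 2
```

The two slopes differ slightly because the implicit error models differ: ordinary least squares attributes all error to y, whereas the rank-1 formulation treats x and y symmetrically (an errors-in-variables fit), which is the kind of generalization of least squares the reviewer describes the first chapter as using to motivate LRA.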
