DOI: 10.1145/1273496.1273584

Regression on manifolds using kernel dimension reduction

Published: 20 June 2007

ABSTRACT

We study the problem of discovering a manifold that best preserves information relevant to a nonlinear regression. Solving this problem involves extending and uniting two threads of research. On the one hand, the literature on sufficient dimension reduction has focused on methods for finding the best linear subspace for nonlinear regression; we extend this to manifolds. On the other hand, the literature on manifold learning has focused on unsupervised dimensionality reduction; we extend this to the supervised setting. Our approach to solving the problem involves combining the machinery of kernel dimension reduction with Laplacian eigenmaps. Specifically, we optimize cross-covariance operators in kernel feature spaces that are induced by the normalized graph Laplacian. The result is a highly flexible method in which no strong assumptions are made on the regression function or on the distribution of the covariates. We illustrate our methodology on the analysis of global temperature data and image manifolds.
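The method sketched above combines two concrete pieces of machinery, and a small code example helps fix ideas. The Python sketch below is illustrative, not the authors' implementation: the function names, the heat-kernel bandwidth set from the median squared distance, and the regularizer eps are all assumptions. It first embeds the covariates with the normalized graph Laplacian, as in Laplacian eigenmaps, and then evaluates the kernel dimension reduction (KDR) trace criterion of Fukumizu, Bach, and Jordan on a linear projection B of the embedding coordinates; the supervised step is to minimize that criterion over B.

# Minimal sketch (not the authors' code) of the two ingredients named in
# the abstract: a Laplacian-eigenmaps embedding of the covariates, then
# the KDR trace criterion evaluated on a linear projection B of that
# embedding. Bandwidths, eps, and the optimizer are assumptions.
import numpy as np
from scipy.spatial.distance import cdist

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10):
    """Embed X using the normalized graph Laplacian (Belkin & Niyogi)."""
    D2 = cdist(X, X, "sqeuclidean")
    W = np.exp(-D2 / np.median(D2))              # heat-kernel weights
    far = np.argsort(D2, axis=1)[:, n_neighbors + 1:]
    for i in range(len(X)):                      # keep only k nearest neighbors
        W[i, far[i]] = 0.0
    W = np.maximum(W, W.T)                       # symmetrize the kNN graph
    d = W.sum(axis=1)
    L = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))  # normalized Laplacian
    _, vecs = np.linalg.eigh(L)                  # eigenvalues in ascending order
    return vecs[:, 1:n_components + 1]           # skip the constant eigenvector

def kdr_objective(B, Z, Gy, eps=1e-3):
    """KDR criterion Tr[Gy (Gz + n*eps*I)^{-1}]; smaller values mean the
    projection Z @ B retains more of the information relevant to Y."""
    n = len(Z)
    D2 = cdist(Z @ B, Z @ B, "sqeuclidean")
    K = np.exp(-D2 / (np.median(D2) + 1e-12))    # Gaussian kernel on Z @ B
    H = np.eye(n) - 1.0 / n                      # centering matrix
    Gz = H @ K @ H                               # centered Gram of the projection
    return np.trace(np.linalg.solve(Gz + n * eps * np.eye(n), Gy))

# Usage sketch: Z = laplacian_eigenmaps(X); Gy = centered Gram matrix of Y;
# then minimize kdr_objective over matrices B with orthonormal columns,
# e.g. by gradient descent with re-orthonormalization of B at each step.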

    • Published in

      ICML '07: Proceedings of the 24th international conference on Machine learning
      June 2007
      1233 pages
      ISBN: 9781595937933
      DOI: 10.1145/1273496

      Copyright © 2007 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Acceptance Rates

      Overall Acceptance Rate: 140 of 548 submissions, 26%
