Keywords
(13)
Clustering Method
Compact Representation
Coordinate System
Dimensional Reduction
Exploratory Data Analysis
High Dimensional Data
High Dimensionality
Local Minima
Local Linear Embedding
Multivariate Data
Nonlinear Dimensionality Reduction
Unsupervised Learning
Neighborhood Preserving Embedding
Related Publications
(207)
A global geometric framework for nonlinear dimensionality reduction
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering
Hessian Eigenmaps: new locally linear embedding techniques for high-dimensional data
Nonlinear Component Analysis as a Kernel Eigenvalue Problem
Nonlinear dimensionality reduction by locally linear embedding
(Citations: 2936)
Sam T. Roweis, Lawrence K. Saul
Many areas of science depend on exploratory data analysis and visualization. The need to analyze large amounts of multivariate data raises the fundamental problem of dimensionality reduction: how to discover compact representations of high-dimensional data. Here, we introduce locally linear embedding (LLE), an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs. Unlike clustering methods for local dimensionality reduction, LLE maps its inputs into a single global coordinate system of lower dimensionality, and its optimizations do not involve local minima. By exploiting the local symmetries of linear reconstructions, LLE is able to learn the global structure of nonlinear manifolds, such as those generated by images of faces or documents of text.
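The abstract describes LLE as an unsupervised algorithm that embeds high-dimensional inputs into a single low-dimensional coordinate system while preserving local neighborhoods. As an illustration only (a sketch using scikit-learn's LocallyLinearEmbedding on a synthetic swiss-roll manifold, not the authors' original implementation):

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# A 3-D point cloud lying on a nonlinear 2-D manifold (swiss roll).
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# LLE reconstructs each point from its k nearest neighbors and finds a
# low-dimensional embedding that preserves those reconstruction weights.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y = lle.fit_transform(X)

print(Y.shape)  # (1000, 2): a neighborhood-preserving 2-D embedding
```

Here `n_neighbors` controls the size of the locally linear patches; too few neighbors fragments the manifold, while too many violates the local-linearity assumption.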
Journal: Science, vol. 290, no. 5500, pp. 2323-2326, 2000
DOI: 10.1126/science.290.5500.2323
View Publication
The following links allow you to view full publications; they are maintained by sources not affiliated with Microsoft Academic Search: www.cs.cmu.edu, www.sciencemag.org
Citation Context
(1940)
...[34], AE minimizes the L2 distance in the complex domain between the embedding Za and its neighborhood average Za from all the as, weighted by its total confidence Da;a:...
Stella Yu, et al.
Angular Embedding: A Robust Quadratic Criterion
...(2000), Roweis and Saul (2000), and Huo and Chen (2002)); gradient based methods (Novikov et al...
Christopher R. Genovese, et al.
The Geometry of Nonparametric Filament Estimation
...Another group of techniques is based on the eigenanalysis of graphs and kernels related to the local structure of the data in the manifold (Tenenbaum, Silva, & Langford, 2000; Schölkopf, Smola, & Müller, 1998; Weinberger & Saul, 2004), or on sparse matrices describing the local topology of the data (Roweis & Saul, 2000; Belkin & Niyogi, 2002)...
Valero Laparra, et al.
Nonlinearities and Adaptation of Color Vision from Sequential Principa...
...Spectral dimension-reduction techniques such as locally linear embedding (LLE) (Roweis & Saul, 2000), Isomap (Tenenbaum, de Silva, & Langford, 2000), and Laplacian eigenmaps (Belkin & Niyogi, 2003) rely on the spectrum of the neighborhood graph of the data and preserve important properties of this graph...
Kerstin Bunte, et al.
A General Framework for Dimensionality-Reducing Data Visualization Map...
...Some of the more popular methods are: isometric feature mapping (Isomap), see Tenenbaum, de Silva, and Langford (2000); locally linear embedding, see Roweis and Saul (2000); and Laplacian and Hessian eigenmaps, see Belkin and Niyogi (2002)...
Giseon Heo, et al.
Topological Analysis of Variance and the Maxillary Complex
References
(3)
Self-organization and associative memory (Citations: 4080)
T. Kohonen. Published in 1989.
Principal Component Analysis, Springer-Verlag (Citations: 49)
I. T. Jolliffe. Published in 1986.
Matrix analysis (Citations: 1137)
Roger A. Horn, Charles R. Johnson. Published in 1990.
Citations (2936)
Angular Embedding: A Robust Quadratic Criterion
Stella Yu
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), vol. 34, no. 1, pp. 158-173, 2012
The Geometry of Nonparametric Filament Estimation
Christopher R. Genovese, Marco Perone-Pacifico, Isabella Verdinelli, Larry Wasserman
Journal: Journal of The American Statistical Association (J AMER STATIST ASSN), 2012 (just accepted)
Nonlinearities and Adaptation of Color Vision from Sequential Principal Curves Analysis
Valero Laparra, Sandra Jiménez, Gustavo Camps-Valls, Jesús Malo
Journal: Neural Computation (NECO), vol. 24, no. 10, pp. 2751-2788, 2012
A General Framework for Dimensionality-Reducing Data Visualization Mapping
Kerstin Bunte, Michael Biehl, Barbara Hammer
Journal: Neural Computation (NECO), vol. 24, no. 3, pp. 771-804, 2012
Topological Analysis of Variance and the Maxillary Complex
Giseon Heo, Jennifer Gamble, Peter T. Kim
Journal: Journal of The American Statistical Association (J AMER STATIST ASSN), 2012 (just accepted)