Principal Manifolds for Data Visualization and Dimension Reduction
Alexander N. Gorban, Balázs Kégl, Donald C. Wunsch, Andrei Zinovyev
Springer Science & Business Media, Sep 11, 2007. 340 pages.

In 1901, Karl Pearson invented Principal Component Analysis (PCA). Since then, PCA has served as a prototype for many other tools of data analysis, visualization and dimension reduction: Independent Component Analysis (ICA), Multidimensional Scaling (MDS), Nonlinear PCA (NLPCA), Self-Organizing Maps (SOM), etc. The book starts with a quotation of Pearson's classical definition of PCA and includes reviews of various methods: NLPCA, ICA, MDS, embedding and clustering algorithms, principal manifolds and SOM. New approaches to NLPCA, principal manifolds, branching principal components and topology-preserving mappings are described as well. The presentation of algorithms is supplemented by case studies, from engineering to astronomy, but mostly of biological data: analysis of microarray and metabolite data. The volume ends with a tutorial, "PCA and K-means decipher genome". The book is meant to be useful for practitioners in applied data analysis in life sciences, engineering, physics and chemistry; it will also be valuable to PhD students and researchers in computer science, applied mathematics and statistics.
From inside the book
Results 1-5 of 73.
... vector by its projection on a best-fitted low-dimensional linear manifold, the K-means approach gives an approximation of a big data set by K best-fitted centroids. Between the "most rigid" linear manifolds and "most soft" unstructured ...
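The snippet above contrasts PCA's fit of a linear manifold with K-means' approximation of a data set by K best-fitted centroids. A minimal NumPy sketch of the standard Lloyd iteration under that framing (the two-blob toy data and the `kmeans` helper are illustrative assumptions, not code from the book):

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative toy data: two well-separated blobs in R^2 (not from the book)
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)),
               rng.normal(3.0, 0.3, (100, 2))])

def kmeans(X, K, iters=50):
    """Lloyd's algorithm: approximate X by K best-fitted centroids."""
    centroids = X[rng.choice(len(X), K, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance)
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        # Move each centroid to the mean of its assigned points
        for k in range(K):
            if (labels == k).any():
                centroids[k] = X[labels == k].mean(axis=0)
    return centroids, labels

centroids, labels = kmeans(X, K=2)
print(np.round(centroids, 1))
```

Each centroid plays the role of a zero-dimensional "rigid" approximant; the chapter's point is that principal manifolds interpolate between this fully unstructured limit and the fully rigid linear one.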
... vector z ∈ R^N, to produce statistically independent score variables, stored in t ∈ R^n, n ≤ N: t = Pᵀz (1.1). Here, P is a transformation matrix constructed from orthonormal column vectors. Since the first applications of PCA [21] ...
... vectors z_i ∈ R^N and c_j ∈ R^K, respectively, Z can be rewritten as shown below: Z = [z_1ᵀ; z_2ᵀ; z_3ᵀ; ...] = [c_1 c_2 c_3 ⋯ c_j ⋯ c_N] ... vector tᵀ = (t_1 t_2 t_3 ⋯ t_j ⋯ t_n), t ∈ R^n, has the following first- and second-order statistics: E{t} = 0, E{ttᵀ} ...
... vector p_k, associated with λ_k, stores the kth set of coefficients used to obtain the kth linear transformation of the original variable set z to produce t_k. Furthermore, given that S_zz is a positive definite or semidefinite matrix, it ...
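The snippets above sketch the classical PCA construction: the eigenvector p_k of the covariance matrix S_zz, associated with eigenvalue λ_k, supplies the coefficients for the kth score t_k, and the stacked scores t = Pᵀz come out uncorrelated. A minimal NumPy sketch of that pipeline (the toy data and variable names are illustrative assumptions, not the book's code):

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative toy data: 200 samples in R^3 with unequal axis scales
Z = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.1])
Z -= Z.mean(axis=0)  # mean-center, so E{t} = 0 holds for the scores

# Sample covariance matrix S_zz of the centered data
S_zz = Z.T @ Z / (len(Z) - 1)

# Eigenvectors p_k of S_zz, sorted by decreasing eigenvalue lambda_k
lam, P = np.linalg.eigh(S_zz)
order = np.argsort(lam)[::-1]
lam, P = lam[order], P[:, order]

# Scores t = P^T z for each sample (eq. 1.1); keep the first n = 2 components
n = 2
T = Z @ P[:, :n]

# The score covariance is (numerically) diagonal: components are uncorrelated
print(np.round(T.T @ T / (len(T) - 1), 2))
```

Because S_zz is symmetric positive semidefinite, `eigh` returns real eigenvalues and an orthonormal P, which is exactly what makes the transformation a rotation onto uncorrelated axes.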
... vectors to produce curves that approximate the nonlinear relationship between a set of two variables. Such curves, defined as principal ... [Fig. 1.4: Benchmarking of the residual variances against accuracy bounds of each disjoint region] ...
Contents
References | 39 |
References | 65 |
References | 91 |
References | 127 |
The Iterative Extraction Approach to Clustering | 151 |
References | 174 |
Components | 192 |
Principal Trees | 219 |
of Bacterial Genomes | 229 |
Diffusion Maps: a Probabilistic Interpretation for Spectral | 238 |
On Bounds for Diffusion Discrepancy and Fill Distance | 261 |
References | 269 |
Dimensionality Reduction and Microarray Data | 293 |
References | 307 |
PCA and K-Means Decipher Genome | 309 |
Other editions - View all
Principal Manifolds for Data Visualization and Dimension Reduction. Alexander N. Gorban, Balázs Kégl, Donald C. Wunsch, Andrei Zinovyev. Limited preview - 2007.
Principal Manifolds for Data Visualization and Dimension Reduction. Alexander N. Gorban, Balázs Kégl, Donald C. Wunsch, Andrei Zinovyev. No preview available - 2009.