Unsupervised Learning: Foundations of Neural Computation

Geoffrey E. Hinton, Terrence Joseph Sejnowski
MIT Press, 1999 - 398 pages
Introduction; Unsupervised learning; Local synaptic learning rules suffice to maximize mutual information in a linear network; Convergent algorithm for sensory receptive field development; Emergence of position-independent detectors of sense of rotation and dilation with Hebbian learning: an analysis; Learning invariance from transformation sequences; Learning perceptually salient visual parameters using spatiotemporal smoothness constraints; What is the goal of sensory coding?; An information-maximization approach to blind separation and blind deconvolution; Natural gradient works efficiently in learning; A fast fixed-point algorithm for independent component analysis; Feature extraction using an unsupervised neural network; Learning mixture models of spatial coherence; Bayesian self-organization driven by prior probability distributions; Finding minimum entropy codes; Learning population codes by minimizing description length; The Helmholtz machine; Factor analysis using delta-rule wake-sleep learning; Dimension reduction by local principal component analysis; A resource-allocating network for function interpolation; Learning with preknowledge: clustering with point and graph matching distance measures; Learning to generalize from single examples in the dynamic link architecture; Index.


Contents

Local Synaptic Learning Rules Suffice to Maximize Mutual Information - 19
Emergence of Position-Independent Detectors of Sense of Rotation - 47
Learning Invariance from Transformation Sequences - 63
What Is the Goal of Sensory Coding? - 101
An Information-Maximization Approach to Blind Separation and Blind Deconvolution - 145
Natural Gradient Works Efficiently in Learning - 177
A Fast Fixed-Point Algorithm for Independent Component Analysis - 203
Learning Mixture Models of Spatial Coherence - 223
Finding Minimum Entropy Codes - 249
Factor Analysis Using Delta-Rule Wake-Sleep Learning - 293
Dimension Reduction by Local Principal Component Analysis - 317
A Resource-Allocating Network for Function Interpolation - 341
Clustering with Point and Graph Matching Distance Measures - 355
Learning to Generalize from Single Examples in the Dynamic Link Architecture - 373
Index - 391
Copyright


About the author (1999)

Geoffrey Hinton is Professor of Computer Science at the University of Toronto.

Terrence J. Sejnowski holds the Francis Crick Chair at the Salk Institute for Biological Studies and is a Distinguished Professor at the University of California, San Diego. He was a member of the advisory committee for the Obama administration's BRAIN Initiative and is President of the Neural Information Processing Systems (NIPS) Foundation. He is the author of The Deep Learning Revolution (MIT Press) and other books.

Bibliographic information