Unsupervised Learning: Foundations of Neural Computation

Front Cover
Geoffrey E. Hinton, Terrence Joseph Sejnowski
MIT Press, 1999 - 398 pages
Introduction; Unsupervised learning; Local synaptic learning rules suffice to maximize mutual information in a linear network; Convergent algorithm for sensory receptive field development; Emergence of position-independent detectors of sense of rotation and dilation with Hebbian learning: an analysis; Learning invariance from transformation sequences; Learning perceptually salient visual parameters using spatiotemporal smoothness constraints; What is the goal of sensory coding?; An information-maximization approach to blind separation and blind deconvolution; Natural gradient works efficiently in learning; A fast fixed-point algorithm for independent component analysis; Feature extraction using an unsupervised neural network; Learning mixture models of spatial coherence; Bayesian self-organization driven by prior probability distributions; Finding minimum entropy codes; Learning population codes by minimizing description length; The Helmholtz machine; Factor analysis using delta-rule wake-sleep learning; Dimension reduction by local principal component analysis; A resource-allocating network for function interpolation; Learning with preknowledge: clustering with point and graph matching distance measures; Learning to generalize from single examples in the dynamic link architecture; Index.

From inside the book

User reviews - Write a review

We haven't found any reviews in the usual places.


Local Synaptic Learning Rules Suffice to Maximize Mutual Information
Emergence of Position-Independent Detectors of Sense of Rotation
Learning Invariance from Transformation Sequences
What Is the Goal of Sensory Coding?
An Information-Maximization Approach to Blind Separation and Blind
Natural Gradient Works Efficiently in Learning
A Fast Fixed-Point Algorithm for Independent Component Analysis
Learning Mixture Models of Spatial Coherence
Finding Minimum Entropy Codes
Factor Analysis Using Delta-Rule Wake-Sleep Learning
Dimension Reduction by Local Principal Component Analysis
A Resource-Allocating Network for Function Interpolation
Clustering with Point and Graph
Learning to Generalize from Single Examples in the Dynamic Link

Other editions - View all

Common terms and phrases

About the author (1999)

Geoffrey Hinton is Professor of Computer Science at the University of Toronto.

Terrence J. Sejnowski holds the Francis Crick Chair at the Salk Institute for Biological Studies and is a Distinguished Professor at the University of California, San Diego. He was a member of the advisory committee for the Obama administration's BRAIN Initiative and is President of the Neural Information Processing Systems (NIPS) Foundation. He is the author of The Deep Learning Revolution (MIT Press) and other books.

Bibliographic information