Unsupervised Learning: Foundations of Neural Computation
Introduction; Unsupervised learning; Local synaptic learning rules suffice to maximize mutual information in a linear network; Convergent algorithm for sensory receptive field development; Emergence of position-independent detectors of sense of rotation and dilation with Hebbian learning: an analysis; Learning invariance from transformation sequences; Learning perceptually salient visual parameters using spatiotemporal smoothness constraints; What is the goal of sensory coding?; An information-maximization approach to blind separation and blind deconvolution; Natural gradient works efficiently in learning; A fast fixed-point algorithm for independent component analysis; Feature extraction using an unsupervised neural network; Learning mixture models of spatial coherence; Bayesian self-organization driven by prior probability distributions; Finding minimum entropy codes; Learning population codes by minimizing description length; The Helmholtz machine; Factor analysis using delta-rule wake-sleep learning; Dimension reduction by local principal component analysis; A resource-allocating network for function interpolation; Learning with preknowledge: clustering with point and graph matching distance measures; Learning to generalize from single examples in the dynamic link architecture; Index.
Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network
Emergence of Position-Independent Detectors of Sense of Rotation and Dilation with Hebbian Learning
Learning Invariance from Transformation Sequences
What Is the Goal of Sensory Coding?
An Information-Maximization Approach to Blind Separation and Blind Deconvolution
Natural Gradient Works Efficiently in Learning
A Fast Fixed-Point Algorithm for Independent Component Analysis
Learning Mixture Models of Spatial Coherence
Finding Minimum Entropy Codes
Factor Analysis Using Delta-Rule Wake-Sleep Learning
Dimension Reduction by Local Principal Component Analysis
A Resource-Allocating Network for Function Interpolation
Learning with Preknowledge: Clustering with Point and Graph Matching Distance Measures
Learning to Generalize from Single Examples in the Dynamic Link Architecture