Title: On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks
Author: Stephen Grossberg
Abstract:
Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells A can sample a pattern playing on any finite number of cells B without causing irrevocable sampling bias if A = B or A ∩ B = ∅. Total energy transfer from inputs of A to outputs of B depends on the entropy of the input distribution. Pattern completion on recall trials can occur without destroying perfect memory, even if A = B, by choosing the signal thresholds sufficiently large. The mathematical results are global limit and oscillation theorems for a class of nonlinear functional-differential systems.