Veelenturf L.P.J. — Analysis and applications of artificial neural networks

Title: Analysis and applications of artificial neural networks

Author: Veelenturf L.P.J.

Annotation:

Thorough, compact, and self-contained, this explanation and analysis of a broad range of neural nets is conveniently structured so that readers can first gain a quick global understanding of neural nets, without the mathematics, and can then delve into mathematical specifics as necessary. The behavior of neural nets is first explained from an intuitive perspective; the formal analysis is then presented; and the practical implications of the formal analysis are stated separately. The book analyzes the behavior of the six main types of neural networks: the Binary Perceptron, the Continuous Perceptron (Multi-Layer Perceptron), the Bidirectional Memories, the Hopfield Network (Associative Neural Nets), the Self-Organizing Neural Network of Kohonen, and the new Time Sequential Neural Network. It is intended for technically oriented readers working in information retrieval, pattern recognition, speech recognition, signal processing, and data classification.
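As a small illustration of the first of the six network types the annotation lists, here is a minimal binary perceptron trained with the perceptron reinforcement rule. This is my own illustrative sketch, not code from the book; the function names (`train_perceptron`, `predict`) are assumptions. It uses the extended input/weight vector convention that also appears in the subject index: the threshold is folded in as an extra weight on a constant input of 1.

```python
# Minimal binary perceptron sketch (illustrative; not the book's code).

def step(s):
    """Step function: output 1 iff the weighted sum is non-negative."""
    return 1 if s >= 0 else 0

def train_perceptron(samples, epochs=20, rate=1.0):
    """Perceptron reinforcement rule on binary-labelled samples.

    samples: list of (input_tuple, target) pairs, targets in {0, 1}.
    Returns the extended weight vector (threshold weight last).
    """
    n = len(samples[0][0])
    w = [0.0] * (n + 1)                  # extended weight vector
    for _ in range(epochs):
        for x, t in samples:
            xe = list(x) + [1.0]         # extended input vector
            y = step(sum(wi * xi for wi, xi in zip(w, xe)))
            if y != t:
                # Reinforcement: shift the hyperplane toward the
                # misclassified point.
                w = [wi + rate * (t - y) * xi for wi, xi in zip(w, xe)]
    return w

def predict(w, x):
    """Classify x with the extended weight vector w."""
    return step(sum(wi * xi for wi, xi in zip(w, list(x) + [1.0])))

# AND is linearly separable, so by the convergence theorem the rule
# finds a separating hyperplane in finitely many steps.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(data)
```

On the linearly separable AND function shown, the learned extended weight vector classifies all four input patterns correctly; on a non-separable function such as exclusive-or (index: "Exclusive or function") the rule would never converge, which is one of the limitations the book's formal analysis covers.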


Language: en

Category: Computer science/Genetics, neural networks/

Subject index status: Index with page numbers is ready


Year of publication: 1995

Number of pages: 259

Added to catalog: 23.11.2005

Subject index
Adaptation function      184
Adaptation rule      124 179 184
Adaptive recruitment learning rule      45—49 50—63
Adjacent receptive fields      224 248
Algorithmic adaptation rule      182
Anthropomorphic pattern recognition      176—181
Approximate fitting of data      84—90
Approximate ordering error      200
Arithmetical conjunctive normal form      36
Axon      2
Back-propagation      124
Bayes classification      213—219
Bayes classifier      95
Binary perceptron      6—65
Boolean function      11 40—43
Characteristic function      42
Cycle time      76
Colour quantization      207—210
Complement      42
Conjugate gradient method      166
Consistency Property      17
Continuous multi layer perceptron      66—169
Convergence theorem      32—34
Cost function      34
Cover      36 44
Coverable set      52
Criterion function      34
Data set      63
Dendrites      2
Digit classification      219—221
EEG signal analysis      237—240
Energy function      34 73
Equivalent linear threshold functions      21—24
Error back propagation      124
Error function      34 73
Exact fitting of data      81—83
Excitation      6 177
Exclusive or function      14 39
Extended input vector      14 77
Extended weight vector      14 77
Extensive set      50
Feature mapping      222
Finite sequential machine      26
Frequency detection      158
Function identification      152 230—234
Generalization      66 90—92
Generalizing with binary perceptron      48 49—53
Global learning      26 54 76
Gradient descent procedure      34 70 75
Hebb adaptation rule      179
Hebb D.O.      2 175
Hopfield J.J.      4
Hyperplane      18 20
Hyperplane boundary classifier      95 98
Hyperplane boundary classifier, double threshold labeling      106—112
Hyperplane boundary classifier, one zero labeling      96—106
Hyperplane boundary classifier, single threshold labeling      112—120
Hyphenation      61
Identity function      14
Increment learning      32
Indirect arithmetical conjunctive normal form      36
Inhibition      6 178
Initialisation of continuous perceptron      166—167
Internal learning rate      101 108
Interpolation      226—229
Intersection      35
Kleene S.C.      4
Kohonen T.      4 170
Lateral synaptic weight factor      177
Learning rate      26
Learning set      66
Learning speed      165
Likelihood ratio      94
Line search method      166
Linear perceptron      72
Linear separable      18 96
Linear threshold function      13 21—24
Local learning      26 54 76
Logical function      14 39
Machine condition monitoring      162
Mask function      35
Master vector      213 214 229
Master-slave decomposition      229—230
McCulloch W.S.      4
Mean squared error      73
Measurement vector      93
Metric preserving      223 247
Minsky M.      4 9
Modelling      91
Momentum method      166
Multi-net decomposition      229—230
Nearest neighbour classification      210—212
Neighbourhood adaptation function      194
Neighbours      249—256
Neighbours, conflicting close neighbours      250
Neighbours, effective close neighbours      249
Neighbours, effective far neighbours      250
Neighbours, effective neighbours      250
Neighbours, geometric neighbours      249
Neuron      2
Observation mapping      222
Observation vector      158 177
Observation window      158 172
Observation window, fields      176
Observation window, pixel value      176
Observation window, resolution levels      176
One zero behavior      3
Order      35
Ordering error      199 222
Ordering of weight vectors      193—203
Ordering phase      196 222
Over fitting      133
Papert S.      4 9
Parity problem      16
Pattern set      35
Perceptron      10
Pitts W.      4
Pixel function      35
Predicate logic      8
Projection mapping      222
Proper adaptive recruitment learning rule      49
Quantization mapping      222
Quantization noise      187 246
Quantization phase      187 222
Receptive fields      179 185 203
Recruitment and reinforcement rule      53—57 63—64
Reinforcement rule      24—34
Restricted topology preservation      226
Retinotopy      171 174
Ritter H.      182
Robot arm control      234—237
Rosenblatt F.      4 9
Scaling training vectors      245—246
Selecting training vectors      245—246
Self organizing neural network      170—255
Set step function      50
Sigmoid function      66
Signature identification      252—255
Single neuron perceptron      76—122
Slave vector      214 229
Solution space      29—31
Solution vector      29
Soma      2
Speech recognition      240—245
Staged training      76
State diagram      26
State transition function      26
Step function      13
Substratum      36
Switching circuits      11 58
Synapse      2
Test set      67
Three layer continuous perceptron      145—152
Threshold      12
Topological energy      247
Topological noise      247
Topology preservation      222—226
Training set      245
Transmission efficiency      2 6
Traveling salesman problem      203—207
Two layer continuous perceptron      122—133
Under fitting      133—136
Union      35
Vector quantization      184—192
Veelenturf L.P.J.      5
Visual acuity      171
Weighted output error      124
Weight      7
Weight vector      13
Well ordering      196
© Electronic library of the Mechmat MSU Board of Trustees, 2004-2024