Kecman V. — Learning and soft computing. Support vector machines, neural networks, and fuzzy logic models
Title: Learning and soft computing. Support vector machines, neural networks, and fuzzy logic models
Author: Kecman V.
Annotation: This textbook provides a thorough introduction to the field of learning from experimental data and soft computing. Support vector machines (SVM) and neural networks (NN) are the mathematical structures, or models, that underlie learning, while fuzzy logic systems (FLS) enable us to embed structured human knowledge into workable algorithms. The book assumes that it is not only useful, but necessary, to treat SVM, NN, and FLS as parts of a connected whole. Throughout, the theory and algorithms are illustrated by practical examples, as well as by problem sets and simulated experiments. This approach enables the reader to develop SVM, NN, and FLS in addition to understanding them. The book also presents three case studies: on NN-based control, financial time series analysis, and computer graphics. A solutions manual and all of the MATLAB programs needed for the simulated experiments are available.
Language: English
Category: Computer science / AI, knowledge
Subject index status: index with page numbers is complete
Year of publication: 2001
Number of pages: 541
Added to catalog: 16.05.2005
Subject index
ε-insensitivity zone 177
ABC adaptive backthrough control 429—449
ABC adaptive backthrough control, ABC of time-variant plant 440—443
ABC adaptive backthrough control, back propagation through a plant 427—428
Activation function 15—17 259 275—290
ADALINE 213
Additive noise 124
Animation 470—474
Approximating function 126
Approximation 29 34—41
Approximation error 134 136
Asymptotic consistency 131
Attribute See membership function
Back propagation through a plant 427—428
Bayes decision criterion 71 78
Bayes risk 71 86
Bayesian classification 77—81
Best approximation 29
BFGS optimization method 488
Bias (offset, threshold) in NN 15 150 157 181 182 196 206 289
Bias-variance 40 136 268—274
Binary classification 71
Bipolar sigmoidal function 259
Canonical hyperplane 152
Classification 68 149 162 166
Classification, binary 71
Classification, dichotomization 91
Classifiers, parametric 92
Classifiers, template matching 101
Composition in FL 380—382
Computer graphics 463—480
Conjugate gradient method 430 489—494
Consistent estimators 275
Covariance matrix 93 334 341 529
Crafting sigmoidal AF (learning) 280—283
Cross-validation 40 137 269 272
Davidon — Fletcher — Powell method 487
Decision boundary 151
Decision regions 70 88
Defuzzification methods 393
Defuzzification methods, center-of-area 393
Defuzzification methods, center-of-gravity 393
Defuzzification methods, first-of-maxima 393
Defuzzification methods, middle-of-maxima 393
Degree of belonging 372 376
Delta signal, δ-signal 234 257
Design matrix 35
Dichotomization 91
Discriminant function 89
Discriminant function for normally distributed classes 93—95
Distal teacher 426 428
EBP error back propagation 255—266
Empirical risk minimization ERM 130
Epoch 6 208 230
Equality of NNs and FLMs 396
Error correction learning 194 204 234 236
Error signal term (δ signal) 234 257
Error stopping function 292
Error surface 44—53 302 484
Estimation error 135
Evolutionary computing 496—504
Facial animation 473
FAM, fuzzy additive model 404—410
Financial time series 449—463
Fletcher — Powell method 487
Fletcher — Reeves CG method 492
Fourier series and NN 47
Fuzzy logic systems, composition 380—382
Fuzzy logic systems, defuzzification 391—394
Fuzzy logic systems, defuzzification, center-of-area 393
Fuzzy logic systems, defuzzification, center-of-gravity 393
Fuzzy logic systems, defuzzification, first-of-maxima 393
Fuzzy logic systems, defuzzification, middle-of-maxima 393
Fuzzy logic systems, degree of belonging 372 376
Fuzzy logic systems, design steps for FL models 405
Fuzzy logic systems, fuzzification 385 391
Fuzzy logic systems, fuzzy additive models (FAM) 386 404—410
Fuzzy logic systems, IF-THEN rules 378
Fuzzy logic systems, implication 383—385
Fuzzy logic systems, inference 382—391
Fuzzy logic systems, membership function 21—24 367—371
Fuzzy logic systems, normal f. sets 368
Fuzzy logic systems, not-normal f. sets 368
Fuzzy logic systems, possibility degree 376
Fuzzy logic systems, relational matrix 376—382
Fuzzy logic systems, relations 374
Fuzzy logic systems, rule explosion 408
Fuzzy logic systems, S-norm 373
Fuzzy logic systems, set operations 371
Fuzzy logic systems, sets 367
Fuzzy logic systems, surface of knowledge 394—396
Fuzzy logic systems, T-norm 373
Fuzzy logic systems, trapezoidal membership function 371
Fuzzy logic systems, triangular membership function 371
Gauss — Newton method 495
Generalization error 134
Generalization of NNs and SVMs 40 269
Generalized delta (δ) rule 260 263
Generalized least squares 495
Genetic algorithms 496—504
Geometry of learning 277—288
Gradient method 49 54—60 230—237 301—302 518
Gram — Schmidt orthogonalization 348
Graphics by RBF networks 463—480
Green's function 320
Growth function 144
Hessian matrix 57 229 296 301 485 495
Human animation 470—474
Hypothesis space 134
Ideal control 421
IF-THEN rules 378
Ill-posed problem 202 314
Indicator function 138 150
Insensitivity or ε zone 177
Interpolation 34—41
Jacobian 428—430
Karush — Kuhn — Tucker condition 156
Kernels 170
Key learning theorem 131
Kolmogorov theorem 13
Lagrangian, dual 156 163 172 180
Lagrangian, primal 156 163 172 180
Learning 61
Learning fuzzy rules (LFR) 396
Learning machine 126
Learning, l. by subset selection 146 334 353
Learning, l. of linear neuron weights (5 methods) 225
Learning, l. rate 194 296
Learning, momentum term 296—301
Learning, moving center learning 337
Levenberg — Marquardt method 495
Likelihood ratio 78
Linear dynamic system 223
Linear neuron 213
Linear programming (LP) 353—358
Linear separability 202
LMS learning algorithm 234
Logistic (unipolar sigmoidal) function 259
Loss function 81 84 126
Lp norms 28—31 512
Mahalanobis distance 94 100
MAP maximal-a-posteriori decision criterion 71
Margin 153
Matrix-inversion lemma 237 239
Maximal margin classifier 149
Maximal-a-priori decision criterion 71
Membership function 21—24 367—371
Mercer kernels 170
MLP multilayer perceptron 15—18 26 255
Momentum term 296—301
Morphing 466—470
Multiclass classification 80
NARMAX model 422 433 451
Nested set of functions 114
Newton — Raphson method 229 301—302 485
NNs based control 421
NNs based control, ABC of time-variant plant 440—443
NNs based control, adaptive backthrough control ABC 429—449
NNs based control, backpropagation through a plant 427—428
NNs based control, dead-beat controller 433
NNs based control, direct inverse modeling 423
NNs based control, distal teacher 426 428
NNs based control, errors, definition of 431
NNs based control, errors, definition of, controller error 431
NNs based control, errors, definition of, performance error 431
NNs based control, errors, definition of, predicted performance error 431
NNs based control, errors, definition of, prediction error 431
NNs based control, general learning architecture 423
NNs based control, ideal linear controller 421
NNs based control, IMC internal model control 431
NNs based control, indirect learning architecture 425
NNs based control, Jacobian of the plant 428—430
NNs based control, parallel model 422
NNs based control, series-parallel model 422
NNs based control, specialized learning architecture 425
Noise influence on estimation 220 224
Nonradial BFs 337 339
Norm 28—31 512
Normal equation 228 344
OLS orthogonal least squares 343
Orthogonalization 350—352
Overfitting 41 269
Parametric classifier 92
Penalty parameter C 163
Perceptron 194
Perceptron, convergence of the p. learning rule 199
Perceptron, p. learning algorithms 204
Polak — Ribiere CG method 493
Possibility degree 376
Powell's quadratic approximation 58—61
Projection matrix 348
Quadratic programming 156—158 163—165 172—173 180—181
Quasi-Newton methods 486
Radial basis functions (RBFs) network 15—18 26 33—41 313—358 463 478
Regression 62—68 176 354—357 515
Regularization 314
Regularization parameter 137 320 329
Reproducing kernels 170
Ridge regression 137
Risk 85
RLS recursive-least-squares 237—241
Rule explosion 408
Second order optimization methods 483—496
Share market 450
Sigmoidal functions, bipolar s. f. 259
Sigmoidal functions, logistic (unipolar) function 259
Similarity between RBFs and FLMs 395—404
Soft margin 162
SRM, structural risk minimization 145 161
Stabilizer (in RBFs network) 320 329
Subset selection 146 334 353
Support vector 157
Support vector machines, SVMs 148
Support vector machines, SVMs, for classification 149 162 166
Support vector machines, SVMs, for regression 176
Surface of knowledge 394—396
System of linear equations 505
Underfitting 269
Uniform convergence 131
Universal approximation 36—37
Universe of discourse 367
Variable metric method 486
Variance 134—136
VC dimension 138
Vectors and matrices 510—514
Weight decay 137
Weights, geometrical meaning of weights 14 16 280—283
Weights, initialization 290