Meisel W.S. — Computer-oriented approach to pattern recognition

Subject Index
functions 77 92ff 116
A priori probabilities 24 38 44
Abend, K. 15
Absolute error correction algorithm 68
Agrawala, A.K. 33 44 156
Aizerman, M.A. 98
Anticohesion 154
Approximate risk 55 64ff
Approximation, efficient 91 230—246
Approximation, integral-square 85ff
Approximation, least-mean-square 88ff
Approximation, weighted mean-square 91ff
Artificial intelligence 3
Asymptotic convergence 219—220 225
Bashkirov, O.A. 98
Bellman, R.E. 13
Bhattacharyya distance 183 186
Binary variables 79 124 214—215 223
Binomial coefficient 35 78
Bonner, R.E. 147
Braverman, E.M. 98
Butler, G.A. 145 154
Center adjustment algorithm 146
Center-distance representation 140—141 142 147
Center-variance adjustment algorithm 146
Chandrasekaran, B. 15 23
Character recognition 214
Chien, Y.T. 215
Cluster analysis 138ff
Cluster seeking 138
Covariance matrix 42
Cohesion 154
Collins, D.C. 218
Composed functions 178
Concave function 122
Confusion region 133
Convexity 60
Cost function 59ff
Cover, T.M. 219
Curse of dimensionality 12 13 42 88 186
Data analysis 157
Data reduction 3
Decision boundaries 9 16ff
Decision rule 8 9 17ff 23
Decomposition of probability density functions 138
Decomposition, principle of 166ff
Design sets 22—24
Dimensionality 9
Direct methods 28
Discriminant analysis 163 183 202
Discriminant functions 16ff 18
Distance measures, city-block 145
Distance measures, Euclidean 9 11—12 123
Distortion-invariant feature extraction 218
Divergence 183 186
Dot product 48 56
Dynamic programming 164 203 216
Edge-weighted linear graph 151
Edic, J. 146
Editing samples 116
Electrocardiogram 193
Error correction method 123
Estimation, unsupervised 133
Factor analysis 163 196—197
False alarm 40
Feasibility tests 116
Feature ranking 163—164 199—202
Feature selection 7 24 27—28 162ff
Features 7
Fischer, F.P. 11 172 188
Fisher discriminant 183
Fisher, R.A. 184
Fixed-increment algorithm 68
Foley, D. 13 14 15
Fractional correction algorithm 68 70
Friedman, H.P. 146
Fu, K.S. 134 201 23
Fukunaga, K. 146 197
Fuzzy sets 21
Gaussian distribution see Normal distribution
Global maximum 50
Gose, E.E. 107 108 148 203
Gradient 47
Gradient technique 48—50 75
Gram — Schmidt orthonormalization 86 198
Graph-theoretic methods 148—152
Grinold, R.C. 76
Groner, G.F. 55
Guard zone 147
Hart, P.E. 219
Henrichon, E.G., Jr. 134 281
Heuristic analysis 216—219
Heydorn, R.P. 184
Highleyman, W.H. 23
Hill-climbing 49
Histogram 41—42 103 214
Ho, Y.-C. 33 44
Hoffman, R.L. 133
Hughes, G.F. 15 24
Hyperplane 56
Image processing 163 214
Implicit subclasses 121
Independence of features 214
Indirect methods 28
Indirect methods in clustering 152
Indirect methods in optimization 47ff
Inner product see Dot product
Interset distances 164 179—183
Intraset distances 179—183
Intrinsic dimensionality 15 162
Iterative adjustment of clusters 145—146
k-nearest neighbor 29 30
Kanal, L.N. 15 23
Karhunen — Loeve expansion 164 196—197 202
Koford, J.S. 55
Koontz, W.L.G. 146 197
Lagrange multipliers 182
Layered networks 124—128 169
Learning set 22
Learning with probabilistic teacher 138 156
Learning without teacher 138
Learning, decision-directed 138 155—156
Learning, unsupervised 138ff 155—156
Levy, D.M. 204
Likelihood 40
Likelihood ratio 215
Linear discriminant functions 58 92ff
Linear programming 50 70 76 131
Linear transformations 169ff
Linguistic analysis 216—219 224
Local maximum 50
Local minimum 60
Localized distance 164 180—181 183—189
Loss function 39ff 56 59ff
Mangasarian, O.L. 131 133
Many-at-a-time algorithm 65 68 69 75
Markovian dependence 214
Martin, L.W. 35
Meisel, W.S. 22 95 134 174 184
Mean error 55
Measurement selection 3
Measurement space 7—8
Michalopoulos, D.A. 134
Minimal spanning trees 151
Minimax criterion 44
Mixture distribution 143
Moe, M.L. 133
MODE 143
Mode seeking 144—145
Modified SPRT 215
Mucciardi, A.N. 107 108 148 203
Multimodal distribution 15
Multiple criteria 166
Multivariate splines see Piecewise linear functions
Mutual information 184
Nearest-neighbor technique 29—30 133
Nelson, G.D. 204
Neuron 79
Neyman — Pearson criterion 44
Nonconcave function 122
Noninferior solutions 166
Nonlinear mapping 191 206
Nonparametric methods 28—29
Nonstationary pattern recognition 220—222 225
Norm, Euclidean 56 105
Normal distribution 42—43
Normalization 9 10 106 107 169—170
Nuisance samples 117
Numerical taxonomy 157
One-at-a-time algorithm 65 68 75
Orthonormality 86—87
Owen, J. 134
Parameterized transformations 169ff
Parametric methods 28—29 38ff
Parzen window 98
Parzen estimator 91 98
Parzen, E. 98 103 219
Patrick, E.A. 172 174 184 188 218
Pattern classes 9
Pattern classification 8
Pattern space 7 9
Pattern space, augmented 63—64
Pavlidis, T. 217
Perceptrons 124—128 169
Piecewise linear clusters 140
Piecewise linear decision boundary 114 128
Piecewise linear discriminant function 114 120ff
Piecewise linear functions 116 218
Piecewise linear potential function 102
Polynomials, orthonormal 228—229
Potential functions 98ff
Potential functions, generalized 114—116
Potential functions, polynomial 108—110 114
Potential functions, type 1 106—107
Potential functions, type 2 106—107
Prediction 5
Preprocessing 7
Principal components analysis 197
Probability densities 21—22 85ff
Prototype classification 30—32 123
Proximal surfaces 141
Radar 168
Random search 51—53
Ranking formulas 201 202 216
Ringwald, B.E. 157
Risk 39ff 44 45
Rubin, J. 146
Sample space, augmented 57—58
Samples 2 8
Samples, labeled 2 8 9 33
Samples, sheets of 16
Samples, subsets 141—142
Samples, unlabeled 33 138 155
Sebestyen, G.S. 105—146
Schwarz inequality 48
Sequential feature selection 215—216 224
Sequential probability ratio test 215
Similarity matrix 149—150
Size and shape parameters 107—108
Skew 109
Smith, F.W. 76 94
Spanning tree 151
Specht, D.F. 109 110 114
Speech processing 214
Statistical decision theory 39ff
Statistical formulation 33ff
Stochastic approximation 50 87 219—220 225
Structural analysis 216—219 224
Structure-preserving transformations 190—191 206
Subset generation 138
Successive dichotomy 18 20 87
Test sets 22—24
Threshold elements 78ff 124 127 215
Threshold elements, variable-threshold 80
Tou, J.T. 184
Training set 22
Transformations, custom 193—199
Transformations, linear 163
Transformations, nonlinear 163
Transformations, orthogonal 170—172
Transformations, orthonormal 163
Transformations, piecewise linear 163 174—178
Tree structure 18 51
Tree structure, weight of 151
Tsypkin, Ya.Z. 98
Vector, augmented 57
Wald, A. 215
Watanabe, M.S. 154 217
Weight space 58
Weight vectors 57
Zahn, C.T. 151 152