Wallace C.S. — Statistical and Inductive Inference by Minimum Message Length
|
 |
Subject Index
Induction, human 394 396
Induction, MML, practice of 391—399
Induction, of the past 356—365
Induction, of the past, by maximum likelihood 357—358
Induction, of the past, by MML 358—361
Induction, of the past, with deterministic laws 363—365
Inductive conclusions 6
Inductive inference 1—55
Inductive inference, introduction to 5—11
Inductive inference, primitive 387—388
Inductive process 6
Inexplicable present views 362—363
Inference problem, specified 153
Inference(s), Bayesian see “Bayesian inference”
Inference(s), coding of 148—150
Inference(s), inductive see “Inductive inference”
Inference(s), Non-Bayesian 28—35
Inference(s), of regular grammars 305—314
Inference(s), possible, set of 147—148
Inference(s), statistical see “Statistical inference”
Infinite entropy distributions 94
Infinite sets, codes for 91—92
Infinite sets, feasible codes for 96—98
Information 57—141
Information content of multinomial distribution 81—87
Information matrix, empirical 241
Information, communication of 57—58
Information, defined 57
Information, Fisher see “Fisher Information”
Information, measurement of 66—68
Information, pattern and noise, separation of 190—191
Information, Shannon see “Shannon information”
Information, subjective nature of 79—81
Instructions, Turing machines 101
Integers, optimal codes for 93—96
Invariance, data representation 187—188
Invariance, model representation 188—189
Invariant conjugate priors 52—53 261
Invariants 340
Irregular likelihood functions 242—243
ISO-7 code 61
J-factor model 300—303
Jeffreys prior 48—49 410 412
Joreskog, K.G. 299
Kearns et al. MDL criterion (KMDL) 322 325
Kent, J.T. 300
Kepler's laws 13
KLD (Kullback — Leibler distance) 204—205 287—288
KMDL (Kearns et al. MDL criterion) 322 325
Knowledge 5—6
Kolmogorov, A.N. 4 57 102 275 401
Korb, K.B. 323—326 335
Kraft's inequality 65
Kuhn, T. 385
Kullback — Leibler distance (KLD) 204—205 287—288
Langdon, G.G. 73
Large-D message length, MML 237
Latent factor model 297—303
Latent factor model, defining equations for 299
Latent factor model, MML 300—303
Lattice constant 180
Language(s), choice of 132
Language(s), natural see “Natural languages”
Language(s), scientific 391
Leaf nodes 315
Lee, T.C.M. 319
Lengths of explanations 143
Leung Yan Cheong, S.K. 98
Levin search 396—397
Levin, L.A. 392 396—397
Likelihood 30
Likelihood functions, irregular 242—243
Likelihood principle, Minimum Message Length approach and 254—255
Likelihood, maximum see “Maximum likelihood”
Linear regression 270—272
Log likelihood ratio 33 35
Log* code 99—100 409
Loss function 347—348
Loss function, expected value of 189
Machine learning 2
Mansour, Y. 321—325
Mardia, K.V. 300
Marginal maximum likelihood 203—204
Marginal probability 154
Maximum entropy density 90—91
Maximum entropy priors 51—52
Maximum likelihood (ML) method 30
Maximum likelihood estimator 299—300
Maximum likelihood, for Neyman — Scott problem 203—204
Maximum likelihood, induction of the past by 357—358
Maximum likelihood, marginal 203—204
Maximum likelihood, normalized (NML) 410—415
Maxwell — Boltzmann statistics 378
MDL (Minimum Description Length) criterion 321—325
MDL (Minimum Description Length) principle 401 408—415
Mealey machine representation 305—307
Mean 27
Mean, discrimination of 193—195
Mean, Normal, with Normal prior, estimation of 173—177
Mean, of multivariate Normal 177—183
Mean, of uniform distribution of known range 183—187
Mean, sample 260
Measurement of information 66—68
Meek, C. 327
MEKL see “Minimum expected K-L distance”
Memories 355—356
Message 59
Message format for mixtures 279—280
Message length formulae, MML 235
Message length, average 234
Message length, curved-prior, MML 236—237
Message length, Dowe's approximation to 209—213
Message length, large-D, MML 237
Message length, small-sample, MML 235—236
Message section 328
Metropolis algorithm 332—333
Micro-state 88
Ming Li 392
Minimal sufficient statistics 163
Minimum Description Length (MDL) criterion 321—325
Minimum Description Length (MDL) Principle 401 408—415
Minimum expected K-L distance (MEKL) 205—208
Minimum expected K-L distance (MEKL), for Neyman — Scott problem 206—208
Minimum Message Length (MML), approach 117—118
Minimum Message Length (MML), as descriptive theory 385—399
Minimum Message Length (MML), binomial example 246—248
Minimum Message Length (MML), coding scheme 222—228
Minimum Message Length (MML), curved-prior message length 236—237
Minimum Message Length (MML), details in specific cases 257—303
Minimum Message Length (MML), efficiency of 230—231
Minimum Message Length (MML), extension to Neyman — Scott problem 252—253
Minimum Message Length (MML), induction of the past by 358—361
Minimum Message Length (MML), large-D message length 237
Minimum Message Length (MML), likelihood principle and 254—255
Minimum Message Length (MML), limitations of 249—250
Minimum Message Length (MML), message length formulae 235
Minimum Message Length (MML), model invariance 229—230
Minimum Message Length (MML), multi-parameter properties of 234—235
Minimum Message Length (MML), multiple latent factors 300—303
Minimum Message Length (MML), multiple parameters in 232—233
Minimum Message Length (MML), negative binomial distribution 253
Minimum Message Length (MML), normal distribution 250—253
Minimum Message Length (MML), practice of induction 391—399
Minimum Message Length (MML), precision of estimate spacing 238—240
Minimum Message Length (MML), properties of estimator 228—240
Minimum Message Length (MML), quadratic approximations to SMML 221—255
Minimum Message Length (MML), quadratic, assumptions of 226—227
Minimum Message Length (MML), related work and 401—415
Minimum Message Length (MML), singularities in priors 237
Minimum Message Length (MML), small-sample message length 235—236
Minimum Message Length (MML), standard formulae 235
Minimum Variance Unbiased estimators 31—32
Minimum-cost estimation 189
Mixture models 275—297
Mixtures, Fisher Information for 290—291
Mixtures, message format for 279—280
ML (Maximum Likelihood) method 30
ML* estimator 299—300
MML see “Minimum Message Length approach”
Model 28
Model classes 408
Model density 177
Model family 28
Model invariance, MML 229—230
Model representation invariance 188—189
Multi-word messages, coding 72—73
Multinomial distributions 247
Multinomial distributions, information content of 81—87
Multinomial distributions, irregularities in 248
Multivariate mean estimator, summary of 183
Multivariate Normal distribution, conjugate priors for 261—264
Multivariate Normal, mean of 177—183
NAT 78
Natural languages, efficiencies of 389—390
Natural languages, hypotheses of 388—390
Natural languages, inefficiencies of 390
Neal, R. 73
Negative binomial distribution 253
Nested union 192
Neyman — Pearson testing 413
Neyman — Scott problem 200—204 218
Neyman — Scott problem, ideal group estimator for 201—202
Neyman — Scott problem, maximum likelihood for 203—204
Neyman — Scott problem, minimum expected K-L distance for 206—208
Neyman — Scott problem, MML extension to 252—253
Neyman — Scott problem, other estimators for 202—203
Ng, A.Y. 321—325
Nit 78
NML (normalized maximum likelihood) 410—415
Nodes 61
Nodes, children 64
Nodes, dummy 318—319
Noise 190
Noise level 273
Non-adiabatic experiment 382—384
Non-Bayesian estimation 30—32
Non-Bayesian inference 28—35
Non-Bayesian model selection 32—35
Non-binary codes 77—78
Non-deterministic laws 344
Non-deterministic laws, deduction with 348—350
Non-random strings 109
Normal density 28
Normal distribution function 29
Normal distribution, conjugate priors for 258—264
Normal distribution, multivariate, conjugate priors for 261—264
Normal distribution, with coarse data 265—266
Normal distribution, with perturbed data 264—265
Normal mean, with Normal prior, estimation of 173—177
Normal prior, estimation of Normal mean with 173—177
Normal, multivariate, mean of 177—183
Normalization 407—408
Normalized maximum likelihood (NML) 410—415
Normalized probability measure 403
Notation, confusion of 144—145
Noun 389
Nuisance parameters 285
Null hypothesis 19 32—33
Observations 8
Occam's razor 1
Oliver, J.J. 268 319
Optimal codes 63—66
Optimal codes, construction of 69—72
Optimal codes, for integers 93—96
Optimal codes, properties of 76—77
Optimum quantizing lattice 178 285
Order 337
Organisms 7—8
Output tape, Turing machines 101
Parameter values 150
Parameters, discrete, imprecise assertion of 284—286
Partial order equivalence 330
Particle-state probability distribution 342
Partitions of hypothesis space 213—215
PAST 368—369
Past disorder, deduction of 345—355
Past, induction of the see “Induction of the past”
Past, of a computer process 375
Past, regained 369—370
Past, simulation of the 381—382
Patrick, J.D. 275 314—315
Pattern 190
Perturbed data, normal distribution with 264—265
PFSM see “Probabilistic finite-state machine”
Philosophy of science 1—2
Pipe model 356
Poisson distribution 269—270
Popper, K. 10
Population 150
Possible data, set X of 144—145
Possible inferences, set of 147—148
Posterior 36
Posterior density 37
Posterior distribution 36
Posterior probability 36
Precision approach 216
Precision of estimate spacing 238—240
Predictive distribution 206
Prefix code 61
Prefix property 60
PRESENT 368—369
Present views, inexplicable 362—363
Previous likelihood 46
Price, H. 338—339
Primitive inductive inference 387—388
Primitive universal Turing machines (UTM0) 135—140
Prior density 37
Prior distribution 35
Prior estimate probability 270
Prior premises 17
Prior probability 35
Prior probability density 37 150—151
Prior probability density, coding probability and 222
Priors 35
Priors, alternative 350—353
Priors, conjugate see “Conjugate priors”
Priors, evolution of 135—141
Priors, invariant conjugate 52—53
Priors, maximum entropy 51—52
Priors, Normal, Normal mean with, estimation of 173—177
Priors, origins of 45—54 133—135
Priors, singularities in 237
Priors, theory description codes as 114—115
Priors, Uniform 182
Priors, uninformative 49—51
Priors, Universal Turing machines as 124—130
Probabilistic finite-state machine (PFSM) 307—314
Probabilistic finite-state machine (PFSM), assertion code for 308—309
Probabilistic model of data 146
Probabilistic regular grammar 308
Probability 21—22
Probability density function 25
Probability distributions, defined 146
Probability distributions, particle-state 342
Probability distributions, Turing 103—104
Probability line 74
Probability, coding 149—150
Probability, conditional 21
Punctuated binary codes 92—93
Quantization noise 181