Larsen R.J., Marx M.L. — An Introduction to Mathematical Statistics and Its Applications (4th Edition)
Title: An Introduction to Mathematical Statistics and Its Applications (4th Edition)
Authors: Larsen R.J., Marx M.L.
Annotation: I am surprised by the number of negative reviews for what I consider to be a nicely written, well-thought-out, and logically presented introductory course on mathematical statistics. Yes, a working knowledge of elementary calculus is a prerequisite. But the mathematics invoked in the exposition of concepts and theorems is kept as simple as possible while maintaining the modest level of rigor appropriate for an introductory exposition. If you do not have the minimal mathematical prerequisites (such as freshman calculus), blame your instructor or your school for selecting an inappropriate text. But don't blame the authors! I thought the examples and problems were appropriate both in their level of difficulty (mostly not so hard) and in their relation to the material just covered. There are plenty of poorly written, impossibly dry, impenetrable texts on statistics out there; this is not one of them. In addition, the book is attractively packaged, the paper quality is excellent, and the visuals are informative and clearly presented; that also should not be taken for granted. Lastly, the authors have a wickedly entertaining sense of humor that spices the presentation throughout. I consider this book to be a welcome addition to the set of modern textbooks available to the curious, serious student of probability and statistics.
Language: English
Category: Mathematics
Subject index status: Complete index with page numbers
Edition: 4
Year of publication: 2005
Number of pages: 928
Added to catalog: 02.10.2015
Subject index
Alternative hypothesis 428 434—435
ANOVA table 739—740 763 111
Arc sine transformation 759—760
Asymptotically unbiased 388 406
Bayes theorem 62—63 79—81 410—411
Bayesian estimation 410—422
Behrens — Fisher problem 555 567
Benford's law 152—153 609—611
Bernoulli distribution 229 235 344—346 394—395
Bernoulli trials 229
Bertillon configuration 87
Best estimator 395
Beta distribution 413
Bills of mortality 8—10
Binomial coefficients 108 110—113
Binomial distribution, additive property 221—222
Binomial distribution, arc sine transformation 759—760
Binomial distribution, confidence interval for p 369—371 587—588
Binomial distribution, definition 131 155
Binomial distribution, estimate for p 344—346 380—381 394—395
Binomial distribution, hypothesis tests for p 440 443—445 578—580
Binomial distribution, in sign test 804
Binomial distribution, moment-generating function 258—259
Binomial distribution, moments 176 229 235 264—265
Binomial distribution, normal approximation 292—293 297—299 338—339
Binomial distribution, Poisson approximation 276—277
Binomial distribution, relationship to Bernoulli distribution 229 235
Binomial distribution, relationship to beta distribution 415
Binomial distribution, relationship to hypergeometric distribution 138—139
Binomial distribution, relationship to multinomial distribution 602—603
Binomial distribution, sample size determination 373—374
Birthday problem 117—119
Bivariate distribution see "Joint probability density function"
Bivariate normal distribution 719—723
Blocks 524 773—774 788—789 792—793 796—800 832—833
Categorical data 535—537 627—637
Central limit theorem 292—294 302—307
Chebyshev's inequality 408—409
Chi square distribution, additive property 330
Chi square distribution, definition 474
Chi square distribution, formula for approximating percentiles 506—507
Chi square distribution, moment-generating function 332
Chi square distribution, moments 330
Chi square distribution, noncentral 767—769
Chi square distribution, relationship to F distribution 475
Chi square distribution, relationship to gamma distribution 474
Chi square distribution, relationship to normal distribution 474
Chi square distribution, relationship to Student t distribution 476
Chi square distribution, table 500—501 856—857
Chi square test for goodness-of-fit 599 606—607 616 642—644
Chi square test for independence 631
Chi square test for the variance 504 516—519
Chi square test in nonparametric analyses 827 832
Combinations 107
Complement 30
Completely randomized one-factor design, comparison with randomized block design 780
Completely randomized one-factor design, comparison with Kruskal — Wallis test 841—846
Completely randomized one-factor design, computing formulas 742
Completely randomized one-factor design, error sum of squares 737—738
Completely randomized one-factor design, notation 733—734
Completely randomized one-factor design, relationship to two-sample data 745—746
Completely randomized one-factor design, test statistic 738—739 769
Completely randomized one-factor design, total sum of squares 737—738
Completely randomized one-factor design, treatment sum of squares 735—738 754 766—767
Conditional expectation 677—679
Conditional probability, definition 43—44 250
Conditional probability in bivariate distribution 249—256
Conditional probability in higher-order interactions 53—54
Conditional probability in partitioned sample spaces 56—57 62—63 410—411
Conditional probability in regression 677—679
Confidence band 695
Confidence coefficient 368—369
Confidence interval see also "Prediction interval"
Confidence interval, definition 363—364
Confidence interval, for conditional mean in linear model 694
Confidence interval, for difference of two means 582
Confidence interval, for difference of two proportions 587
Confidence interval, for mean of normal distribution 364—369 481—482
Confidence interval, for p in binomial distribution 369—371
Confidence interval, for quotient of two variances 585
Confidence interval, for regression coefficients 688 690—691
Confidence interval, for variance of normal distribution 501
Confidence interval, interpretation 365—367 423—424
Confidence interval, relationship to hypothesis testing 585
Consistent estimator 406—409
Consumer's risk 459
Contingency table 536 628 635
Continuity correction 296—297
Contrast 751—756 782—784
Correlation coefficient, applied to linear relationships 707
Correlation coefficient, definition 707
Correlation coefficient, estimate 708—709
Correlation coefficient, in bivariate normal distribution 719—723
Correlation coefficient, interpretation 710 719
Correlation coefficient, relationship to covariance 707
Correlation coefficient, relationship to independence 704
Covariance 702—705 707
Cramer — Rao lower bound 394—397 404—405
Critical region 433
Critical value 433
Cumulative distribution function (cdf), definition 159 170 213
Cumulative distribution function (cdf), in pdf of order statistics 244 247
Cumulative distribution function (cdf), relationship to pdf 170 214
Curve-fitting, examples 648—655 663—671
Curve-fitting, method of least squares 647—648
Curve-fitting, residual 650
Curve-fitting, residual plot 650—655
Curve-fitting, transformations to induce linearity 662—663 666 668—669 671
Data transformations 758—760
De Moivre — Laplace limit theorem 292—293 301—302
de Morgan's laws 35
Density function see "Probability density function (pdf)"
Density-scaled histogram 165—168 291 339—340 359—362
Dependent samples 524—525 773—774 792—793 796—800
Distribution-free statistics see "Nonparametric statistics"
Efficiency 388—393 396
Efficient estimator 395—396
Estimation see also "Confidence interval"
Estimation, Bayesian 410—422
Estimation, least squares 647—648
Estimation, maximum likelihood 344—354
Estimation, method of moments 357—362
Estimation, point versus interval 363—364
Estimator see also "Confidence interval"
Estimator, best 396
Estimator, consistent 408—409
Estimator, Cramer — Rao lower bound 394
Estimator, difference between estimate and estimator 346 349
Estimator, efficient 396
Estimator, for binomial p 344—346 380—381 394—395
Estimator, for bivariate normal parameters 721
Estimator, for contrast 752—753
Estimator, for correlation coefficient 708—709
Estimator, for exponential parameter 351 385—386
Estimator, for gamma parameters 359—362
Estimator, for geometric parameter 348—349
Estimator, for normal parameters 353—354 383—384
Estimator, for Poisson parameter 352—353 402 411—412 422
Estimator, for slope and y-intercept (linear model) 679—681
Estimator, for uniform parameter 382—383 390—391 403 407 424—426
Estimator, for variance in linear model 683—684
Estimator, interval 363—364
Estimator, sufficient 398 401—402
Estimator, unbiasedness 381—385
Event 24
Expected value, conditional 677—679
Expected value, definition 175 199—201
Expected value, examples 174—182 227—228
Expected value, in method of moments estimation 357—358
Expected value, of functions 186—187 226—227 232—233 320—321
Expected value, of linear combinations 229
Expected value, of loss functions 420
Expected value, of sums 229—232
Expected value, relationship to median 182—183
Expected value, relationship to moment-generating function 261
Experiment 24
Experimental design 523 527—528 538—540 733 773—774 780 792—793 796—800
Exponential distribution, examples 161—163 167—168 183 222—223 242—243 290—292 333—337 351
Exponential distribution, memoryless property 256
Exponential distribution, moment-generating function 259
Exponential distribution, moments 263
Exponential distribution, parameter estimation 351 385—386
Exponential distribution, relationship to Poisson distribution 289—290
Exponential distribution, threshold parameter 351
Exponential form 405
Exponential regression 662—666
F distribution, definition 475
F distribution, in analysis of variance 738—739 754—755 777
F distribution, in inferences about variance ratios 569 585
F distribution, relationship to chi square distribution 475
F distribution, relationship to Student t distribution 476—477
F distribution, table 476 857—871
Factorization theorem 403
Finite correction factor 375
Fisher's lemma 516
Friedman's test 832—833 848—849
Gamma distribution, additive property 330
Gamma distribution, definition 327 329
Gamma distribution, examples 328—329 331 359—362
Gamma distribution, moment-generating function 332
Gamma distribution, moments 330
Gamma distribution, parameter estimation 359—362
Gamma distribution, relationship to chi square distribution 474
Gamma distribution, relationship to exponential distribution 327
Gamma distribution, relationship to normal distribution 474
Gamma distribution, relationship to Poisson distribution 327 415
Generalized likelihood ratio 463
Generalized likelihood ratio test (GLRT), definition 464
Generalized likelihood ratio test (GLRT), examples 464—465 516—521 569 577 593—595 607—608 735
Geometric distribution, definition 317—318
Geometric distribution, examples 317—321
Geometric distribution, memoryless property 319—320
Geometric distribution, moment-generating function 258 318
Geometric distribution, moments 262 318
Geometric distribution, parameter estimation 348—349
Geometric distribution, relationship to negative binomial distribution 322
Geometric mean 385—386
Geometric probability 207—209
Hazard rate 173
Hypergeometric distribution, definition 139 155
Hypergeometric distribution, examples 141—146
Hypergeometric distribution, moments 177 375 706
Hypergeometric distribution, relationship to binomial distribution 138—139
Hypothesis testing, critical region 433
Hypothesis testing, decision rule 428—432 455—459
Hypothesis testing, level of significance 434
Hypothesis testing, P-value 437—438
Hypothesis testing, Type I and Type II errors 447—459 747
Independence, effect of, on the expected value of a product 233
Independence, mutual versus pairwise 75—76
Independence, of events 44 70—72 75—77 627—630
Independence, of random variables 216 218
Independence, of regression estimators 683 728—731
Independence, of repeated trials 78—83
Independence, of sample mean and sample variance (normal data) 474 514—516
Independence, of sums of squares 737
Independence, tests for 627—631
Independent samples 524—525 529—532 554 733 796—800 822 826
Intersection 28
Interval estimate see "Prediction interval"
Joint cumulative distribution function 213—215
Joint probability density function 203—207 215
k-sample data 531—532 733—734 826—827
Kruskal — Wallis test 826—830 841—848
Kurtosis 200—201
Law of small numbers 284—285
Level of significance 434 437 447—448 457—458 747
Likelihood function 347
Likelihood ratio see "Generalized likelihood ratio"
Linear model, assumptions 677—679
Linear model, confidence intervals for parameters 688 690—691
Linear model, hypothesis tests 685 690—691 697
Linear model, parameter estimation 678—679 683—684
Logarithmic regression 666—668
Logistic regression 668—669
Loss function 419—422
Margin of error 372 423—424
Marginal probability density function 205 211—212 417
Maximum likelihood estimation see also "Estimation"
Maximum likelihood estimation, definition 347
Maximum likelihood estimation, examples 344—346 348—354 679
Maximum likelihood estimation, in goodness-of-fit testing 615—616
Maximum likelihood estimation, in regression analysis 679 683—684
Maximum likelihood estimation, properties 404—405 409
Mean free path 181
Mean square 740
Median 182—183 388 804
Median unbiased 388
Method of least squares see "Estimation"
Method of moments see "Estimation"
Minimum variance estimator 395
MINITAB calculations for cdf 272—273 337—338
MINITAB calculations for completely randomized one-factor design 762—764
MINITAB calculations for confidence intervals 366 512—513 596—597
MINITAB calculations for critical values 512
MINITAB calculations for Friedman's test 848—849
MINITAB calculations for histograms 339—340
MINITAB calculations for independence 644—645 726—727
MINITAB calculations for Kruskal — Wallis test 847—848
MINITAB calculations for Monte Carlo analysis 333—339 366 385—386 424—426 432
MINITAB calculations for one-sample t test 513—514
MINITAB calculations for pdf 272 337 444
MINITAB calculations for randomized block design 800—801
MINITAB calculations for regression analysis 726—728
MINITAB calculations for robustness 495—498
MINITAB calculations for sample statistics 510—511
MINITAB calculations for Tukey confidence intervals 764—766
MINITAB calculations for two-sample t test 595—597
Model equation 529—537
Moment-generating function, as technique for finding distributions of sums 267
Moment-generating function, definition 257
Moment-generating function, examples 258—260 311
Moment-generating function, in proof of central limit theorem 341—342
Moment-generating function, properties 261 266
Moment-generating function, relationship to moments 261
Moments see "Variance"
Moore's Law 663—666
Multinomial coefficients 101
Multinomial distribution 600—603 607
Multiple comparisons 747—750
Multiplication rule 86
Mutually exclusive events 29 71
Negative binomial distribution 157 322—326
Noncentral chi square distribution 767
Noncentral F distribution 768—769
Noninformative prior 412
Nonparametric statistics 803
Normal distribution, additive property 312
Normal distribution, approximation to binomial distribution 292—293 297—299 338—339
Normal distribution, approximation to Poisson distribution 305—306
Normal distribution, approximation to sign test 804
Normal distribution, approximation to Wilcoxon signed rank statistic 817
Normal distribution, as limit for Student t distribution 470—472 478—479
Normal distribution, central limit theorem 302 341—342
Normal distribution, confidence interval for mean 364—369 482—485 582—585
Normal distribution, confidence interval for variance 501—504
Normal distribution, definition 308
Normal distribution, hypothesis test for mean (variance known) 435
Normal distribution, hypothesis test for mean (variance unknown) 490 493—498
Normal distribution, hypothesis test for variance 504