Ash R.B. — Information theory

Title: Information theory

Author: Ash R.B.

Annotation:

Excellent introduction treats three major areas: analysis of channel models and proof of coding theorems; study of specific coding systems; and study of statistical properties of information sources. Appendix summarizes Hilbert space background and results from the theory of stochastic processes. Sixty problems, with solutions. Advanced undergraduate to graduate level. Bibliography.


Language: en

Category: Computer science/Information theory/

Subject index status: index with page numbers is ready

Year of publication: 1965

Number of pages: 339

Added to catalog: 10.11.2005
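For orientation, the "uncertainty" that appears throughout the annotation above and the subject index below is the entropy of a discrete random variable. The following is the standard definition, supplied here as a reader's aid rather than quoted from the book:

$$H(X) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i, \qquad p_i = P\{X = x_i\},$$

with the convention $0 \log 0 = 0$ and logarithms taken to base 2, so that $H$ is measured in bits.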

Subject index
Probability of error, for general binary codes      113 ff.
Probability of error, maximum      66
Random coding      66 67 74 110
Random process      275 ff.
Random process, covariance function of      250 256 275
Random process, Gaussian      250 256 279
Random process, second order      250 275
Random process, spectral density of      250 256 282
Random process, stationary      185
Random variable(s), conditionally independent      25
Random variable(s), Gaussian      231
Random variable(s), Gaussian, uncertainty of      240
Random variable(s), independent      see “Independent random variables”
Random variable(s), independent, noiseless coding problem for      27
Random variable(s), independent, uncertainty of      5 ff.
Random vectors      19 25 39 240 243
Rate of transmission      1 3 63
Rate of transmission, $\lambda$-permissible      224
Rate of transmission, critical, for the binary symmetric channel      117
Rate of transmission, permissible      223 234 251
Sampling theorem      258
Schwarz inequality      78 255 262
Sequence, input      65
Sequence, output      65
Sequence, typical      14 24 83 196
Sequence, typical in Shannon’s original proof of the fundamental theorem      66
Sequence, “meaningful”, produced by an information source      195 196 206
Sequential circuit, linear      163
Shannon — McMillan theorem      see “Asymptotic equipartition property”
Shift register      see “Feedback shift register”
Source of information      1 63 169 184
Source of information, alphabet of      172 185
Source of information, approximation of, by a source of finite order      189 ff.
Source of information, asymptotic equipartition property      197 223
Source of information, ergodic      197 202 207 208 223
Source of information, Markov      172 185
Source of information, Markov, indecomposable      185
Source of information, Markov, regular      185 202 223
Source of information, Markov, uncertainty of      186 219
Source of information, Markov, unifilar      187
Source of information, Markov, unifilar, connection matrix of      209
Source of information, Markov, unifilar, maximum uncertainty of      209
Source of information, Markov, unifilar, order of      189 ff.
Source of information, Markov, unifilar, uncertainty of      188
Source-channel matrix      215—217
Spectral density, of a random process      250 256 282
States, of a channel      46 215 230
States, of a Markov chain      171
Stationary distribution, of a finite Markov chain      174 181 184
Stationary Gaussian random process      250
Stationary sequence of random variables      1
Steady state probabilities of a finite Markov chain      174 176
Steady state probabilities of a finite Markov chain, effective determination of existence of      180 208
Stirling’s formula      113
Storage requirements, of a decoder      92 161
Strong converse to the fundamental theorem      83 223 224
Strong converse to the fundamental theorem, failure of      225
Strong converse to the fundamental theorem, for the binary symmetric channel      124
Strong converse to the fundamental theorem, for the time-discrete Gaussian channel      246
Syndrome ( = corrector)      94
Uncertainty      4 8
Uncertainty, average      5
Uncertainty, axioms for      5 ff. 24 26
Uncertainty, axioms for, grouping axiom      8 80 81
Uncertainty, axioms for, grouping axiom, generated      26
Uncertainty, conditional      19 219 229 238 241
Uncertainty, convexity of      54 81
Uncertainty, input and output      50
Uncertainty, interpretations of      12 ff.
Uncertainty, joint      18—21 238—240
Uncertainty, maximization of      17
Uncertainty, maximization of, for a unifilar Markov source      209
Uncertainty, n-gram, unigram, digram, and trigram      191
Uncertainty, of a discrete random variable      5 ff.
Uncertainty, of a discrete random variable given an absolutely continuous random variable, and vice versa      241
Uncertainty, of a function of a random variable      26
Uncertainty, of a Gaussian random variable      240
Uncertainty, of a language      206
Uncertainty, of a unifilar Markov source      188
Uncertainty, of an absolutely continuous random variable      236
Uncertainty, of an information source      186 219
Uncertainty, properties of      16 ff.
Unifilar Markov source      see “Source”
Uniform error bound      62 66
Varsharmov — Gilbert — Sacks condition      108 122 130 131 163 225
Vector space      126 147
Weak converse to the fundamental theorem, for the discrete memoryless channel      82 307
Weak converse to the fundamental theorem, for the finite state regular channel      223
Weak converse to the fundamental theorem, for the time-continuous Gaussian channel      252
Weak converse to the fundamental theorem, for the time-discrete Gaussian channel      234 245
Weak law of large numbers      3 130
Weak law of large numbers, exponential convergence in      83
Weak law of large numbers, for regular Markov chains      203
Weight, of a binary sequence      102
“Yes or no” questions      13 14
“Yes or no” questions, relation to instantaneous binary codes      40