Csiszar I., Körner J. — Information Theory: Coding Theorems for Discrete Memoryless Systems
Title: Information Theory: Coding Theorems for Discrete Memoryless Systems

Authors: Csiszar I., Körner J.

Language: en

Category: Computer science/

Subject index status: index with page numbers is ready


Year published: 1982

Number of pages: 452

Added to the catalog: 03.04.2008

Subject index
Graph, product, coverings      118 160
Graph, product, independence number and zero-error capacity      118
Gray, R.M.      158 352 383 401 416 422 424
Group code      see “Linear code (for binary channels)”
Haemers, W.      118 422
Hajnal, A.      422
Hamming boundary      86 89 96
Hamming distance      54
Hamming distance and conditional entropy      54
Hamming distance and mutual information      117 179 180
Hamming neighborhood      86
Hamming neighborhood, probability bound on      92
Hamming space, isoperimetric problem in      95
Hamming sphere      95
Hamming, R.W.      54 86 96 422
Han, Te-Sun      269 294 422
Hardy, G.H.      58 422
Haroutunian, E.A.      199 203 268 422
Harper, L.H.      95 422
Hartley, R.V.L.      28 422
Hartley’s information measure      28
Helper      251
Helper in arbitrary source network      266
Helpers theorem      255
High probability sets, minimum cardinality of      16
High probability sets, minimum mass of      17 24
Hoeffding, W.      43 44 422
Horibe, Y.      57 422
Horstein, M.      201 422
Hu, Guo-Ding      60 423
Huffman code      73
Huffman code, word length and probability in      75
Huffman, D.A.      73 423
Hypothesis testing      19 22 23 25 43
Image (of a set over a channel)      101 (see also “Image size”)
Image (of a set over a channel) and generated sequences      351
Image (of a set over a channel), quasi      350
Image size ($\eta$-image size)      101
Image size ($\eta$-image size) and $\varepsilon$-capacity of a set      106
Image size ($\eta$-image size) and information quantities      305
Image size ($\eta$-image size) problem      see “Image size problem”
Image size ($\eta$-image size) theorem      see “Image size theorem”
Image size ($\eta$-image size), asymptotic independence of $\eta$      106
Image size ($\eta$-image size), binary channels      347 348
Image size problem      304 ff
Image size problem, relation to entropy characterization problem      325 342 346
Image size problem, three channels      357
Image size problem, unrestricted      345
Image size theorem      329
Image size theorem, converse part      326
Image size theorem, direct part      328
Image size theorem, projections in      328
Image size theorem, projections in, degraded case      323 243
Images (of a set over a channel), quasi, mutual, are large      350
Inaccuracy      32
Indecomposable joint distribution      350
Independence number $\alpha(G)$ of a graph      118
Independence number $\alpha(G)$ of a graph and zero-error capacity      118
Independent sources, transmission of      286 292
Infinite code      80
Information      1
Information content of a RV      7 17
Information for discrimination      see “Informational divergence”
Information gain      see “Informational divergence”
Information geometry      see “Divergence geometry”
Information measures      1 6 7 22 47 (see also “Mutual information” “Informational divergence” “Common information”)
Information measures and additive set functions      51 52
Information measures, additivity      49
Information measures, axiomatic and pragmatic approaches      22
Information measures, convexity      50
Information measures, Fisher’s      27
Information measures, Hartley’s      28
Information measures, individual sequences      48
Information measures, intuitive concept      6—7
Information measures, Kullback’s      see “Informational divergence”
Information measures, Renyi’s      see “Entropy”
Information measures, Shannon’s      28 47
Information metric      see “Entropy metric”
Information radius      147
Information source      see “Source”
Information storage      1 6
Information transmission theorem      see “Source-channel transmission theorem”
Information, amount of      6 17 20 22
Information, common      402—405
Information, mutual      see “Mutual information”
Information, provided by an event      20
Information, provided by an event, and codeword length      75
Informational divergence      20
Informational divergence and variational distance      58
Informational divergence, conditional      31
Informational divergence, convexity      50
Informational divergence, decrease of, in indirect observation      see “Data processing lemma”
Informational divergence, geometry generated by      see “Divergence geometry”
Input alphabet      4 100 270
Input constraint $(c,\Gamma)$      108 129
Input constraint $(c,\Gamma)$, average      112 182
Input constraint $(c,\Gamma)$, capacity under      108
Input constraint $(c,\Gamma)$, reliability function under      182 192
Input constraints $(c,\Gamma)$, several      117
Input of a network      246
Input set      99
Input vertex      246
Instantaneous code      72
Interference channel      296
Intermediate vertex      246
Interval graph      118
Intuitive background      1 ff
Intuitive background, measuring information      6
Intuitive background, multi-terminal systems      8
Isomorphic sources      80
Isomorphy problem in ergodic theory      80
Isoperimetric problem      86 95
Jeffreys, H.      46 423
Jelinek, F.      46 78 136 160 194 423
Jerohin, V.D.      133 423
Joint type      30
Juxtaposition of codes      242
Juxtaposition of codes, MA codes      272
Karamata, J.      25 423
Karlin, S.      214 219 226 423
Karmazin, M.A.      116 423
Karush, J.      74 423
Katona, G.O.H.      75 79 83 85 95 420 423
Kemperman, J.H.B.      58 185 423
Kesten, H.      185
Kiefer, J.      222 423
Kobayashi, K.      269 422
Körner, J.      96 119 122 160 181—185 203 233 264 265 268 269 347 349 351 352 357 358 378—383 395—405 413 416 418 420 421 423 424
Kolmogorov — Sinai theorem      80
Kolmogorov, A.N.      80 423
Komlos, J.      VIII 82 420
Korn, I.      187 423
Koselev, V.N.      264 423
Kraft inequality      72 (standard form recalled in the note after this index)
Kraft inequality, generalized      73
Kraft, L.G.      72 73 424
Krause, R.M.      74 85 424
Kricevskii, R.E.      46 84 424
Kullback — Leibler information number      see “Informational divergence”
Kullback, S.      27 28 58 424
Large deviation probabilities for empirical distributions      43
Leibler, R.A.      27 28 424
Less noisy      349 408
Leung Yan Cheong, S.K.      299 419
Liao, H.J.      302 424
Limit of minimum transmission ratio (LMTR)      5 22 132
Limit of minimum transmission ratio (LMTR) and information measures      6—7
Limit of minimum transmission ratio (LMTR), AVS-AVC      225
Limit of minimum transmission ratio (LMTR), DMS-DMC      see “Source-channel transmission theorem”
Limit of minimum transmission ratio (LMTR), multiple-access channel      270 284 286
Limit of minimum transmission ratio (LMTR), source-channel network      282 283 292
Limit of minimum transmission ratio (LMTR), wire-tap channel      412
Linear code for channels      114 198
Linear code for sources      24
Linear code for sources, binary adder source network      399
Linear code, shifted      114
List code      196
List code capacity of AVC      230
List codes, error exponent for      196
List codes, zero error capacity for      196
Littlewood, J.E.      58 422
Ljubic, Ju.      82 424
Log-sum inequality      48 (standard form recalled in the note after this index)
Longo, G.      VIII 41 46 188 420 424
Lovasz, L.      118 119 263 424
Lukacs, E.      IX
Lynch, T.J.      85 424
m-capacity region      see “Capacity region”
m-capacity, AVC      205 (see also “a-capacity and m-capacity”)
m-capacity, AVC, and zero-error capacity of a DMC      223
m-capacity, AVC, coding theorem, binary output      208
m-capacity, AVC, positivity      222
MA capacity      284
MA code (multiple-access code)      270
MA code (multiple-access code), block      271
MA code (multiple-access code), block, with stochastic encoders      284
Mac Leod, J.B.      26 419
Marcus, R.S.      74 85 425
Margulis, G.A.      96 424
Markov chain      11 54
Markov chain, double Markovity      402
Marton, K.      VIII 46 122 158 185 203 264 266 349 358 378—381 391—400 416 420 423 424
Massey, J.L.      41 424
Max-closure      344
Maximal code lemma      101
Maximal code lemma for compound channels      183
Maximal code lemma for two channels      316
Maximal code lemma, converse      104
Maximum likelihood decoder      111
Maximum likelihood decoder and minimum distance decoder      114 207
Maximum mutual information (MMI) decoder      117 164
Maximum mutual information (MMI) decoder, modified      176
Maximum probability of error      99 172
Maximum probability of error at output c, channel network      281
Maximum probability of error, capacity for      see “m-capacity”
Maximum probability of error, capacity region for      see “m-capacity region”
Maximum probability of error, family of channels      172
McEliece, R.J.      160 195 196 424 425
McMillan, B.      74 85 424
Mealy automaton      82
Mealy, G.H.      82
Message      2 3 99
Message of length k      3
Message set      99 270 279
Message vector      280
Message, random      3
Messages, addressing of      278
Meulen      see “Van der Meulen”
Minimum distance decoder      114
Minimum distance decoder, standard      206
Minimum distance in a codeword set      180 195
Minimum entropy decoder      265
Moore automaton      82
Moore, E.F.      77 82 422
More capable      116 349
More uniform distribution      25 58
Multi-terminal coding theorems      see “Source network coding theorems” “Channel network coding theorems”
Multi-terminal systems      8 (see also “Source network” “Channel network” “Source-channel network”)
Multiple source      see “Discrete memoryless multiple source”
Multiple-access channel (MAC)      270
Multiple-access channel (MAC) with feedback      298
Multiple-access channel (MAC) with s senders and r receivers      292
Multiple-access channel (MAC), capacity region (a-capacity region)      271
Multiple-access channel (MAC), capacity region (a-capacity region), alternative definition      271 277
Multiple-access channel (MAC), capacity region (a-capacity region), m-, differs      284
Multiple-access channel (MAC), coding theorem      275
Multiple-access channel (MAC), coding theorem, alternative form      278
Multiple-access channel (MAC), compound      288 293
Multiple-access channel (MAC), generalized      see “Channel network with one output vertex”
Multiple-access channel (MAC), LMTR for      270 283 286
Multiple-access channel (MAC), stochastic encoders      284
Multiple-access channel (MAC), two-input two-output      288
Multiple-access code      see “MA code”
Muroga, S.      148 424
Mutual information      21
Mutual information and common information      405
Mutual information of individual sequences      48
Mutual information of several RV’s      57
Mutual information, conditional      48
Mutual information, convexity      52
Mutual information, maximum, decoder      117 164
Mutual quasi-images      350
n-length block code      see “Block code”
Nemetz, T.O.H.      76 423 424
Network      246 (see also “Source network” “Channel network” “Source-channel network”)
Network, code associated with a      246
Neuhoff, D.L.      158 424
Neyman — Pearson lemma      23
Neyman, J.      23
Ng, C.T.      27 417 424
Noiseless channel      4 6 67
Noiseless channel, capacity per unit cost      67
Noiseless channel, capacity per unit cost, definition by source-channel transmission      67
Noiseless channel, capacity per unit cost, direct definitions, alternative      67 69
Noiseless channel, capacity per unit cost, refined asymptotics      81
Noiseless channel, finite state      82
Noiseless channel, general      81 83
Noiseless coding theorem      see “Average cost theorem” “Average …” “Noiseless …”
Noisy Channel Coding Theorem      104 (see also “Discrete memoryless channel”)
Non-block code      see “Variable length code”
Non-finite distortion measure      134 148
Non-negativity of information measures      49
Non-terminating transmission      5
Non-terminating transmission, reliable      6 132
Normal channel network      300
Normal channel network, reduction to      300—302
Normal source network (NSN)      251
Normal source network (NSN) with three inputs and one helper      393—397
Normal source network (NSN) with two helpers      398 399
Normal source network (NSN) without helpers      251
Normal source network (NSN) without helpers, coding theorem      253—254
Normal source network (NSN) without helpers, error exponent      267—269
Normal source network (NSN) without helpers, universal coding      267
Normal source network (NSN), product space characterization of achievable rate region      255
Normal source network (NSN), reduction to      249—250
Omura, J.K.      159 184 194 195 424 425
One side information source      381
Optimal code, fixed-to-variable length      see “Huffman code”
Optimal code, variable-to-fixed length      78
Optimal points      242
Optimal transmission without coding      135
Optimistic and pessimistic point of view      111
Optimistic and pessimistic point of view, channel networks      301
Optimistic and pessimistic point of view, channels      110
Optimistic and pessimistic point of view, source networks      258
Optimistic and pessimistic point of view, sources      128
Ornstein, D.S.      80 425
Output alphabet      4 100 270
Output constraint      117
Output of a network      246
Output set      99
Output vertex      246
Packing lemma      162
Parallel channels      see “Product of channels”
Parity-check code      see “Linear code for binary channels”
Partial ordering of channels      see “Comparison of channels”
Partial ordering of distributions      see “More uniform distribution”
Partial side information      367 (see also “Source coding with side information”)
Path      346
Patterson, G.W.      74 425
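
Note. For quick reference, here are two of the inequalities indexed above, stated in their standard textbook form; these statements are not quoted from Csiszár and Körner's text, so the page numbers above remain the authoritative source.

Log-sum inequality (p. 48): for non-negative numbers $a_1, \dots, a_n$ and $b_1, \dots, b_n$,
$$\sum_{i=1}^{n} a_i \log \frac{a_i}{b_i} \;\ge\; \Bigl(\sum_{i=1}^{n} a_i\Bigr) \log \frac{\sum_{i=1}^{n} a_i}{\sum_{i=1}^{n} b_i},$$
with equality if and only if $a_i/b_i$ is constant in $i$ (terms with $a_i = 0$ count as $0$).

Kraft inequality (p. 72): codeword lengths $l_1, \dots, l_m$ are realizable by a $D$-ary prefix (instantaneous) code if and only if
$$\sum_{i=1}^{m} D^{-l_i} \;\le\; 1.$$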