Title: Information Theory, Inference & Learning Algorithms
Author: MacKay D.
Abstract:
This book is aimed at senior undergraduates and graduate students in Engineering,
Science, Mathematics, and Computing. It expects familiarity with
calculus, probability theory, and linear algebra as taught in a first- or second-year
undergraduate course on mathematics for scientists and engineers.
Conventional courses on information theory cover not only the beautiful
theoretical ideas of Shannon, but also practical solutions to communication
problems. This book goes further, bringing in Bayesian data modelling,
Monte Carlo methods, variational methods, clustering algorithms, and neural
networks.
Why unify information theory and machine learning? Because they are
two sides of the same coin. In the 1960s, a single field, cybernetics, was
populated by information theorists, computer scientists, and neuroscientists,
all studying common problems. Information theory and machine learning still
belong together. Brains are the ultimate compression and communication
systems. And the state-of-the-art algorithms for both data compression and
error-correcting codes use the same tools as machine learning.