

Description
With a large number of examples, illustrations, and original problems, this book contains a thorough discussion of the classical topics in information theory together with the first comprehensive treatment of network coding.
This book is an evolution from my book A First Course in Information Theory, published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its root in information theory, network coding has not only brought about a paradigm shift in network communications at large, but has also had significant influence on such specific research fields as coding theory, networking, switching, wireless communications, distributed data storage, cryptography, and optimization theory. While new applications of network coding keep emerging, the fundamental results that lay the foundation of the subject are more or less mature. One of the main goals of this book therefore is to present these results in a unifying and coherent manner. While the previous book focused only on information theory for discrete random variables, the current book contains two new chapters on information theory for continuous random variables, namely the chapter on differential entropy and the chapter on continuous-valued channels. With these topics included, the book becomes more comprehensive and is more suitable to be used as a textbook for a course in an electrical engineering department.
A solution manual is available upon request by course instructors. An indispensable reference for researchers in the field. A modern treatment of information theory combined with the new topic of network coding. Includes supplementary material: sn.pub/extras
About the Authors
Shenghao Yang was born in China on March 19, 1978. He received his B.S. degree from Nankai University in 2001, an M.S. degree from Peking University in 2004, and a Ph.D. degree in Information Engineering from The Chinese University of Hong Kong in 2008. He was a visiting student at the Department of Informatics, University of Bergen, Norway, in Spring 2017. He was a Postdoctoral Fellow at the University of Waterloo from 2008 to 2009 and at the Institute of Network Coding, The Chinese University of Hong Kong, from 2010 to 2012. He was with Tsinghua University from 2012 to 2015 as an Assistant Professor. He is currently a Research Assistant Professor at The Chinese University of Hong Kong, Shenzhen. His research interests include network coding, information theory, coding theory, network computation, big data processing, and quantum information. He has published more than 40 papers in international journals and conferences. He is a co-inventor of the BATS code and holds two granted U.S. patents.

Raymond W. Yeung was born in Hong Kong on June 3, 1962. He received B.S., M.Eng., and Ph.D. degrees in electrical engineering from Cornell University, Ithaca, NY, in 1984, 1985, and 1988, respectively. He was on leave at Ecole Nationale Superieure des Telecommunications, Paris, France, during Fall 1986. He was a Member of Technical Staff at AT&T Bell Laboratories from 1988 to 1991. Since 1991, he has been with the Department of Information Engineering at The Chinese University of Hong Kong, where he is now Choh-Ming Li Professor of Information Engineering. Since January 2010, he has served as Co-Director of the Institute of Network Coding at The Chinese University of Hong Kong. He was a Consultant at the Jet Propulsion Laboratory in Pasadena, CA, on a project to salvage the malfunctioning Galileo spacecraft, and a Consultant for NEC, USA.
He is the author of the textbooks A First Course in Information Theory (Kluwer Academic/Plenum 2002) and its revision Information Theory and Network Coding (Springer 2008), which have been adopted by over 100 institutions around the world. In Spring 2014, he gave the first MOOC in the world on information theory, which reached over 25,000 students. His research interests include information theory and network coding. Dr. Yeung was a member of the Board of Governors of the IEEE Information Theory Society from 1999 to 2001. He has served on the committees of a number of information theory symposia and workshops. He was General Chair of the First and the Fourth Workshop on Network Coding, Theory, and Applications (NetCod 2005 and 2008), a Technical Co-Chair for the 2006 IEEE International Symposium on Information Theory, and a Technical Co-Chair for the 2006 IEEE Information Theory Workshop, Chengdu, China. He currently serves as an Editor-at-Large of Communications in Information and Systems, an Editor of Foundations and Trends in Communications and Information Theory and of Foundations and Trends in Networking, and was an Associate Editor for Shannon Theory of the IEEE Transactions on Information Theory from 2003 to 2005. He was a recipient of the Croucher Foundation Senior Research Fellowship for 2000/2001, the Best Paper Award (Communication Theory) of the 2004 International Conference on Communications, Circuits and Systems, the 2005 IEEE Information Theory Society Paper Award, the Friedrich Wilhelm Bessel Research Award of the Alexander von Humboldt Foundation in 2007, and the 2016 IEEE Eric E. Sumner Award (for "pioneering contributions to the field of network coding"). In 2015, he was named an Outstanding Overseas Chinese Information Theorist by the China Information Theory Society. He is a Fellow of the IEEE, the Hong Kong Academy of Engineering Sciences, and the Hong Kong Institution of Engineers.
Back Cover
Information Theory and Network Coding consists of two parts: Components of Information Theory, and Fundamentals of Network Coding Theory. Part I is a rigorous treatment of information theory for discrete and continuous systems. In addition to the classical topics, there are such modern topics as the I-Measure, Shannon-type and non-Shannon-type information inequalities, and a fundamental relation between entropy and group theory. With information theory as the foundation, Part II is a comprehensive treatment of network coding theory with detailed discussions on linear network codes, convolutional network codes, and multi-source network coding.
Other important features include:
Derivations from first principles
A large number of examples throughout the book
Many original exercise problems
Easy-to-use chapter summaries
Two parts that can be used separately or together for a comprehensive course
Information Theory and Network Coding is for senior undergraduate and graduate students in electrical engineering, computer science, and applied mathematics. This work can also be used as a reference for professional engineers in the area of communications.
Contents
The Science of Information.- Information Measures.- Zero-Error Data Compression.- Weak Typicality.- Strong Typicality.- Discrete Memoryless Channels.- Rate-Distortion Theory.- The Blahut-Arimoto Algorithms.- Differential Entropy.- Continuous-Valued Channels.- Markov Structures.- Information Inequalities.- Shannon-Type Inequalities.- Beyond Shannon-Type Inequalities.- Entropy and Groups.- Fundamentals of Network Coding.- The Max-Flow Bound.- Single-Source Linear Network Coding: Acyclic Networks.- Single-Source Linear Network Coding: Cyclic Networks.- Multi-source Network Coding.