ShopSpell

Entropy and Information Theory [Paperback]

$154.99     $199.99   23% Off     (Free Shipping)
100 available
  • Category: Books (Computers)
  • Author:  Gray, Robert M.
  • ISBN-10:  1489981322
  • ISBN-13:  9781489981325
  • Publisher:  Springer
  • Binding:  Paperback
  • Pub Date:  01-Mar-2014
  • SKU:  1489981322-11-SPRI
  • Item ID: 100770694
  • List Price: $199.99
  • Seller: ShopSpell
  • Ships in: 5 business days
  • Transit time: Up to 5 business days
  • Delivery by: Dec 01 to Dec 03
  • Notes: Brand New Book. Order Now.

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and the properties of information and distortion measures.
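As an illustrative aside (not part of the listing itself): Shannon's source coding theorem says a memoryless source can be compressed to roughly H(X) bits per symbol. A minimal Python sketch of that entropy computation:

```python
import math

def shannon_entropy(p):
    """Entropy in bits of a finite distribution p (a list of probabilities)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin carries exactly 1 bit per symbol:
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin carries less, so its output is more compressible:
print(shannon_entropy([0.9, 0.1]))
```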

New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes
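To make the "stationary or sliding-block code" idea above concrete, here is a small hedged sketch (the function names are illustrative, not from the book): the encoder applies one fixed window map at every time index, so the code commutes with shifts; applied to an i.i.d. (memoryless) source, the output is a B-process.

```python
import random

def sliding_block_code(x, f, memory=1):
    """Apply a fixed window map f to every length-(2*memory + 1) window of x.
    Using the same map at every time index is what makes the code stationary."""
    w = 2 * memory + 1
    return [f(x[i:i + w]) for i in range(len(x) - w + 1)]

def majority(window):
    """Example window map: majority vote over the window."""
    return int(sum(window) > len(window) // 2)

random.seed(0)
source = [random.randint(0, 1) for _ in range(20)]  # i.i.d. (memoryless) bits
print(sliding_block_code(source, majority))         # a sample from a B-process
```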

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotic mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
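One building block behind the process distance measures mentioned above (for instance the d-bar distance between processes) is an average per-letter distortion between sequences. A minimal sketch of the Hamming case, offered only as orientation:

```python
def hamming_distortion(x, y):
    """Average per-letter Hamming distortion between equal-length sequences."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y)) / len(x)

# One mismatch out of four letters:
print(hamming_distortion([0, 1, 1, 0], [0, 1, 0, 0]))  # 0.25
```

The d-bar distance itself minimizes this average distortion over all couplings of the two processes, which is well beyond a one-liner; this shows only the per-letter cost it is built on.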

This fully updated new edition of the classic work on information theory presents a detailed analysis of Shannon source and channel coding theorems before moving on to address sources, channels, codes, and the properties of information and distortion measures.

Contents:

  • Preface
  • Introduction
  • Information Sources
  • Pair Processes: Channels, Codes, and Couplings
  • Entropy
  • The Entropy Ergodic Theorem
  • Distortion and Approximation
  • Distortion and Entropy
  • Relative Entropy
  • Information Rates
  • Distortion vs. Rate
  • Relative Entropy Rates
  • Ergodic Theorems for Densities
