Here are 37 books that fans of An Introduction to Information Theory have personally recommended.
Book DNA is a community of 12,000+ authors and super readers sharing their favorite books with the world.
My primary interest is in brain function. Because the principal job of the brain is to process information, it is necessary to define exactly what information is. For that, there is no substitute for Claude Shannon’s theory of information. This theory is not only quite remarkable in its own right, but it is essential for telecoms, computers, machine learning (and understanding brain function).

I have written ten "tutorial introduction" books, on topics which vary from quantum mechanics to AI.

In a parallel universe, I am still an Associate Professor at the University of Sheffield, England.
Pierce was a contemporary of Claude Shannon (inventor of information theory), so he learned information theory shortly after it was published in 1949. Pierce writes in an informal style, but does not flinch from presenting the fundamental theorems of information theory. Some would say his style is too wordy, and the ratio of words to equations is certainly very high. Nevertheless, this book provides a solid introduction to information theory. It was originally published in 1961, so it is a little dated in terms of topics covered. However, because it was republished by Dover in 1981, it is also fairly cheap. Overall, this is a sensible first book to read on information theory.
"Uncommonly good...the most satisfying discussion to be found." — Scientific American. Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. To give a solid introduction to this burgeoning field, J. R. Pierce has revised his well-received 1961 study of information theory for a second edition. Beginning with the origins…
This is the modern standard text on information theory. It is both comprehensive and highly technical. The layout is spacious, and the authors make good use of the occasional diagram to explain geometric aspects of information theory. One feature I really like is the set of historical notes and a summary of equations at the end of each chapter.
The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The…
I started my career in neuroscience. I wanted to understand brains. That is still proving difficult, and somewhere along the way, I realized my real motivation was to build things, and I wound up working in AI. I love the elegance of mathematical models of the world. Even the simplest machine learning model has complex implications, and exploring them is a joy.
The best parts of this book really represent a gold standard in pedagogical clarity.
Although it’s now twenty years old, there is still much to learn from this rather unconventional book that covers the boundary between machine learning, information theory, and Bayesian methods. There are also odd tangents and curiosities, some of which work better than others but are never dull.
Just writing this review makes me want to go back to it and squeeze more out of it.
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo…
I am a financial data scientist. I think it is important that data scientists are highly specialized if they want to be effective in their careers. I run a business called Conlan Scientific out of Charlotte, NC, where my team of financial data scientists and I tackle complicated machine learning problems for our clients. Quant trading is a gladiator’s arena of financial data science. Anyone can try it, but few succeed at it. I am sharing my top five list of math books that are essential to success in this field. I hope you enjoy it.
While studying communication systems, Claude Shannon did something pretty impressive. He reformulated much of classical statistics from scratch using the language and concepts of computer science.
Statistical noise? There’s a new word for that; it’s called entropy. Also, it turns out it is a good thing, not a bad thing, because entropy is equal to the information content of a data set. Tired of minimizing the squared error of everything? That’s fine, minimize the negative log of the likelihood instead. It does the same thing. This book challenges the assumptions of classical statistics in a way that fits neatly in the mind of a computer scientist. As a quant trader, you will find that this book helps you understand and measure the information content of data, which is critical to your success.
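To make that claim concrete, here is a quick sketch in my own notation (not taken from the book): Shannon's entropy as the average information content of a source, and why minimizing squared error and maximizing a Gaussian log-likelihood pick out the same model.

```latex
% Shannon entropy: the average information content of a source X, in bits
H(X) = -\sum_{x} p(x)\,\log_2 p(x)

% Under Gaussian noise with fixed variance \sigma^2, the log-likelihood of a
% model's residuals y_i - \hat{y}_i is a constant minus a scaled squared error:
\log L = \mathrm{const} - \frac{1}{2\sigma^2}\sum_i \left(y_i - \hat{y}_i\right)^2
% Maximizing \log L (equivalently, minimizing -\log L) therefore selects the
% same model as minimizing the squared error.
```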
Scientific knowledge grows at a phenomenal pace--but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.
I taught myself to code back in 1994 while working the graveyard shift as a geologist in the environmental industry. My job consisted of sitting in a chair during the dark hours of the night in a shopping center in Stockton, CA, watching another geologist take samples from wells in the parking lot. A friend of mine suggested I learn to code because I liked computers. I don’t mean to make this out to be an “it’s so simple anyone can do it!” story. You need to have a relentless drive to learn, which is why I wrote my book, The Imposter’s Handbook, as an active step to learning what I didn’t know I didn’t know.
You’ve heard of Einstein, Turing, Newton, and Hawking - but do you know who Claude Shannon is? Would you be surprised if I told you that he’s probably done more for our current way of life than all of the others combined? It’s true, and it’s unbelievable.
Claude Shannon was a quiet, quirky man who had what you might call The Most Genius Move of the last forever years: he took an obscure discipline of mathematics (Boolean algebra) and applied it to electrical circuits, creating the digital circuit in the process. If you’ve ever wondered how 1s and 0s are turned into if statements and for loops, well, here you go.
Oh, but that’s just the beginning. Dr. Shannon took things much further when he described how these 1s and 0s could be transmitted from point A to point B without loss of data. This was a big problem…
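If you've never seen that Boolean-algebra-to-circuits move spelled out, here is a toy sketch of my own (it isn't from the book): a half-adder that adds two bits using nothing but an XOR and an AND, the kind of logic Shannon showed could be built from switches.

```python
# Toy illustration (not from the book): Boolean algebra behaving like circuit logic.
# A half-adder adds two bits using only an XOR "gate" and an AND "gate".

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum_bit, carry_bit) for two input bits a and b."""
    sum_bit = a ^ b   # XOR: 1 when exactly one input is 1
    carry = a & b     # AND: 1 only when both inputs are 1
    return sum_bit, carry

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")
```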
Winner of the Neumann Prize for the History of Mathematics
**Named a best book of the year by Bloomberg and Nature**
**'Best of 2017' by The Morning Sun**
"We owe Claude Shannon a lot, and Soni & Goodman’s book takes a big first step in paying that debt." —San Francisco Review of Books
"Soni and Goodman are at their best when they invoke the wonder an idea can instill. They summon the right level of awe while stopping short of hyperbole." —Financial Times
"Jimmy Soni and Rob Goodman make a convincing case for their subtitle while reminding us that Shannon…
My name is Daniel Robert McClure, and I am an Associate Professor of History at Fort Hays State University in Hays, Kansas. I teach U.S., African diaspora, and world history, and I specialize in cultural and economic history. I was originally drawn to “information” and “knowledge” because they form the ties between culture and economics, and I have been teaching history through “information” for about a decade. In 2024, I was finally able to teach a graduate course, “The Origins of the Knowledge Society,” out of which came the “5 books.”
This book starts in a similar historical location as Bod’s book but quickly moves through the nineteenth and twentieth centuries—settling into the “information theory” era established by Claude Shannon, Norbert Wiener, and others in the 1940s-1960s.
I love this book because it traces the intellectual climate that led to our current dystopia of information overload. Gleick’s teasing of chaos theory inevitably pushes the reader to explore his earlier book on the subject, Chaos: Making a New Science (1987).
Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing.
We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code.
In 'The Information' James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the 'bit', it is a fascinating…
I’m a British neuroscientist and writer who’s been using computers to study the brain since 1998, and writing about it since 2016. How I ended up a neuroscientist is hard to explain, for my formative years were spent devouring science books that were not about the brain. That’s partly because finding worthwhile books about the brain is so hard – few delve into how the brain actually works, into the kinds of meaty details that, for example, Hawking offered us on physics and Dawkins on evolution. So I wrote one to solve that problem; and the books on my list are just that too: deep, insightful works on how the brain does what it does.
A magnificent synthesis of Bialek and colleagues’ research into how spikes from neurons send information. A strong contender for the most readable serious science book ever published. Even if you only understand a quarter of it (as I did on first reading as a math-shy grad student), the sheer quantity of ideas and the flow of the prose are mind-blowing. As essential a read now as it was in 1997, these ideas have not dated one bit.
What does it mean to say that a certain set of spikes is the right answer to a computational problem? In what sense does a spike train convey information about the sensory world? Spikes begins by providing precise formulations of these and related questions about the representation of sensory signals in neural spike trains. The answers to these questions are then pursued in experiments on sensory neurons. Intended for neurobiologists with an interest in mathematical analysis of neural data as well as the growing number of physicists and mathematicians interested in information processing by "real" nervous systems, Spikes provides a…
I have researched and observed attempts to map, enhance, and control biological human bodies since I was a teenager. I was always interested in how people described and related to themselves as biological creatures. As part of that, I was fascinated by attempts to talk about the human body in words other than the strictly biological, both by poets and artists and by entrepreneurs and scientists. As a researcher in cultural studies, I concentrate on different ways to understand ourselves as biological creatures and on imaginaries about (bio)technology and how these dreams about what technology can do affect our self-understanding.
When you finish the book, you may feel a bit unsure whether this was a magical tale or an account of reality. However, it is actually a rather detailed account of the period (1953–1970) in scientific history when the information age found its way into biology. This was a time when metaphors migrated from the realm of computing to descriptions of the human biological body.
Conducting new research requires new languages to test novel ideas and explore new perspectives. This fascinating subject is described in great detail in this book without ever becoming dry. As a researcher, I can attest that this is not easy to achieve.
This is a detailed history of one of the most important and dramatic episodes in modern science, recounted from the novel vantage point of the dawn of the information age and its impact on representations of nature, heredity, and society. Drawing on archives, published sources, and interviews, the author situates work on the genetic code (1953-70) within the history of life science, the rise of communication technosciences (cybernetics, information theory, and computers), the intersection of molecular biology with cryptanalysis and linguistics, and the social history of postwar Europe and the United States.
I have always been fascinated by how the human mind adapts, both individually and through history. Julian Jaynes, who taught me while I was pursuing my PhD in anthropology at Princeton University, provided me with a theoretical framework to explore how the personal and the cultural configure each other. Jaynes inspired me to publish on psychotherapeutics, the history of Japanese psychology, linguistics, education, nationalism, the origin of religion, the Bible, ancient Egypt, popular culture, and changing definitions of self, time, and space. My interests have taken me to China and Japan, where I lived for many years. I taught at the University of Arizona and currently work as a licensed mental health counselor.
Supported by a wide range of examples drawn from various disciplines, this book demonstrates how we are only conscious of a small amount of what our hidden psychological machinery manufactures nonconsciously.
This work provides a key perspective needed to appreciate Julian Jaynes’s theory of consciousness and, thus, his ideas on bicameral mentality.
As John Casti wrote, "Finally, a book that really does explain consciousness." This groundbreaking work by Denmark's leading science writer draws on psychology, evolutionary biology, information theory, and other disciplines to argue its revolutionary point: that consciousness represents only an infinitesimal fraction of our ability to process information. Although we are unaware of it, our brains sift through and discard billions of pieces of data in order to allow us to understand the world around us. In fact, most of what we call thought is actually the unconscious discarding of information. What our consciousness rejects constitutes the most valuable part…
My father, when he consented to talk about all the moments in his life when the odds against his survival were so small as to make them statistically non-existent, would say, ‘I was lucky.’ Trying to understand what he meant got me started on this book. As well as being a novelist, I’m a poker player. Luck is a subject that every poker player has a relationship to; more importantly, it’s a subject that every person has a relationship to. The combination of family history, intellectual curiosity, and the gambler’s desire to win drove me on this quest.
Sadly, Games, Gods, and Gambling by F. N. David is out of print. This is the next best thing. Lorraine Daston has the supreme gift of making complicated ideas seem straightforward. This is an account of the frenzy for measuring that happened in the 18th century, and of how it made the world we live in today, in which the gambler’s eye for odds has become the algorithm for taming chance that guides all our decisions.
What did it mean to be reasonable in the Age of Reason? Classical probabilists from Jakob Bernoulli through Pierre-Simon Laplace intended their theory as an answer to this question--as "nothing more at bottom than good sense reduced to a calculus," in Laplace's words. In terms that can be easily grasped by nonmathematicians, Lorraine Daston demonstrates how this view profoundly shaped the internal development of probability theory and defined its applications.