I guess that is as close to an apology as one will ever get from Yorzhik. Just a few days ago, I apologized to someone here, and I also apologized to Knight for any slights I may have given him. (I'll find them, if you want me to show you.) I realize that you're proud and don't apologize. It's O.K. To get back to the topic:
That's another issue. But meaning isn't what this is about.
This is where Yorzhik errs. He still doesn't get it. Noise adds information to the message; "improvement" is not part of Shannon's theory. He's still hung up on intent, which isn't part of the theory, either.
Apply this to genetics. It actually works better for population genetics than it does for communication media:
IEEE Eng Med Biol Mag. 2006; 25(1): 30–33.
Claude Shannon: Biologist
The Founder of Information Theory Used Biology to Formulate the Channel Capacity
Claude Shannon founded information theory in the 1940s. The theory has long been known to be closely related to thermodynamics and physics through the similarity of Shannon's uncertainty measure to the entropy function. Recent work using information theory to understand molecular biology has unearthed a curious fact: Shannon's channel capacity theorem only applies to living organisms and their products, such as communications channels and molecular machines that make choices from several possibilities. Information theory is therefore a theory about biology, and Shannon was a biologist.
...
Shannon's work at Bell Labs in the 1940s led to the publication of the famous paper “A Mathematical Theory of Communication” in 1948 [5] and to the lesser known but equally important “Communication in the Presence of Noise” in 1949 [6]. In these groundbreaking papers, Shannon established information theory. It applies not only to human and animal communications, but also to the states and patterns of molecules in biological systems.
...
Suppose one wishes to transmit some information at a rate R, also in bits per second (b/s). First, Shannon showed that when the rate exceeds the capacity (R > C), the communication will fail and at most C b/s will get through. A rough analogy is putting water through a pipe. There is an upper limit for how fast water can flow; at some point, the resistance in the pipe will prevent further increases or the pipe will burst.
The surprise comes when the rate is less than or equal to the capacity (R ≤ C). Shannon discovered, and proved mathematically, that in this case one may transmit the information with as few errors as desired! Error is the number of wrong symbols received per second. The probability of errors can be made small but cannot be eliminated. Shannon pointed out that the way to reduce errors is to encode the messages at the transmitter to protect them against noise and then to decode them at the receiver to remove the noise. The clarity of modern telecommunications, CDs, MP3s, DVDs, wireless, cellular phones, etc., came about because engineers have learned how to make electrical circuits and computer programs that do this coding and decoding. Because they approach the Shannon limits, the recently developed Turbo codes promise to revolutionize communications again by providing more data transmission over the same channels.
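To put rough numbers on the capacity idea (my illustration, not the article's): Shannon's 1949 paper gives the capacity of a band-limited channel with Gaussian noise as C = B log2(1 + S/N). A quick Python sketch, with made-up channel figures:

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity in bits/second for a band-limited Gaussian channel:
    # C = B * log2(1 + S/N)
    return bandwidth_hz * math.log2(1.0 + snr_linear)

B = 3000.0                 # bandwidth in Hz (illustrative, telephone-like channel)
snr_db = 30.0              # signal-to-noise ratio in dB (also illustrative)
snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
print(f"C = {shannon_capacity(B, snr):.0f} b/s")  # about 29,900 b/s

Any rate R at or below that C can, in principle, be pushed through with as low an error rate as you like, given good enough coding; push R above it and you start losing data, just like the pipe.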
DNA works like that. But it's not quite perfect: it has an error rate, even after error correction. The error rates for given organisms are, as you might expect, close to optimal for the rate of variation that suits asexual or sexual organisms in their normal environment.
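To see why "small but cannot be eliminated" is the right wording, here is a toy Python simulation of a majority-vote repetition code (my sketch; it is obviously not how DNA's actual proofreading works, and the 1% noise figure is made up):

import random

def send_with_repetition(bit, p_flip, n_copies=3):
    # Transmit one bit as n_copies noisy copies, then decode by majority vote.
    received = [bit ^ (random.random() < p_flip) for _ in range(n_copies)]
    return int(sum(received) > n_copies / 2)

random.seed(0)
p = 0.01          # per-symbol noise rate (assumed)
trials = 200_000
raw_errors = sum(random.random() < p for _ in range(trials))
coded_errors = sum(send_with_repetition(0, p) != 0 for _ in range(trials))
print(f"uncorrected error rate: {raw_errors / trials:.5f}")    # about 0.01
print(f"corrected error rate:   {coded_errors / trials:.6f}")  # about 0.0003

The coding knocks the error rate down by a large factor, but it never reaches zero; for a genome, that residual error rate is what supplies variation.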
This causes no end of confusion for those who are unable to grasp what "information" means here. This is why it's so hard for someone unfamiliar with the theory to understand that any mutation in a population's genome increases information in the Shannon sense.
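One way to make that concrete is to use Shannon's uncertainty measure, H = -Σ p·log2(p), over the allele frequencies at a single locus. The frequencies below are made-up numbers, purely to illustrate the point:

import math

def shannon_entropy(freqs):
    # Shannon uncertainty H = -sum(p * log2 p) over allele frequencies at one locus
    return -sum(p * math.log2(p) for p in freqs if p > 0)

before = [0.7, 0.3]            # two alleles segregating (hypothetical frequencies)
after  = [0.699, 0.3, 0.001]   # a mutation introduces a rare third allele
print(f"H before: {shannon_entropy(before):.4f} bits")  # about 0.8813
print(f"H after:  {shannon_entropy(after):.4f} bits")   # about 0.8922

A new variant spreads the probability over more possible states, so the uncertainty (information) of the population's distribution goes up, regardless of whether the mutation is "good" or "bad" for the organism. Improvement and intent just don't enter into it.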