Village Forum: God and Evolution - A Conversation - February 28th, 2019
At 1:45:30–1:46:00, Swamidass says:
“It turns out that entropy is really a technical term, uhm, entropy is actually the same thing as, this is, you’re probably going to disagree with me but this is where my area is in computational biology, information theory, information is the same thing as entropy, the second law of thermodynamics guarantees that information increases [laughter] it just does.”
(transcribed from video)
This, however, is equivocation. Equivocation is:
- (Logic) A fallacy caused by the double meaning of a word.
- The use of an ambiguous word, especially when deliberate, in one meaning by the speaker but interpreted in a different meaning by the hearer.
Normally, if we are aware of the different meanings, we will infer the correct one from the context, and double meanings can even be exploited in puns as a form of humor. However, when the hearer is not aware of the alternative meaning, the result is misunderstanding or confusion.
In this case Swamidass is using "information" in its technical sense from Information Theory, but the audience, not having that background, will understand it in the common-usage sense.
Note that it is not uncommon for a word to have a specialist technical sense that differs from its common-usage sense. In engineering, "stress" is a measure of the internal force per unit area within an object, e.g. kN/m^2 or psi. In common use it usually means emotional distress caused by a bad situation.
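To make the engineering sense concrete, here is a minimal sketch (my own illustration with assumed values, not from the talk) of stress as force per unit area:

```python
# Hypothetical example: engineering "stress" is internal force per unit area.
force_kN = 50.0   # axial load on a member, in kilonewtons (assumed value)
area_m2 = 0.25    # cross-sectional area, in square metres (assumed value)

stress = force_kN / area_m2
print(f"Stress = {stress} kN/m^2")  # -> Stress = 200.0 kN/m^2
```

Nothing emotional about it; the same word in a counselling session would mean something entirely different.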
There are, of course, multiple definitions of "information" in English, but they can be condensed into two that contradict each other in an important respect. In the classical and common meaning of the word, information always has meaning in some sense: it communicates knowledge, conveys facts, and reduces uncertainty.
In Information Theory, meaning is not considered part of information, and it is perfectly consistent to speak of meaningless information. When Swamidass says that information increases, he means information in this technical sense, which includes noise. In fact, it is when a signal is completely random that its entropy is highest, i.e. it carries the most information in the Shannon sense, and the least meaning.
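This is easy to demonstrate. The sketch below (my own illustration, not anything Swamidass presented) computes the Shannon entropy, H = -Σ p·log2(p), of a piece of meaningful English text and of a random string over the same alphabet; the meaningless noise comes out with the higher entropy, i.e. more "information" in the technical sense:

```python
import math
import random
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Meaningful English: letter frequencies are uneven, so entropy is lower.
english = "the quick brown fox jumps over the lazy dog " * 10

# Random noise over the same alphabet: frequencies are near-uniform,
# so entropy is higher, even though the string carries no meaning at all.
random.seed(0)
alphabet = "abcdefghijklmnopqrstuvwxyz "
noise = "".join(random.choice(alphabet) for _ in range(len(english)))

print(f"English text: {shannon_entropy(english):.2f} bits/char")
print(f"Random noise: {shannon_entropy(noise):.2f} bits/char")
```

On a typical run the English text comes out at roughly 4.4 bits per character, while the noise approaches the maximum of log2(27) ≈ 4.75 bits per character: maximum entropy, maximum Shannon information, zero meaning.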