Another Comment on Gauger and Swamidass at Village Forum

Science
(Chris) #1

Village Forum: God and Evolution - A Conversation - February 28th, 2019

At 1:45:30 – 1:46:00 Swamidass says

“It turns out that entropy is really a technical term, uhm, entropy is actually the same thing as, this is, you’re probably going to disagree with me but this is where my area is in computational biology, information theory, information is the same thing as entropy, the second law of thermodynamics guarantees that information increases [laughter] it just does.”

(transcribed from video)

This, however, is equivocation. Equivocation is:

  • Logic: a fallacy caused by the double meaning of a word.
  • The use of an ambiguous word, especially when deliberate, in one meaning by the speaker but interpreted in a different meaning by the hearer.

Normally, if we are aware of different meanings, we will interpret the correct meaning from the context, and double meanings can often be used in puns as a form of humor. However, when the hearer is not aware of the alternative meaning, it results in misunderstanding or confusion.

In this case Swamidass is using "information" in the technical sense from Information Theory, but the audience, not having that background, will understand it in the common usage sense.

Note that it is not uncommon for a word to have a specialist technical sense that is not the same as the common usage sense. In engineering, "stress" is a measure of the internal forces created in an object and is measured in force per unit area, e.g. kN/m^2 or psi. In common use it will often mean emotional distress caused by a bad situation.

There are of course multiple definitions of information in English, but they can be condensed into two definitions which contradict each other in an important respect. In the classical and common meaning of the word, information always has meaning in some sense: it communicates knowledge, conveys facts, and reduces uncertainty.

In Information Theory, meaning is not considered part of information, and it is quite logical to have meaningless information. When Swamidass says that information increases, he means information in this technical sense, which includes meaningless information, or noise. In fact it is when the content is completely random that the entropy is highest, i.e. it has the most information and the least meaning.
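
A minimal sketch of that last point, in Python (this measures only zeroth-order, per-character entropy, and the sample sentence and symbol set are arbitrary choices for illustration):

```python
import random
import string
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Per-symbol Shannon entropy H = -sum(p * log2(p)), in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A meaningful English sentence vs. noise drawn uniformly from
# lowercase letters and space (27 symbols).
english = "information in the everyday sense communicates knowledge and reduces uncertainty"
noise = "".join(random.choices(string.ascii_lowercase + " ", k=10_000))

print(f"English text: {shannon_entropy(english):.2f} bits/char")
print(f"Random noise: {shannon_entropy(noise):.2f} bits/char")
# The noise comes out close to the maximum of log2(27) ~ 4.75 bits/char,
# while the meaningful sentence is noticeably lower: structure means
# redundancy, and redundancy means lower entropy. Maximum "information"
# in Shannon's technical sense, zero meaning.
```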

4 Likes
(Ashwin S) #2

Welcome @aarceng.

It’s a good point and actually impacts communication IMO. Another such problem word is “function”.
Sometimes I wonder whether scientists themselves get confused by these differences.
Edit: To be fair to @swamidass, I have not listened to this talk. Perhaps he explained the way he is using “information” beforehand.

1 Like
(S. Joshua Swamidass) #3

Glad to see you here @aarceng. No equivocation. Information = entropy. They are the same thing.

1 Like
(Ashwin S) #4

His main point is that the claim that “information” increases is true only if the technical definition of information from information theory is used. That’s a fair point.
If the normal English usage of information is considered, it’s a false statement.

As to the statement that “information” is “entropy”, I am not sure that’s true either. Entropy is a concept in thermodynamics which has no connection to information. Entropy, in its simplest sense, is the amount of energy in a system which is unavailable for useful work.
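
For reference, the classical (Clausius) form of the definition being invoked here, where $\delta Q_{\mathrm{rev}}$ is heat exchanged reversibly at absolute temperature $T$:

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}$$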
What is the corresponding relationship between information and “work” or function?

1 Like
(S. Joshua Swamidass) #5

We’ve covered this before. The formulas are identical, and in information theory information is identified with entropy.
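
For reference, the two formulas in question, which differ only in the base of the logarithm and the constant $k_B$:

$$H = -\sum_i p_i \log_2 p_i \ \text{(Shannon)} \qquad S = -k_B \sum_i p_i \ln p_i \ \text{(Gibbs)}$$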

That is it. And ID uses the technical definition.

Which is not how anyone should think about information if we are discussing ID. Equating the normal English usage with the technical term is a major problem.

1 Like
(Ashwin S) #6

Entropy can also be considered as the lack of information, or loss of information, about the microstates of a system.
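
One way to make that concrete: for a macrostate compatible with $\Omega$ equally likely microstates, the Boltzmann entropy is $S = k_B \ln \Omega$, and the missing information needed to specify the exact microstate is $H = \log_2 \Omega$ bits, so

$$S = (k_B \ln 2)\,H$$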

(Chris) #7

The equivocation is between the different meanings of “information”.
In everyday use meaning is intrinsic to information, so you can’t have meaningless information.
In the technical use you can have meaningless information. Information with totally random content would be meaningless information.

1 Like
#8

Not true. Information and entropy are identical. That was Shannon’s incredible insight in his 1948 paper.
Using information theory, it has been shown that the Heisenberg uncertainty principle and wave-particle duality are manifestations of the same thing.
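
One concrete form of that connection is the entropic uncertainty relation of Białynicki-Birula and Mycielski (1975) for the position and momentum distributions of a quantum state, from which the familiar Heisenberg bound $\sigma_x \sigma_p \ge \hbar/2$ can be recovered:

$$h(x) + h(p) \ge \ln(\pi e \hbar)$$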

2 Likes
#9

This is a factual statement by @swamidass.

1 Like
(Neil Rickert) #10

There are many different and conflicting normal ways of using “information.”

(Ashwin S) #11

All the more reason to be clear what exactly one is talking about.

1 Like
(S. Joshua Swamidass) #12

Well. I am clear. @aarceng, read my thread from a while ago at BioLogos. I have also explained it here: Information = Entropy and Chance = Choice. There is also this one: https://discourse.peacefulscience.org/t/where-does-shannon-equate-entropy-and-information/2220

I’m not sure how to be more clear.

(Chris) #13

You could be clearer when talking to a non-technical audience by specifying that information, as defined in information theory, is not required to have any meaning, and that entropy is in fact maximised when it has no meaning.
This contradicts the classical and common usage of information, in which information intrinsically has meaning.
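
(The standard result behind "maximised": for a source with $n$ possible symbols, $H = -\sum_i p_i \log_2 p_i \le \log_2 n$, with equality exactly when every symbol is equally likely, i.e. when the content is completely random.)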

Failure to do this is equivocation.

(S. Joshua Swamidass) #14

@aarceng in context the audience member referenced an ID argument. I agree it is an equivocation that IDists have not made clear that information = entropy. Glad we agree there.

Information theory doesn’t use that definition. Amazing how this is equivocated by ID, right?

1 Like
(George) #15

@aarceng

Oh, for goodness’ sake. Entropy is not what you think it is, nor does it do what you think it does.

Defining DEVOLUTION as “loss of information” is also an equivocation, and a failure.

Evolution is not defined by a gain or a loss in information. This is a contrived definition, which doesn’t hold up under even the most basic of reviews.

A giant chaotic cloud of hydrogen, in the middle of space, has no problem ORGANIZING itself into a ball of nuclear fusion… and then creating new elements from the hydrogen.

1 Like
(Chris) #16

(S. Joshua Swamidass) #17

True, ID arguments about information are a red herring. Great to see you coming along. That is what you mean, right?

1 Like
(system) closed #18

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.