Evolution and the Second Law of Thermodynamics

Even if entropy were information (which it isn’t), it wouldn’t follow that the information in DNA is guaranteed to increase.

According to Leon Brillouin (1964), "Information represents a negative contribution to entropy." He calls this "the negentropy principle of information."

He claims it is negentropy that is the equivalent of information (p. 12).

So now we have entropy = negentropy = information.

@swamidass, in your equation -log p = H, what is p the probability of?

@swamidass This is getting ridiculous. @mung knows nothing about Information Theory, Thermodynamics, and most other sciences. Let’s stop wasting our time on this nonsense.

The last time you appealed to @swamidass he closed the thread. Are you expecting the same result this time, after he reversed his decision the last time you made an appeal?

Why don’t you and he just make the case for your position instead of relying on erroneous evaluations of my ignorance to shut down discussion? Your own actions indicate that you want to suppress any opinion that might disagree with you. That’s sad, really, and not in keeping with the goals of the site.

I’ve already sufficiently demonstrated that I know enough about entropy, information, and thermodynamics to call into question the veracity of your claim that I am ignorant. You need to deal with the arguments. The sooner the better.

You’ve seen the list of books I’ve studied relative to Information Theory. You’ve seen a list of the books I’ve studied relative to the relationship of information theory to thermodynamics and statistical mechanics. I could post a list of the books on thermodynamics I’ve studied. But I’m certain that you would ignore that too.

If you don’t want to engage in discussion why don’t you just remain silent? If you can show that I am wrong or mistaken about something I post, why not point it out? Be specific and show why it is wrong or mistaken.

These accusations of yours that I don’t know what I am talking about are based on your own ignorance. I gave you a list of IT textbooks that I own and have read for you to refer to. You declined to show how standard texts on information theory support your claims. Why is that?

Because you are spouting nonsense. There is a body of science that is universally accepted as foundational; thermodynamics and information theory are two such fields. Really grasping them requires graduate-level study. Not to be too harsh, but you are not at the educational level to be considered even a novice in these subjects. You don’t even have a firm grasp of the basics.

You may own a lot of books on chess, and perhaps you have read them, but that doesn’t make you a Chess Grandmaster. The same thing applies here. Combined, there are a lot of degrees here, a lot of PhDs. You’re still trying to figure out how the knight moves among a lot of Grandmasters.

1 Like

Macroscopic quantities are quantities that refer to the system as a whole. Examples are temperature and pressure. In contrast, microscopic quantities refer to properties of a particular component of the system. Examples include the position and velocity of a particular particle in a gas.

Macroscopic quantities are quantities that, in one way or another, “average” the microscopic quantities.
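
To make the “averaging” concrete, here is a minimal sketch in Python (the particle mass and temperature are arbitrary choices of mine, not anything from this thread): it draws microscopic velocities for a toy gas and recovers the macroscopic temperature as an average of kinetic energies via the equipartition relation ⟨½mv²⟩ = (3/2) k_B T.

```python
import numpy as np

rng = np.random.default_rng(0)
k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.63e-26         # particle mass in kg (roughly an argon atom; arbitrary choice)
T_true = 300.0       # temperature used to generate the sample, K

# Microscopic quantities: one velocity vector per particle, drawn from the
# Maxwell-Boltzmann distribution at T_true (each component is Gaussian).
sigma = np.sqrt(k_B * T_true / m)
v = rng.normal(0.0, sigma, size=(100_000, 3))

# Macroscopic quantity: temperature recovered as an average over the
# microscopic kinetic energies, using <(1/2) m v^2> = (3/2) k_B T.
T_est = m * np.mean(np.sum(v**2, axis=1)) / (3.0 * k_B)
print(f"recovered temperature: {T_est:.1f} K")   # close to 300 K
```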

Entropy depends on these macroscopic quantities, as will be clear from the answer to your second question:

The number of states is a function of the macroscopic quantities, i.e. S = k ln Ω(T, P, etc.).
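
As a minimal sketch of what that counting looks like (my own toy example, assuming a system of N independent two-level units whose macrostate is fully specified by how many units are “up”):

```python
from math import comb, log

k_B = 1.380649e-23  # J/K

def boltzmann_entropy(N, n_up):
    """S = k ln(Omega) for a toy system of N independent two-level units,
    whose macrostate is specified entirely by n_up (how many are 'up')."""
    omega = comb(N, n_up)   # number of microstates compatible with the macrostate
    return k_B * log(omega)

# Omega, and hence S, depends only on the macroscopic description (N, n_up),
# not on which particular units happen to be 'up'.
for n_up in (0, 25, 50):
    print(n_up, boltzmann_entropy(100, n_up))
```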

A state function is a function that depends only on the situation (state) of the system at a particular instant in the system’s evolution (usually the state of the system at present). In contrast, consider the following function: Q = ∫T dt, where T is the temperature of the system and t is time. Clearly Q does not depend only on the state of the system at present. Entropy is a state variable; it depends only on the state of the system at present and does not care about how the system arrived at that state.
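
To see the path-independence numerically, here is a minimal sketch using the standard monatomic ideal gas textbook case (my choice of example, not the Q = ∫T dt function above): the heat absorbed differs between two paths connecting the same pair of states, while the entropy change does not.

```python
from math import log

R = 8.314        # gas constant, J/(mol K)
Cv = 1.5 * R     # molar heat capacity at constant volume (monatomic ideal gas)
n = 1.0          # moles

T1, V1 = 300.0, 1.0   # initial state (K, arbitrary volume units)
T2, V2 = 600.0, 2.0   # final state

def delta_S(Ta, Va, Tb, Vb):
    """Entropy change of an ideal gas between two states (a state function)."""
    return n * Cv * log(Tb / Ta) + n * R * log(Vb / Va)

# Path A: isothermal expansion at T1, then heating at constant volume V2.
Q_A = n * R * T1 * log(V2 / V1) + n * Cv * (T2 - T1)
# Path B: heating at constant volume V1, then isothermal expansion at T2.
Q_B = n * Cv * (T2 - T1) + n * R * T2 * log(V2 / V1)

print("heat absorbed, path A:", Q_A)                  # the two differ...
print("heat absorbed, path B:", Q_B)
print("entropy change, either path:", delta_S(T1, V1, T2, V2))  # ...this does not
```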

1 Like

Yes, it is All Saints’ Day. I liked being in Catholic grammar school because we would have the day after the atheist holiday of Halloween off so that we could eat candy all day. I knew we were supposed to go to Mass on All Saints’ Day, but “I have a stomach ache” worked every year. :sunglasses:

“Bless me Father, for I have sinned. I ate all my Halloween candy and couldn’t go to Mass because I had a stomach ache.” “For your penance, say ten Hail Marys and ten Our Fathers, and send some candy to the poor kids in Africa.” :rofl:

2 Likes

Likewise, if evolutionary processes in living things violate the 2nd LoT, then so does a seed sprouting and eventually becoming a huge tree. When I’ve tried to walk some evolution-deniers through this, they invented strange excuses like, “Biological life doesn’t conform to the 2nd LoT!” and “God designed living things to defy the 2nd LoT.”

They also tend to ignore all of the entropy increases occurring in and around the seed and eventual tree. (e.g., plant metabolism producing heat; transpiration from the leaves)
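
As a back-of-the-envelope sketch of one such increase (the numbers below are made up purely for illustration), the entropy handed to the surroundings by dissipated metabolic heat alone is roughly Q/T:

```python
# Made-up numbers, purely for illustration: heat dissipated by a plant's
# metabolism flows into surroundings at temperature T, raising the
# surroundings' entropy by roughly Q / T.
Q_per_day = 5.0e4        # J of metabolic heat dissipated per day (hypothetical)
T_surroundings = 293.0   # K, about 20 degrees C

dS_surroundings = Q_per_day / T_surroundings
print(f"entropy exported to surroundings: ~{dS_surroundings:.0f} J/K per day")
```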

I have long wondered if Isaac Asimov encouraged misunderstandings of entropy when he explained entropy and “increases in disorder” using the teenager’s bedroom getting cluttered soon after it was cleaned up. Asimov wasn’t necessarily wrong but I think the typical reader was prone to misapply the illustration.

When I had such a conversation about entropy with John Whitcomb Jr. long ago (co-author of The Genesis Flood), it was apparent that the cluttered bedroom analogy was the extent of his knowledge of entropy.

I just love analogies!

I do in fact own a lot of books on chess. I’ve been playing chess for decades. Yet I am not a Grandmaster. I know how a knight moves, and it doesn’t take a Grandmaster to know how a knight moves. Does it follow that I am ignorant and know nothing about chess? Of course not.

If it’s the same thing then you fail. Try again.

Then let’s play chess. I am on chess.com PatrickTNJ

It occurs to me that losses in chess might sometimes be attributed to underestimating your opponent. A search of chess opening databases reveals that @Patrick loses more often than he wins when he plays the French Defense. He must be ignorant! But perhaps Patrick understands the opening better than I do. Nah.

If we actually played chess you might discover that I know how a knight moves. I’m not convinced that you could handle that.

A very simple question for @Patrick: in the following equation, what is the meaning of “p”?

-log p = H

Does it refer to a probability? If so, the probability of what? Is it p_i, where p_i is the probability of some event, or is it the probability of the outcome of some experiment? Something else?

Given that -log p = H is the foundation of your insistence that information equals entropy shouldn’t you be able to give an answer to my question? If not, why not?
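
For what it is worth, here is a minimal sketch (in Python, using bits, and not attributed to anyone in this thread) of the two readings the question is distinguishing: -log p as the surprisal of a single outcome with probability p, and H as the average surprisal over a whole distribution of p_i.

```python
from math import log2

def surprisal(p):
    """-log p: the information, in bits, associated with one outcome of probability p."""
    return -log2(p)

def shannon_entropy(probs):
    """H = -sum_i p_i log p_i: the average surprisal over a distribution."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# Fair coin: each outcome has p = 1/2, so its surprisal is 1 bit,
# and the entropy of the whole distribution is also 1 bit.
print(surprisal(0.5))                 # 1.0
print(shannon_entropy([0.5, 0.5]))    # 1.0

# Biased coin: the rare outcome is individually more surprising,
# but the average surprisal (the entropy) is lower.
print(surprisal(0.1))                 # ~3.32
print(shannon_entropy([0.9, 0.1]))    # ~0.47
```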

I never play the French defense.

p refers to a particular state the system is in or a particular message p to be sent.

You can’t be serious.

How on earth did science manage to survive before Shannon’s paper?

Even if Shannon’s paper had never been published thermodynamics would still be accepted. The applicability of Information Theory to thermodynamics is still debated. What is it, specifically, within thermodynamics that could not survive had Shannon’s paper never been published?

11 posts were split to a new topic: Information in 10 Coins on a Desk

At least five topics are being mixed in the latter part of this thread. They are:

1. Equilibrium thermodynamics of macroscopic systems. This is where concepts like state variables are defined.
2. Non-equilibrium thermodynamics.
3. Statistical mechanics, which started as a way to link atomic approaches to matter to the macroscopic observations of equilibrium thermodynamics; Boltzmann’s work and the concept of ensembles belong to this field.
4. Information theory as developed by Shannon.
5. Philosophy of physics and of information, which analyses the concepts underlying all of the above, along with the methods and issues involved in linking them.

Now, one could suggest that a fruitful discussion would benefit from a basic understanding of each area. When learning something new, I try to master one basic source rather than dipping into many. The Khan Academy has a clear introduction to thermodynamics (e.g. for state variables). The basics of Shannon information (e.g. for the p in -log p) are covered here.

But it would hardly be in the spirit of the thread to recommend that people understand the basics before intermixing the topics. Instead, I am going to add more intellectual fuel to the fire:

Concepts of statistical mechanics and thermodynamics are key to modern approaches to Deep Learning. AJ Maren has a nice overview (her writing style and book advertising can be off-putting, I admit).

Similar mathematics is used in the Predictive Processing approach to perception and action in neuro-psychology; A. Seth’s work mentioned in another thread on consciousness uses that approach (among others).

3 Likes

I am not sure why you would introduce an obscure term from a 50-year-old reference.

Reading between the lines, perhaps unfairly, I think what you are really trying to probe is why we intuitively think “information” is the opposite of randomness, which is how Shannon entropy is usually conceived.

In any event, negentropy is often associated with Schrödinger’s What Is Life?; the idea was anticipated and nicely summarized by Boltzmann (see the linked Wiki).

But as the Wiki article explains, Schrödinger regretted the term and would have preferred to talk about free energy. Indeed, this article says that by 1980 the use of the term negentropy with this connotation was frowned upon.

Negentropy also appears with a different connotation here in Wiki. As best I can tell, in this sense it seems to be related to (the same as?) the relative entropy (a.k.a. KL divergence) between the distribution under consideration and an appropriately constrained maximum-entropy distribution. For example, in the discrete, unconstrained case (which I think is sometimes used in biological applications), the uniform distribution has maximum entropy. In other cases (e.g., I think, in neural network optimization), with a fixed variance the Normal distribution has maximum entropy.
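
In that discrete, unconstrained case the relationship is easy to check numerically. A minimal sketch (my own toy distribution, not taken from the linked articles): the entropy shortfall relative to the uniform distribution equals the KL divergence from the uniform distribution.

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Discrete, unconstrained case: the maximum-entropy reference is uniform.
p = [0.7, 0.1, 0.1, 0.1]
uniform = [0.25] * 4

print(entropy(uniform) - entropy(p))   # entropy shortfall ("negentropy"), ~0.643 bits
print(kl_divergence(p, uniform))       # same value
```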

By the way, many say the universe maximizes entropy. But others prefer the equivalent idea that the universe, and life in particular, minimizes free energy. Under that vision, life’s philosophy would be “if you got it, spend it!”.
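
For what it is worth, that equivalence is the standard textbook result for a system exchanging heat with a reservoir at fixed temperature. A short sketch of the argument (my paraphrase, assuming constant T and V, no work other than heat exchange, and Q the heat absorbed by the system):

```latex
\Delta S_{\mathrm{total}}
  = \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{res}}
  = \Delta S_{\mathrm{sys}} - \frac{Q}{T}
  = \Delta S_{\mathrm{sys}} - \frac{\Delta U_{\mathrm{sys}}}{T}
  = -\frac{\Delta U_{\mathrm{sys}} - T\,\Delta S_{\mathrm{sys}}}{T}
  = -\frac{\Delta F}{T}
```

So requiring ΔS_total ≥ 0 is the same statement as requiring ΔF ≤ 0: maximizing total entropy and minimizing (Helmholtz) free energy describe the same tendency from two vantage points.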

2 Likes

The analogy I use is this: if a human can start out as a single cell and develop into a full human being over nine months without violating the 2LoT, then why would the same kind of thing happening over millions of years pose a problem?

I have also seen the same excuses. Some of my favorites include the womb being an area protected from the laws of entropy and the idea that writing down instructions will allow someone to violate the laws of thermodynamics. At some point you have to conclude that some creationists simply don’t want to understand entropy.

I have heard that bad example several times as well. At least Asimov wrote the wonderful essay “The Relativity of Wrong” to make up for it. :wink:

2 Likes

My memory may be faulty, but I think Asimov first published that “teenager’s bedroom rushing toward disorder” explanation of the 2nd LOT and entropy in a children’s science book. (For those who aren’t familiar with the breadth of Asimov’s writings, he churned out books in a near torrent, including books for children at various levels.) In its original context, for a young reader learning about entropy for the first time, I suppose it wasn’t a terrible analogy. But sometimes very brilliant people fail to see how their illustrations and analogies will be misunderstood by more average minds.

I always liked the explanations which dealt with entropy in relation to “energy available to do work.” So when I first heard someone say, “Water would make a great fuel if only we could figure out how to harness its chemical energy,” I immediately recognized the difference between having a tank of water versus tanks of oxygen and hydrogen waiting to be combined.

True enough. That is a classic which everyone should read. (Actually, I think that essay still appears, along with other worthy pieces, in reprintings of a classic Asimov collection.)

I wonder if The Relativity of Wrong is routinely included in high school science classes. Surely that is a perspective which a science-literate citizenry should understand in order to make intelligent choices.

(Of course, the failure to understand the relativity of wrong seems to afflict a lot of people, such as some forums where people get all worked up about “The earth is NOT a sphere! It is an oblate spheroid.” or something similarly pedantic that isn’t called for in the context. [By the way, I always thought a tiresome example in Gene Roddenberry’s screenplays was the way Mr. Spock would say things like “No, Captain. The temperature of the rock is 547.23 degrees Centigrade.” It was implied that 547 degrees was “wrong” and that 547.23 was “right.” A truly logic-driven and consistent Spock would at least have reported the reading with an appropriate number of significant digits.])