I have not read this article in its entirety, but what I have read has led me to think it worthy of bringing to the attention of the entire community here. Please read!
The scientific community, so skilled at working within its own discourse conventions, must also concentrate on how to express these notions clearly to non-technicians. The term “rhetoric” has acquired an unfortunate connotation, but the synonymous phrase “effective communication” may be used for a project the academic community must actively engage in as a part of their place in the division of intellectual labor.
The second law of thermodynamics allows for entropy to decrease when energy is supplied and work is done on the system. That’s really the only explanation that needs to be made. If evolution violates the 2LoT, then refrigerators also violate the 2LoT.
Endothermic chemical reactions would also violate 2LoT if Creationist arguments were correct. You never hear of Creationists attacking Chemistry classes though.
But it is not only the understanding of the content of the law that is faulty in the anti-evolutionists’ argument, it is also the scope of applicability of the law.
But where do these anti-evolutionists get these faulty notions from? Both the content of the law and the scope of applicability of the law are misunderstood by both pro-evolutionists and anti-evolutionists.
If you educate the anti-evolutionists the pro-evolutionists will invariably disagree with them.
The second law of thermodynamics does not say that disorder necessarily increases in isolated systems (no cards added or removed) that are not in equilibrium (the cards are being shuffled). Rather, it says that the likelihood of finding the system in its original state, or in any given state, tends to approach the likelihood of finding it in any other state. Once we understand what the second law really says, the anti-evolutionists’ representation of it as requiring ever-increasing disorder is seen as a misunderstanding.
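Here’s a quick sketch of that card-shuffling reading. This is my own toy example, not from the article: with a small “deck” of 5 cards there are 5! = 120 orderings, and after a thorough shuffle the original ordering turns up no more often than any other specific ordering.

```python
import random
from math import factorial

random.seed(0)

deck_size = 5                               # 5! = 120 possible orderings
original = tuple(range(deck_size))
other = tuple(reversed(range(deck_size)))   # an arbitrary other ordering

trials = 120_000
hits_original = hits_other = 0
for _ in range(trials):
    deck = list(range(deck_size))
    random.shuffle(deck)                    # thorough shuffle: uniform over orderings
    state = tuple(deck)
    hits_original += state == original
    hits_other += state == other

# Each specific ordering should show up about trials / 120 = 1000 times.
expected = trials / factorial(deck_size)
print(hits_original, hits_other, expected)
```

Both counts come out close to the same 1-in-120 expectation, which is the point: the “ordered” original state is not special, just one microstate among many.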
So this would appear to be the “understanding of the content of the law that is faulty” that the author refers to. Though it could perhaps be argued that it is also the scope of applicability if we take the author to be saying that the second law only applies to isolated systems at equilibrium.
But can we really blame the anti-evolutionists?
The second law of thermodynamics is one of the most fundamental laws of nature, having profound implications. In essence, it says this:
The second law - The level of disorder in the universe is steadily increasing. Systems tend to move from ordered behavior to more random behavior.
The second law of thermodynamics can be stated in terms of entropy. If a reversible process occurs, there is no net change in entropy. In an irreversible process, entropy always increases, so the change in entropy is positive. The total entropy of the universe is continually increasing.
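To make the irreversible case concrete, here’s a standard worked example with numbers of my own choosing (not from the article): heat flowing spontaneously from a hot reservoir to a cold one. Each reservoir’s entropy change is Q/T, and the total comes out positive.

```python
# Irreversible heat flow between two reservoirs (illustrative numbers).
Q = 100.0       # joules transferred from hot to cold
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = -Q / T_hot    # hot reservoir loses entropy
dS_cold = Q / T_cold   # cold reservoir gains more than the hot one lost
dS_total = dS_hot + dS_cold

print(dS_total)        # positive, about +0.083 J/K
```

The asymmetry (Q/T_cold > Q/T_hot whenever T_cold < T_hot) is exactly why the total entropy of the universe keeps increasing in irreversible processes.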
Actually, here’s what the author states about the scope of applicability of the second law.
But it is not only the understanding of the content of the law that is faulty in the anti-evolutionists’ argument, it is also the scope of applicability of the law. The second law holds for systems that are thermally isolated and not in equilibrium.
This comes off as a bit strange as entropy is defined for systems that are at equilibrium.
We take as a general thesis that evolution is a special case of the second law of thermodynamics.
Since biological information resides in biological systems and has a physical interpretation, it must be subject to the consequences of the second law.
- E.O. Wilson from Entropy, Information, and Evolution
We can blame the anti-evolutionists for spreading the false claim that entropy can only ever increase in all systems at all times. Apparently, refrigerators shouldn’t work in creation science.
You can blame Santa Claus and the Tooth Fairy too, but should they be blamed?
I don’t see why not. I’ve never come across anyone claiming that refrigerators shouldn’t work.
Yes. They are the ones who spread this false claim.
Creationists claim that entropy can never decrease in any system at any time. A refrigerator starts out at thermal equilibrium with the room it is in. Supply it with some electricity and it decreases its own entropy by creating a region of higher temperature and a region of lower temperature. That shouldn’t be possible according to creation science.
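You can do the refrigerator bookkeeping explicitly. These numbers are mine, purely for illustration: heat Qc is pumped out of the cold interior, work W is supplied, and Qc + W is dumped into the warmer room. The interior’s entropy goes down, but the total still goes up.

```python
# Entropy bookkeeping for an idealized refrigerator (illustrative numbers).
Qc = 300.0      # J of heat removed from the cold interior
W = 100.0       # J of electrical work supplied
T_cold = 275.0  # K, interior temperature
T_room = 295.0  # K, kitchen temperature

dS_interior = -Qc / T_cold       # local entropy DECREASE inside the fridge
dS_room = (Qc + W) / T_room      # entropy dumped into the room
dS_total = dS_interior + dS_room

print(dS_interior, dS_total)     # interior negative, total positive
```

So a local entropy decrease is perfectly compatible with the second law, as long as the work input and the heat rejected to the surroundings are counted too.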
I think they actually claim that entropy can never decrease unless there was an intelligent designer involved. They don’t have a problem with refrigerators, since they can point to intelligent designers.
The irony here is that entropy is information. So this guarantees that information in DNA will increase.
That’s even more disturbing. If humans could violate the laws of thermodynamics simply by being intelligent then we wouldn’t need power plants or gasoline in our cars.
Entropy is not just defined for systems that are at equilibrium. The authors are correct in saying that the second law is relevant for systems that are not in equilibrium.
Both of the statements you bolded are correct. Because it is extremely unlikely for entropy to decrease, most interactions result in increasing entropy. This results in a net positive rate of change for the total entropy of the Universe.
First, let me admit that non-equilibrium thermodynamics is something I know almost nothing about. Any suggestions for a layperson-accessible discussion of how entropy is used and defined in non-equilibrium thermodynamics?
As to what the authors said, it can be read as saying that the second law does not hold for systems that are not thermally isolated and in equilibrium. It’s just a very strange way of stating the second law, unlike anything I’ve ever seen before. But that may not mean anything:
In a recent thermodynamics text, Truesdell (1984) identifies several different “Second Laws,” and the physicist-philosopher Mario Bunge (1986, p. 306) compiled a list of “twenty or so ostensibly inequivalent but equally vague formulations of ‘the’ Second Law.”
- David L. Hull
Would you agree that there are many formulations of ‘the’ Second Law and are some of them false and misleading? Perhaps start with whether the Second Law is even a law.
Are you open to some suggested reading?
Any undergraduate textbook on statistical mechanics will define entropy in the usual way: S = k ln Ω. Since this is just a counting of the number of microstates, obviously this definition is not limited to equilibrium thermodynamics.
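To see the counting definition in action, here’s a toy example of my own (coins standing in for a two-level system): Ω is just the number of ways to arrange the microstates consistent with a macrostate, and S = k ln Ω follows directly. I’ve set k = 1 (natural units) for simplicity.

```python
from math import comb, log

k = 1.0  # Boltzmann constant set to 1 for illustration

def entropy(n_coins, n_heads):
    # Omega: number of microstates (arrangements) with this many heads.
    omega = comb(n_coins, n_heads)
    return k * log(omega)

# The 50/50 macrostate has by far the most microstates, hence the most entropy;
# the all-tails macrostate has exactly one microstate, hence zero entropy.
print(entropy(100, 50), entropy(100, 10), entropy(100, 0))
```

Nothing in that computation asks whether the coins are “in equilibrium”; it is pure state-counting, which is the point being made above.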
If you want to understand the 2nd law and entropy production/loss from a non-equilibrium thermodynamics perspective, you can start with the wikipedia article for the Fluctuation Theorem. That article is sparse on the mathematics, and should be understandable to a layperson.
It depends on what you explicitly mean by “many formulations” of the Second Law. I can’t find a copy of the Truesdell (1984) book to see what David Hull means by “many formulations”, and Bunge (1986) is just a book review of Truesdell (1984).
This is true. But statistical mechanics also introduces ensembles, such as the canonical ensemble. It is these that define the entropy.
Aren’t we still talking about equilibrium thermodynamics? Can you help me if I am missing out on something or improperly stating it?
Unfortunately you are missing something. The canonical ensemble is not necessary in the definition of entropy. We can go into more details if you want, but today is All Saints Day and as a good Catholic boy I have to go to mass right now and won’t be able to reply for ~2 hours.
That always seems to be the case.
Something for when you get back. Let me know what you think.
The concept of entropy, in particular, is central to thermodynamics. Entropy tends to be confusing because it does not have an intuitive connection to mechanical quantities, such as velocity and position, and because it is not conserved, like energy. Entropy is also frequently described using qualitative metrics such as “disorder” that are imprecise and difficult to interpret in practice. Not only do such descriptions do a terrible disservice to the elegant mathematics of thermodynamics, but also the notion of entropy as “disorder” is sometimes outright wrong. …
In reality, entropy is not terribly complicated. It is simply a mathematical function that emerges naturally for equilibrium in isolated systems, that is, systems that cannot exchange energy or particles with their surroundings and that are at fixed volume. For a single-component system, that function is
S = S(E, V, N)
which states that entropy is dependent on three macroscopic quantities: the total internal energy of the system E, the total volume of the system V, and the number of particles (molecules or atoms) N.
- M. Scott Shell. Thermodynamics and Statistical Mechanics. Chapter 2. Equilibrium and Entropy. p. 8
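To make S = S(E, V, N) concrete, here’s the Sackur–Tetrode entropy of a monatomic ideal gas, which is literally a function of those three macroscopic quantities. The constants are standard, but the scenario (a mole of helium near room conditions) is my own illustration, not from Shell’s book.

```python
from math import log, pi

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
m = 6.6335e-27      # mass of a helium atom, kg

def sackur_tetrode(E, V, N):
    # Entropy of a monatomic ideal gas as a function of (E, V, N) alone.
    return N * k * (log((V / N) * (4 * pi * m * E / (3 * N * h**2))**1.5) + 2.5)

# One mole of helium at roughly room temperature and atmospheric pressure:
N = 6.022e23
V = 0.0248                 # m^3, about 24.8 litres
E = 1.5 * N * k * 300.0    # E = (3/2) N k T at T = 300 K

S = sackur_tetrode(E, V, N)
print(S)  # roughly 1e2 J/K, consistent with helium's tabulated molar entropy
```

Give it any E, V, and N and it returns one number: that is what “entropy is a state function of macroscopic quantities” means in practice.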
What does it mean to say that entropy is dependent on macroscopic quantities?
Why doesn’t the Boltzmann equation you mentioned reference these macroscopic quantities?
What does it mean to say that entropy is a state function?