Hunter: Finally, the Details of How Proteins Evolve

What is this “entropy reduction” you speak of?

In any case, it is inconsequential as a proof of concept of the claim that the astronomical reductions in entropy, which evolution requires, can occur by chance events and natural laws. It seems that LTEE's main contribution is that it makes evolutionists even more confident in their claims, if that were possible.

Of course. Some of the very same “serendipitous” mutations that contributed to the evolution of the AFGP gene also occurred in the LTEE, such as the duplication and translocation of a protein-coding gene to a position downstream of a promoter active under the right conditions.

A locus (the gene for a citrate transporter protein) was duplicated into another part of the genome, and it just so happened to end up under the control of a promoter that was active under aerobic conditions, and the translocated gene just so happened to encode a citrate transporter that could take up citrate, which just so happened to be present in the growth medium. What are the odds that all these fortuitous events would combine to allow this?

We are seeing some of the very same things you think should basically be impossible being recorded in real time, and somehow you think this supports your case. It is difficult to think of a way you could bring yourself out of your current convictions when you’re able to spin that kind of evidence as somehow constituting support for your views.

I notice that you also keep ignoring my request for you to detail what kind of evidence you think should exist if some protein-coding gene evolved de novo from non-coding DNA. You appear to have a hard time with the concept “If X happened, we should expect to find Y,” which is a completely basic element of hypothesis testing.

Apparently nothing short of directly witnessing the event yourself can overrule your a priori commitment to the statistical impossibility of de novo protein evolution. Yet none of your ID-creationist views could stand under that standard of evidence. Apparently you are entirely fine with inferring that the designer did all sorts of things in the distant geological past on little more than the fact that you can conceive of it.

You have a double standard.

2 Likes

A brief follow-up to this (or, to be clearer, an apology): after reading through this thread, it occurs to me that I was probably the only one here who was making any sort of connection between the Bock review and antifreeze protein evolution. It’s easy to see where my blinders are pointed, but they may sometimes make for some cryptic posts.

So, to all who have been patient with me here, and especially to @Paul_Nelson, who is far too kind to let his exasperation show through, feel free to quote the following when needed:

“What the heck, Art! How am I supposed to follow that convoluted trail! :confounded::confused:”

5 Likes

Hi Art,

No apologies needed. I knew what you were getting at (above) and am trying to process Bock’s data in the context of the antifreeze hypothesis. I just finished my online seminar today on the “function wars” and am trying to catch up with Bock, etc., this afternoon.

2 Likes

Well, it takes on different nuances depending on whether you are doing thermodynamics, chemistry, statistical mechanics, information theory, etc. But the point is that we live in a universe that tends toward disorder. If you want to put a design together, you have to work hard to do it. IOW, there needs to be an external agent. Nature doesn’t spontaneously generate astronomical quantities of order and function. Evolutionists don’t appreciate this, or the hyper-dimensional search problem they are faced with. LTEE does nothing to address this reality.

Entropy is not just a scientific theory. It is one of the most confirmed, most established concepts in all of science. We do not call it a scientific theory, we call it a scientific law, and it demolishes evolution. One argument evolutionists use is, “well, you didn’t falsify evolution.” You would think it’s a joke, but it’s not.

Congratulations, you have disproved plants growing.

Also, “function” has nothing to do with thermodynamic entropy. What the laws of thermodynamics say is that to get complexity or “order”, you need an input of energy. That’s why you have to “work hard” (and thus convert more readily usable energy into less usable energy), but you’re not the only thing that can do work. You might have heard of this weird thing up in the sky called the Sun. Or how about radioactivity?

By now it is already clear to me that you have absolutely no god damn clue what you’re talking about, and I don’t even have any formal qualifications in physics. Your argument is so bad that even forking Answers in Genesis had to make an article telling their sycophant acolytes to stop making it.

1 Like

Entropy is a measurement of the amount of energy that is not available to perform work. A system that is closed to external inputs will, over time, have less and less energy available to perform work; i.e., it will exhibit increasing entropy.

Biological systems on the earth (including humans) happen to have an external source of energy with which work can be performed, though. I’m thinking of the Beatles song…

As long as our sun keeps fusing hydrogen into helium, entropy is not a problem for biological systems. AiG indeed acknowledged this very fact on one of their “Arguments to Avoid” pages:

The law [of entropy] allows for increasing the amount of order in a given system, so when applying the law the system being discussed must be carefully defined.
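To put rough numbers on that point: Earth absorbs sunlight that was radiated at about 5800 K and re-emits the same energy at about 255 K, so every joule passing through the biosphere increases the entropy of the universe as a whole even while local order increases. Here is a minimal sketch of that bookkeeping (my illustration, not from AiG; the two temperatures are standard textbook values, and the only physics used is dS = Q/T for a heat flow):

```python
# Entropy budget per joule of sunlight processed by the Earth system.
# A heat flow Q at temperature T carries entropy S = Q / T.

Q = 1.0          # one joule of solar energy, for illustration
T_sun = 5800.0   # K, effective temperature of incoming sunlight
T_earth = 255.0  # K, Earth's effective radiating temperature

s_in = Q / T_sun     # entropy arriving with the sunlight   (~1.7e-4 J/K)
s_out = Q / T_earth  # entropy leaving as thermal radiation (~3.9e-3 J/K)

print(f"net entropy exported: {s_out - s_in:.2e} J/K per joule")
# ~3.7e-3 J/K per joule: total entropy rises roughly twenty-fold per
# joule processed, which is what makes local decreases in entropy
# (plants, brains, tidy bedrooms) perfectly lawful.
```

The second law only forbids a net decrease for the whole system, which is exactly the point of the AiG quote above.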

1 Like

LOL! PhD in Biophysics and doesn’t understand how endothermic chemical reactions work. :smile:

I wonder if Dr. Hunter has figured out yet what external agent drives photosynthesis?

Hunter’s post reminded me of the classic Fundy line documented at FSTDT:

One of the most basic laws in the universe is the Second Law of Thermodynamics. This states that as time goes by, entropy in an environment will increase. Evolution argues differently against a law that is accepted EVERYWHERE BY EVERYONE. Evolution says that we started out simple, and over time became more complex. That just isn’t possible: UNLESS there is a giant outside source of energy supplying the Earth with huge amounts of energy. If there were such a source, scientists would certainly know about it.

:sweat_smile: :sweat_smile: :sweat_smile:

Depends on one’s perspective.

I have dusted off a very old essay that tries to convey a different perspective. Enjoy, and please, please everyone feel free to comment, criticize, and take the thoughts in new directions.


On complexity and information

Consider the tornadic thunderstorm. It consists of a number of integrated and essential components, all of which are needed to produce and maintain the tornado: the ground and upper-air windstreams (which must be oriented in precise manners), the precisely-functioning updraft, the supercell itself (which consists of more parts than I can possibly list), and the funnel cloud. By most accounts (there will always be dissent), an IC (irreducibly complex) system.

Can we speak about the information content of a tornadic thunderstorm? I believe so. Recall that the informational content of genomes is usually estimated by “calculating” the fraction of all possible sequences (nominally, amino acid sequences) that can satisfy a particular specification. We can use a similar strategy to guesstimate the “information” carried by water vapor molecules in a storm. The hard part is deciding how many of the possible states available to a particular water molecule are actually “used” in a storm. Now, one can count up all possible positions in the storm, interactions with all possible partners, etc., etc., and realize that the excluded fraction is probably rather small. But, for the sake of argument, let’s pick an arbitrarily large number: let’s propose that only 1 in 10^30 hypothetical states of any given water molecule is excluded in a storm.

Starting there, we need only count the number of water vapor molecules in a storm and estimate the “probability” that the arrangement found in a storm would occur. If we arbitrarily think in simple terms (a storm that is 5x5x7 miles in size, a temperature of 30 degrees C, a partial pressure for water vapor of about 32 mm Hg, an overall atmospheric pressure of 1 atm), then the probability becomes (roughly) (1 - 10^-30) raised to the number of water vapor molecules in the storm (which is about 10^36). Which in turn is on the order of 10^-(10^6) (that’s 1 divided by a 1 followed by a million zeros!). (For comparison, recall Hoyle’s number of 10^-40,000 as an estimate of the probability of the proteins in a simple cell arising by chance.)
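(For anyone who wants to check the orders of magnitude, here is a minimal Python sketch of the back-of-the-napkin arithmetic. The storm dimensions, temperature, vapor pressure, and the 1-in-10^30 figure are just the assumptions stated above; the rest is textbook ideal-gas arithmetic.)

```python
import math

# Storm volume: 5 x 5 x 7 miles, in cubic meters
mile = 1609.34                        # meters per mile
V = 5 * 5 * 7 * mile**3               # ~7.3e11 m^3

# Ideal-gas estimate of the number of water vapor molecules
T = 303.15                            # 30 degrees C, in kelvin
P = (32.0 / 760.0) * 101325.0         # 32 mm Hg partial pressure, in pascals
R = 8.314                             # gas constant, J/(mol K)
N_A = 6.022e23                        # Avogadro's number
N = (P * V / (R * T)) * N_A           # ~7e35 molecules, i.e. roughly 10^36

# Each molecule has all but 1 in 10^30 of its states available, so it
# contributes a factor of (1 - 1e-30) to the probability.
# Work in logs: log10[(1 - x)^N] ~ -N * x / ln(10) for tiny x.
log10_p = -N * 1e-30 / math.log(10)
bits = N * 1e-30 / math.log(2)        # information = -log2(probability)

print(f"molecules: ~10^{math.log10(N):.0f}")
print(f"probability: ~10^({log10_p:.0f})")
print(f"information: ~{bits:.2g} bits")
```

Run as written, this lands at roughly 10^36 molecules, a probability around 10^-300,000, and on the order of a million bits, the same ballpark as the 10^-(10^6) and “3 million bits” figures used here (the exponent comes out a factor of ~3 smaller, well within the spirit of a guesstimate).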

Hopefully, if I’ve been clear, there should be the beginnings of a paradox here. This comes from the “universal probability bound” set forth by Dembski (roughly 10^-150). The reflexive interpretation of the preceding is that the information content of a tornado, which obviously forms by chance and far too often to be considered improbable, exceeds Dembski’s limit, thereby indicating a fundamental problem somewhere. But, in exploring this seeming paradox, I would suggest that useful things can be learned about the application of Dembski’s ideas to nature. I’ll offer two in the rest of this post.

A. First, I need to remind myself just what the “universal probability bound” is. It was not derived from information-based computations, but rather by estimating the likelihood of possible occurrences in the universe. The preceding suggests (to me, at least) that it may not be appropriate to equate the information content of a system with the probability of occurrence of an event. This probability needs to be estimated by other means, and depends on much more than the informational changes associated with an event.

As a simpler example, consider the information content of T-urf13. In NFL (No Free Lunch), Dembski argues that the information content of this protein, while large, is in and of itself below the “limit” one gets if the “universal probability bound” is equated with informational bits. However, the probability of this protein arising by chance in a population of maize plants is likely far, far greater than the informational estimate would suggest. In contrast, the probability of finding a milligram of T-urf13 on, say, the dark side of the moon is far, far less than the “universal probability bound”. In other words, circumstance and pathway are of paramount importance when thinking of probability, while inherent information content is almost irrelevant.

Put another way, an information content of 3 million bits (roughly that of a tornado, if one grants my back-of-the-napkin arithmetic), while in excess of the limit one would get if one equates the “universal probability bound” with information, is not really complex, since an event generating such a quantity of information can and does occur “by chance”, and frequently. IOW, complexity is not determined by information content, but by other considerations.

B. Which brings me to a second point. Usually (in my reading, at least), information content is reflective of the informational entropy of a system. Entropy, in turn, is usually taken as a state variable: the informational entropy of, say, a protein is independent of the pathway by which the protein originated. The preceding indicates that complexity does not share this property. It follows (at least to me) that the property “complex specified information” (CSI) is not a state variable, and thus should not be rigorously equated with information per se. I would suggest that a better analogy to be used here is that of thermodynamic work. Work is a property that is pathway-dependent: the amount of work obtained in going from state A to state B is determined as much by pathway as by the inherent thermodynamic properties of the initial and final states (although the poises of the state variables do affect the work that can be done). It seems (naively, to be sure) that CSI would be better defined in terms of some sort of informational “work”, rather than inherent information content. (This would take into account the pathway dependence of the assignment of complexity, as indicated in the preceding.)
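(To make the state-variable/pathway distinction concrete with a standard textbook case, my illustration rather than part of the original essay: let an ideal gas double its volume at constant temperature by two different routes, a reversible isothermal expansion and a free expansion into vacuum. The entropy change, being a state variable, is identical either way, while the work extracted depends entirely on the path:

$$\Delta S = nR\ln\frac{V_2}{V_1} = nR\ln 2 \quad \text{(both pathways)}$$

$$W_{\text{reversible}} = nRT\ln\frac{V_2}{V_1} = nRT\ln 2, \qquad W_{\text{free expansion}} = 0$$

That is exactly the sense in which a pathway-dependent informational “work” would behave differently from inherent information content.)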

I’d like to elaborate on the concept of informational work, but I am woefully ill-equipped to explore the idea in much depth. Perhaps other participants here can fill in some of the blanks.


As I said, just some fanciful thoughts. Enjoy.

Thanks for sharing the extended quote, Art. I would like to read the passage in its original context, but I have not been able to find it by googling or binging. Could you share a link?

Thanks,
Chris

I know this was a rhetorical question, but it looked like a fun math question.

If we assume all mutations are equally likely, then there are 3 possible substitution mutations at each base. The strain of E. coli Lenski used has a 4.6-megabase genome, so there are 13.8 million possible substitution mutations in total. The chance of any one specific mutation is 1 in 13.8 million. The chance of a specific set of 600 mutations is 1 in 13.8 million to the 600th power, i.e. 1 in 8.46e+4283. That’s such a large number, and so improbable, that most ID/creationists would claim it couldn’t occur, yet it did.
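The arithmetic is easy to reproduce (a minimal sketch using only the figures above; the ~4.6 Mb genome size and the equal-likelihood simplification are as stated):

```python
import math

genome_size = 4.6e6              # E. coli genome used in the LTEE, base pairs
possible_subs = 3 * genome_size  # three substitutions per base: 1.38e7 total

# Odds of any one specific substitution, assuming all are equally likely
print(f"one mutation: 1 in {possible_subs:,.0f}")

# Odds of 600 specific substitutions: (1.38e7)^600. Use logs to avoid overflow.
log10_odds = 600 * math.log10(possible_subs)
print(f"600 mutations: 1 in 10^{log10_odds:.2f}")  # 10^4283.93 ~ 8.46e+4283
```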

1 Like

That’s something I wrote a long, long time ago in a galaxy far, far away (for the old Access Research Network boards, IIRC). It is pretty much the same as I posted.

1 Like

That’s not thermodynamics. When you clean a messy bedroom you are not decreasing the entropy of the bedroom by arranging everything in a certain order.

That external agent need not be an intelligent being.

Oh, dear. If that were so, then how do you explain the temperature gradient from the poles to the equator here on Earth? That is a constant increase in order, and it happens spontaneously and naturally.

I would suggest that you become familiar with how entropy works. Perhaps you could show us how entropy is violated by mutagenesis or natural selection.

I’m afraid you are not quite getting it. You are looking in the wrong place. The reason why new organisms can arise and grow is because there is a system in place (environments, populations, reproductive systems, genomes, metabolic systems, etc.). You are claiming that system arose spontaneously, a process that would require astronomical self-ordering. We live in a universe where things tend toward disorder.

Both of which are examples of energy dissipation. Adding energy to a system does not enforce order. The number of unordered configurations astronomically dwarfs ordered configurations, by orders of orders of magnitude. That is an enormous entropy barrier, for which you’ll need quite a running start. Oparin said the origin of life problem would be solved very soon. That was a century ago. Here we are a hundred years later, and it has only gotten worse. One study put the probability of a reproducing system arising at less than 10^-1000.

It has been a century of waiting for the answer that was so confidently expected. It has been a century of head fakes and false claims. Here is this week’s:

NASA Study Reproduces Origins of Life on Ocean Floor

Evolutionists claim the entire biological world arose spontaneously, in spite of the empirical evidence. The theory has repeatedly failed, and it has just gotten worse. There is no explanation for the origin of life, and yet I’m the one who “hasn’t a clue” about what I’m talking about.

So you have moved the goalposts to the origin of life. Weren’t we discussing evolution? Evolution, just like plant growth, is also dependent on the existence of reproducing organisms.

There are literally millions of examples where that in fact takes place. Take salt water, boil it, obtain salt crystals. That’s a highly ordered state in those salt crystals.

Hurricanes are ordered dissipative structures that emerge in temperature gradients when heat energy dissipates from warm ocean waters into the atmosphere. A particularly famous example of an “ordered” dissipative structure is the persisting hexagonal arrangement of clouds on Saturn’s north pole. Why is it there? Basically because the Sun shines on Saturn.

Living organisms are dissipative, far-from-equilibrium structures. That should give you a clue.

The number of unordered configurations astronomically dwarfs ordered configurations, by orders of orders of magnitude. That is an enormous entropy barrier, for which you’ll need quite a running start.

Technically you really just need energy.

Oparin said the origin of life problem would be solved very soon. That was a century ago.

We’ve waited two millennia for Jesus.

Didn’t the rapture happen in 2012? Perhaps we should just be a bit skeptical about grandiose popular press articles in general, and claims of prophecy, wouldn’t you say?

2 Likes

Yeah, the title is misleading. But it’s a popular press release. It’s meant to grab your attention. Compare the popular press article to the actual paper:

https://www.pnas.org/content/early/2019/02/19/1812098116

The DI has always been good at taking advantage of poor science journalism. As you can see, the authors of this study are making no head fakes or false claims. It just looks like more solid work from Russell and co.

2 Likes

It is a valid point that science journalism and scientific papers are often very different in their claims. The DI calling journalists out is of value.

Science journalism is often shoddy and prone to hyperbole. I was teaching a biology course for non-majors last year in which I “incentivized” combing through weekly and daily science reports. I had to explain why “scientists find a new organ called the interstitium!” was way off-base.

However, the DI tends to focus on misleading titles from non-science sources as evidence that the science itself is faulty. The vast majority of the time, it is not (see @T.j_Runyon’s example).

2 Likes

That isn’t what they are really doing, though. They use the press release to set up their argument, knock it down with the actual paper, and then go, “hah! More misleading and bad science from evolutionists!”, when in fact it was nothing but poor journalism.

5 Likes

HAH! Still repeating the same silly Creationist claim, and still not understanding (or ignoring) how endothermic chemical reactions work. Pity.

Gotta throw in the equally silly “it’s too improbable!!” Creationist canard too. :grinning: You’re nothing if not predictable, CH.

1 Like