Side Comments: Is there really information being conveyed within a cell?

Just to be clear here, is there a direct equating of information with entropy in information theory and/or computer science? Shannon’s entropy is rightly called entropy, because it is the same quantity as Boltzmann’s, but I would hesitate to say either is the same as, or even monotonically increasing with, “information content” (however exactly one would quantify that). Here is why:

Entropy, in both formulations, is a measure of a macrostate’s phase space volume. In statistical physics, we could think of the macrostate as a tuple of state variables, like the temperature and pressure distribution over the volume of a thermodynamic system. There are many (many) ways in which all of that system’s particles can be arranged in terms of their actual positions and momenta (the microstate) to arrive at the same values for these macroscopic variables. All of these microstates “look the same” after a very conservative amount of coarse graining. Even if we insist that there is a particle at every position-momentum locus captured in a snapshot, the particles are interchangeable, so there are still N! permutations that produce exactly that state – and particle numbers N are on the order of 10²⁰ to 10³⁰ in realistic lab settings. Still, some macrostates correspond to more microstates than others. Given random fluctuations, it is then vastly more likely that a system should progress into a macrostate that occupies more of the phase space than into one that occupies less. This, in the understanding of statistical physics, is the second law of thermodynamics, and at no point in this explanation was there any need to introduce “information”, nor would it have helped if I did.
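To make the “number of ways” concrete, here is a minimal toy sketch (my own, in Python, not part of the original argument): for N two-state particles, the multiplicity of the macrostate with k of them “red” is the binomial coefficient C(N, k), and it is overwhelmingly peaked at the even mixture.

```python
from math import comb

# Toy system: N two-state "particles"; the macrostate is just the count k
# of particles in state "red". The number of microstates realising that
# macrostate is the binomial coefficient C(N, k).
N = 100
for k in (0, 10, 25, 50):
    print(f"k = {k:3d}:  {comb(N, k):.3e} microstates")

# C(100, 50) ~ 1e29 dwarfs C(100, 10) ~ 1.7e13: under random shuffling the
# system is overwhelmingly likely to end up near the evenly mixed macrostate,
# which is the statistical reading of the second law sketched above.
```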

In computer science, if we are going to define an amount of information by the length of a string of bits, we have done nothing to begin talking about entropy. The number of bits is the number of particles, and if we impose that it cannot even change, of course the amount of information won’t either. The microstate is whatever specific values all the bits have, and if we decrease the resolution of this picture far enough that we can no longer tell which of two neighboring bits is red and which is blue when looking at a purple locus, we are beginning to get back towards statistical mechanics. If in every time step a randomly chosen bit swaps places with one of its neighbors, the fluctuations of colour between locations will even out in the long run, and the shade of the system will settle into a mix of red and blue corresponding to the initial ratio of red vs blue bits. There are many more ways of producing noise that looks like that under a coarse enough graining than there are ways of producing noise with a lot of highly contrasting regions. This number-of-ways to get the macroscopic outcome – that is entropy. Again, there was no need to even mention information.
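A toy simulation of that swap dynamics (my own Python sketch, not from the post; the system size and swap count are arbitrary choices for illustration): the microstate keeps changing, but the coarse-grained block averages drift toward the global red/blue ratio.

```python
import random

random.seed(0)

# Start far from equilibrium: 30 "red" bits (1) followed by 70 "blue" bits (0).
bits = [1] * 30 + [0] * 70

def coarse_grain(state, block=10):
    """Average over blocks of bits -- the 'shade' a low-resolution observer sees."""
    return [sum(state[i:i + block]) / block for i in range(0, len(state), block)]

print("before:", coarse_grain(bits))

# One diffusion step: a randomly chosen bit swaps places with its right-hand
# neighbour. This conserves the number of red bits (unlike flipping), so the
# equilibrium shade everywhere is the initial 30/70 mixture.
for _ in range(2_000_000):
    i = random.randrange(len(bits) - 1)
    bits[i], bits[i + 1] = bits[i + 1], bits[i]

print("after: ", coarse_grain(bits))
# Before: blocks of 1.0 followed by blocks of 0.0 -- sharply contrasting regions.
# After: every block average fluctuates around the global 0.3 ratio; the
# coarse-grained picture has lost the initial structure even though no bit
# was ever destroyed.
```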

But not only is equating entropy with information unhelpful, it can, under quite reasonable interpretations, also end up misleading:

Consider the blurring process of bits described above. Sure, the information kept within the exact microstate is constant, but if we are talking about 10²⁰ bits, it becomes impractical to store them. We could instead store a string of region lengths, where a region is a contiguous string of bits that have the same value. In that sense, as entropy increases, the number of those regions rises. It becomes more expensive to keep the information about the microstate than it was while the regions were still long, before the diffusion. Of course, we are still storing the entire microstate, just in a different format, which eventually becomes as impractical as storing all the bits in the beginning.

So how about we do the coarse graining we normally would in physics, and sample large chunks of our bit string and store some statistical summary of each chunk – a hash sum, of sorts. Well, if our bit strings are very long, then in equilibrium the hash of one chunk will have the same value as the hash of another. Their distinctiveness only exists far from equilibrium, and it decreases as we approach equilibrium. Knowing the equilibrium state, we know almost nothing about the initial state anymore. Entropy has increased between the initial and the steady state, but in the process the state has become unspecific, and information about its past is altogether erased. And it is not that our storage scheme has merely discarded information that is still hidden away in the system. Recovering the correct initial state from an equilibrated one is outright impossible – the diffusion acts like a Gaussian kernel, smoothing out that information until there is none left in the long-term limit. That information is genuinely erased from the system, and any storage scheme that would retain it would have had to keep catching it as it was “evaporating” out of the actual string of bits.
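To illustrate the storage argument, a hedged sketch of the “region lengths” scheme (it is just run-length encoding; again my own Python illustration, with arbitrary toy sizes): as the bits mix, the number of runs grows from 2 toward roughly half the string length.

```python
import random
from itertools import groupby

random.seed(1)

def run_lengths(state):
    """The 'region lengths' scheme: one entry per contiguous run of equal bits."""
    return [len(list(group)) for _, group in groupby(state)]

bits = [1] * 50 + [0] * 50            # two regions: a very cheap description
print("runs before mixing:", len(run_lengths(bits)))   # -> 2

# Same neighbour-swap diffusion as in the previous sketch.
for _ in range(2_000_000):
    i = random.randrange(len(bits) - 1)
    bits[i], bits[i + 1] = bits[i + 1], bits[i]

print("runs after mixing: ", len(run_lengths(bits)))
# The run count climbs from 2 to around n/2 (roughly 50 here), so the
# "compressed" region-length description ends up about as bulky as the raw
# bit string -- the scheme only looks efficient far from equilibrium.
```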

In my opinion both understandings of information are reasonable, with no obvious favourite over the respective competitor. Where the former is essentially identical with entropy, the latter is essentially its inverse. This, I find, makes introducing the term without an explicit definition muddying, on top of unnecessary. I understand that the topic of this thread is information flow, but I find it nevertheless hasty to just pull in entropy as a stand-in, when entropy does have a consistent and unambiguous intuition-independent definition and interpretation across different fields. If the same can be said of information, I’ll be glad to hear it. As far as I know, there is a lot more debate and ambiguity over that term than there ever was about entropy.

1 Like

Yes they are the same quantity in different units: bits in computer science and Joules per degree Kelvin in thermodynamics.

There is no such thing as a “degree Kelvin”. The reason I asked was also not that I was hoping to read the claim re-stated, but that I was struggling and failing to find a confirmation for it. Still, thank you for attempting to help.

degrees Kelvin is temperature. 0 degrees Kelvin is absolute zero. If you are struggling with information theory and entropy try reading Shannon’s paper https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf

What you are referring to seems to be the “kelvin”, symbol K. The kelvin is a unit of (thermodynamic) temperature. There used to be a “degree Kelvin” before 1967. There is not anymore.

I was not struggling with information theory or with entropy. What I was struggling with was confirming that information and entropy have identical definitions in information theory. The paper you link does not confirm this explicitly, in that it does not provide a definition of “information” one could compare against the definition of entropy it provides. But there are at least some passages in which Shannon suggests that his entropy H is a more or less direct measure of the amount of information acquired when parsing a string of bits. For the purposes of this discussion, that is satisfactory. Thank you.
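For reference, the quantity in question from Shannon’s paper is H = −Σ pᵢ log₂ pᵢ, in bits per symbol. A minimal Python sketch of it (my own, just to fix ideas):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)), in bits per symbol (0 * log 0 taken as 0)."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair binary source carries 1 bit per symbol; a heavily biased one carries
# much less, and a perfectly predictable source carries none.
for p in (0.5, 0.9, 1.0):
    print(f"P(1) = {p}:  H = {shannon_entropy([p, 1 - p]):.3f} bits/symbol")
```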

What are you trying to say now?

Speaking of missing fossils, did passenger pigeons exist before they became extinct? Has anyone found a fossilized one?

We EXPECT all these discontinuities. It’s basic geology.

4 Likes

It’s hard for me to avoid the conclusion that, in many instances, those making these arguments realize the argument holds no water, but also realize that the targets of the con won’t know or care. Those targets being the likes of @colewd and @Giltil.

Similarly, is it really possible that no one in the Trump Administration realizes what an egregious lie it is to claim that Ukraine started the war with Russia? But they are going with it, regardless.

4 Likes

Heh. Someone on Facebook complained about the article that documents Trump’s over 30,000 lies or misleading statements during his first term as president, saying it would amount to an implausibly large number of lies/falsehoods per day.

Meanwhile Trump can lie anywhere between 1 and 30 times per Twitter/X or “Truth Social” (:face_vomiting:) post.

Count them.


Literally every single sentence but one in that post is a lie. You do have “a big, beautiful O[sic]cean as separation” that Ukraine and the rest of Europe doesn’t have.

Every other sentence contains one or more lies. I think there’s like 20-25 total in that post. He posts all the time on social media. He rants like a madman at press conferences. That’s how you rack up thirty thousand lies in 4 years.

4 Likes

This is wishful thinking, as the examples of Günter Bechly and Antony Flew show.

Of course it’s an exaggeration that the argument can have “…no traction, ever”. There’s always someone too badly motivated, biased—or sadly on occasion too stupid—to see through the smoke and mirrors.
Let’s recall that the world also contains flat earthers, geocentrists, and even believers in homeopathy, to name a few adherents of other similarly ridiculous propositions already known to be false.

With respect to the two people of your choice, Günter married into religion and then curiously lost his mind on the topic.
It’s a well-known and rather common occurrence that entering into a romantic relationship will make people drop their previously held principles and completely change their religious views to those of their newfound partner.

And there’s good evidence that Flew succumbed to his fear of death when he got old.

Both are stories as old as humanity. Neither testifies to any merit for the long-demonstrated-false arguments from biological information.

5 Likes

As you know, @Puck_Mendelssohn was referring to this specific laugh-out-loud non sequitur:

“something in biology is information and/or a code, therefore it has to have been put there by an intelligent agent”

I didn’t know that Bechly or Flew embraced that claim. Can you show us where either of them did?

6 Likes

C. S. Peirce, a giant in the field of semiotics, splits signs into three categories—Icons, Indexes, and Symbols—based on how they connect to what they represent:

  • Icons: Signs that represent by “likeness” or “resemblance” (he uses a portrait as an example).
  • Indexes: Signs that represent by “physical connection” or “causal relation” (like smoke for fire).
  • Symbols: Signs that represent by “convention” or “habit” (like words in a language).

According to this typology of signs, some of your images represent genuine symbols, others icons, whereas for some, it seems more difficult to decide whether they are symbols or icons. But the interesting point here is that tree rings are unambiguously indexes, not symbols, whereas codons lean strongly toward symbols.

We will never know what Peirce would have thought of codons, as he died decades before anyone conceived of them.

What vocalizations we assign tree rings, or codons, does not matter.

2 Likes

Codons would obviously be Indexes since it is by physical/causal connection through the properties of the translation system that they can be taken to represent amino acids.

Did you even read your own link?

symbols are defined by culture and do not need to resemble their mental concepts. There is no reason why blue, for example, is used to signify masculinity. It is simply tradition and convention.

By that definition the genetic code would not have any symbols, since the genetic code is not defined by culture, traditions, or convention.

Yawn. You really have too many posts on this forum that make these ridiculous appeals to authority or credentials.

Gonna deal with the evidence at some point instead of this endless cycle of appeals to dictionaries, definitions, and credentials?

2 Likes

Hardly. They seem to unambiguously be indexes:

1 Like

Why? How does a codon meet the definition of a symbol?

1 Like

Charles Peirce, an American philosopher writing in the 1800s, categorised the signs we use to communicate ideas with each other into three types: icon, index and symbol.

Genetic codes are not used by us “to communicate ideas with each other”, so clearly do not fall into this schema. Trying to shoehorn them into it anyway would seem to produce similarly erroneous results. Garbage In, Garbage Out.

2 Likes

Yes, and? It’s what I was challenged to do. Encode a message using tree rings. Just because I created this code doesn’t mean any and all codes require intelligent design.

Penetrating rebuttal.

So what?

A code doesn’t have to be actively used to be a code. If Morse code fell out of use it wouldn’t stop being a code. Completely irrelevant response.

It’s also irrelevant whether the code’s principles are easily elucidated or require a long period of investigation and research to comprehend. Makes no difference to whether it is a code or not, nor does it affect whether that code required intelligent design.

All your responses are blatant non sequiturs. Deal with the evidence.

1 Like