What is the ID Definition of Information?

This is a nice, easygoing discussion about entropy. @mung can learn from this. Thanks!

:wink:

2 Likes

I can help do so, but at the moment I am barely finding time to check in here now and then, much less commit to what seems like a large task.

1 Like

A quiz for @mung: How many bits of information are contained in Shannon's box (he made this box himself)? How many states (W) can Shannon's box be in? How much entropy (S) does the box contain? S = log W

1 Like

I love quizzes!

There are zero bits of information in the box.

Wrong. The switch can be in one of two states, on or off, so W = 2 and log 2 = 1 bit of information in the box (log base 2, since we are counting bits).
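
A minimal sketch of that calculation in Python (my own illustration; log base 2 is assumed so the answer comes out in bits):

```python
import math

# Shannon's box: the switch is either ON or OFF,
# so the number of distinguishable states is W = 2.
W = 2

# Bits needed to specify one of W equally likely states: S = log2(W).
S = math.log2(W)

print(S)  # 1.0 -> one bit
```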

Do you realize that this box is a museum piece?

1 Like

The box appears to be superfluous. The information is in the switch, and the switch is in the box; therefore the information is in the box?

ETA: There’s no information in the switch either.

Information is not physical.

1 Like

At least for quantum information, Scott Aaronson would disagree; see here: Is “information is physical” contentful?

To understand his argument properly, you would do best to read the whole linked post. But at the risk of earning the derision of those who know the physics, let me try to summarize it to pique your interest. Of course, you must look at the post for the details, especially if you want to criticize the argument on those details.

Aaronson starts by saying we can consider quantum information as real because:

  1. whether we have recorded such information affects the outcome of interference experiments
  2. Bekenstein showed that quantum information takes up physical space; trying to pack too much into a given volume leads to a black hole (the bound is sketched just below).
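
For reference, the bound Bekenstein derived is usually stated as follows (standard textbook form, my addition rather than a quote from the post):

$$ S \le \frac{2 \pi k_B R E}{\hbar c} $$

where $R$ is the radius of a sphere enclosing the system and $E$ is its total mass-energy; dividing by $k_B \ln 2$ converts the entropy to bits.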

He recognizes a skeptic may claim there is a category error somewhere: how can information influence the world? He gives a 12-step deductive argument to justify that it does, which I summarize as:

  • Physical information varies across space.
  • SR tells us that anything that varies across space varies across time for some observer.
  • QM says anything that varies across time carries energy.
  • GR says anything that carries energy warps spacetime.
  • GR further says that spacetime can only be warped so much before it collapses into a black hole.

So our best physical theories confirm that quantum information takes up space and has physical effects.

Sean Carroll adds an important philosophical point in the comments. He points out that information is like energy in that we can specify physical laws without either concept, but it is much more convenient to include them. He argues that this key role justifies including both of them in the ontology that our best physics provides.

I wonder how that ontological principle applies to Shannon information as used in biology. Is it a key part of stating any biological results? Or is it simply a concept applied after the fact, with no core role in the science?

I’m pretty sure information isn’t physical in the same way that numbers aren’t physical. Don’t you agree @BruceS?

1 Like

That’s not exactly true, is it? There is a minimum amount of energy needed to “flip a bit”, implying a physical connection.
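
Assuming that minimum refers to Landauer's limit of E = kT ln 2 per bit erased (an assumption on my part), a quick back-of-the-envelope in Python:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, K

# Landauer's limit: minimum energy dissipated to erase one bit.
E_min = k_B * T * math.log(2)

print(E_min)  # about 2.9e-21 joules per bit
```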

AFAIK: this connection does not preserve information.

Let’s say that the “bit” represents the answer to a yes/no question. Let’s say that the question is: Is information physical?

From the fact that it takes energy to flip the bit does it follow that information is physical? Isn’t the information in the answer to the question and not in the flipping of the bit?

So I think more has to be involved than just energy and bit flipping. The flipping of the bit must in some way involve meaning. Information is always information about something.

Information is just the logarithm of the count of possible states a system can be in: S = log W. I don’t see how the count of anything is physical.

I think (possibly wrongly) that there is energy bound up in the representation. Energy is not a count, so if I am right, then it’s still not identical to information.

Is there a physicist in the house?

How does this relate to the decade-long debate between Hawking and Susskind over the preservation of information at the event horizon of a black hole?

1 Like

It’s often argued that abstract entities like numbers cannot be considered real because they do not have causal influence on spatio-temporal reality.

On the other hand, in the article I linked earlier, Aaronson argues that in its role in QFT/GR, (quantum) information does have causal impact. So an argument similar to that used against numbers would not apply.

I’m not qualified to assess the quality of Aaronson’s argument; I’d suggest a call-out to the physicists if you think it would be of interest to them.

Sean Carroll, early in the comments, invokes the philosophical principle that any entity which plays an essential role in our science should be part of our ontology (it’s basically Quine’s approach). He says that information plays just as fundamental a role as energy, so if you consider one part of the ontology science dictates, then you have to include the other.

I’m fine with the philosophy, since I am a scientific realist. But whether it is fair to claim that information is just as indispensable to physics as is energy is another question best left to physicists or philosophers of physics.

I did not mention it, but Aaronson extends this idea to the information content claimed for thermodynamics reduced to SM. He does not appeal to Landauer’s Principle, which is what I am used to; instead I read him as extending the argument used for quantum information.

As a side note, I skimmed the other thread about Math and Physics. It too seemed at least to allude to Quine’s view: numbers are essential to our best science, hence we should include them in our ontology (Quine went with sets instead of numbers, since numbers can be defined using them).

1 Like

Another one best left for the physicists. My C$0.02 is that the black hole paradox relates to the calculations that imply information is lost when something falls into a black hole; if so, that contradicts the physics principle of conservation of information.

That seems to imply an assumption that information is physical. But since Aaronson did not reference it to buttress his argument, I’m no doubt missing something.

Which probably went without saying.

1 Like

One of the problems here is that there are many meanings for “information.”

The way I prefer to use it, “information” refers to something semantic. And that is necessarily abstract. In turn, we encode the semantic information into syntax (which I would call “syntactic information”). And the syntactic information is, like numbers, abstract, not physical. And then we encode the syntactic units into something physical – actual physical tokens of some kind. And I would be inclined to use “physical information” to refer to those physical representations of the syntactic information.

What physicists mean by “information” seems to be closer to what I would call “physical information”.

I am probably butchering this, but from what I remember in a Susskind piece . . . information is the current state of matter, energy, space, and time that can be used to determine what those things looked like in the past. For example, if you had perfect knowledge of a glass of water containing a diluted dye and of the environment around it, you could map the molecules and forces and reconstruct exactly where the drop landed in the glass of water, and when. It is this type of information that Susskind thinks is preserved in nature, including on the event horizon of a black hole.

2 Likes

How do we understand the word ‘about’? As Neil points out in his post, it’s easy to fall into the trap of a semantic interpretation which takes information as adding knowledge to some intelligent agent (or maybe as adding surprise, taken as a human reaction).

But that is not the correct way to interpret ‘about’ for scientific applications. Instead, we have to think about how the mathematics of the relevant type of information is interpreted in the scientific theory. Here ‘interpreted’ implies a formal mapping between the elements of the mathematics and those of the relevant science.

So consider Shannon information. It involves a random variable with a sample space and a probability distribution on that space.

Suppose we ask what Shannon information is “about” when applied to genomes. The answer could be that the sample space is the set of letters {A, G, C, T} and the probabilities are assigned from the mechanisms of biology*.
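
As a sketch of what that looks like in practice (the letter frequencies here are made up purely for illustration):

```python
import math

# Hypothetical nucleotide frequencies; real values would be
# assigned from the biology, as the footnote below says.
p = {"A": 0.30, "G": 0.20, "C": 0.20, "T": 0.30}

# Shannon entropy of the distribution, in bits per symbol:
# H = -sum(p_i * log2(p_i))
H = -sum(q * math.log2(q) for q in p.values())

print(round(H, 3))  # about 1.971; exactly 2.0 only if all four letters are equally likely
```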

Or suppose we ask about thermodynamics reduced to Statistical Mechanics. In this case, the sample space is the set of microstates, and the probabilities depend on what type of particles constitute those states and are supplied by the physics of those particles.

In the case of the Bekenstein bound mentioned in the Aaronson article I linked, I believe the mathematics of quantum information is needed. That is, qubits and von Neumann entropy. These have direct interpretations in QFT. There is then a link to standard Shannon info bits through measurement, but I think the richer concept of quantum information is needed for the correct statement of the bound. Reminder: I am not a physicist.
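
For reference, the von Neumann entropy mentioned above has the standard definition (my addition, not a quote from the article):

$$ S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho) $$

where $\rho$ is the density matrix of the quantum state; it reduces to the Shannon entropy of the eigenvalues of $\rho$, which is one way to see the link to ordinary bits through measurement.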



*Of course, consensus science and ID theorists disagree on what probability structure correctly represents the relevant biology.

1 Like

For me the Shannon measure is always about the probability distribution. So, for example, I would not be looking for Shannon information in the genome, or Shannon information about the genome. As a crude analogy, if you read a thermometer stuck into a turkey or a pie in the oven, does temperature become about turkeys and pies?

2 Likes