Information in 10 Coins on a Desk

Shannon’s paper had nothing to do with thermodynamics. Thermodynamics was a mature science by the time Shannon wrote his paper.

Shannon’s paper is a mathematical treatment of communication. It was the foundation of Information Theory.

Information Theory and Thermodynamics are linked by the concept of entropy, which is exactly the same in Thermodynamics and in Information Theory.

S = k ln Ω
H = -Σ p_i log2 p_i

S and H are the same thing measured in different units.
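To make “different units” concrete, here is a short sketch of the conversion, assuming a system of Ω equally likely microstates:

```latex
% With \Omega equally likely microstates, each p_i = 1/\Omega:
H = -\sum_{i=1}^{\Omega} \frac{1}{\Omega}\log_2\frac{1}{\Omega} = \log_2 \Omega
% Boltzmann's entropy for the same system:
S = k \ln \Omega = (k \ln 2)\log_2 \Omega = (k \ln 2)\,H
% So one bit of H corresponds to k \ln 2 \approx 9.57\times10^{-24}\ \mathrm{J/K} of S.
```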

Excellent! I do believe that is what I have been saying all along. p has nothing to do with the content of the message; it has to do with the probability that the given message would be selected, given a probability distribution of possible messages.

As such, H is not a measure of “information content” of a specific message. So what is H a measure of? Can we agree that H is a measure of something to do with the possible messages rather than the “content” of any particular message?

When someone writes -log p, that is a reference to the probability of a specific outcome, -log p_i. That’s the way I interpreted Joshua’s comment. Are you agreeing with my interpretation of his comment?

No shit. p or W is just an indicator of the state the system is in. You can assign or calculate the probability of that state.

Yes, H is the measure of the information content of a specific message.

No, we can’t agree. H is the value of the entropy of a particular message or the present state of a system.

By taking the log of the probability of a particular state, we assign a number, in a particular set of physical units, to the state or the message. If the log is in base 2, the units are bits.
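For a concrete sketch of that assignment (this is the standard surprisal, -log2 p; the function name is mine):

```python
import math

def surprisal_bits(p: float) -> float:
    """Surprisal of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

print(surprisal_bits(1 / 2))  # 1.0 bit: a fair-coin outcome
print(surprisal_bits(1 / 8))  # 3.0 bits: one of 8 equally likely states
```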

I absolutely agree with you. Information Theory provided a different perspective on how to conceptualize the meaning of entropy. And of course, this extended to how we could conceptualize the Second Law of Thermodynamics. This was all presaged by Boltzmann. The 2LoT does not preclude improbable states.

Try this.

I have a coin on my desk here. How many states can it be in? Either heads or tails. How much information is contained in the state of this coin? Answer: log2(2) = 1 bit. How many bits of information do I need to send to you to tell you the state of the coin? 1 bit.

Now I have a hundred coins on my desk. How many states can these coins be in? 2^100 different states. How much information is contained in the states of these coins? Answer: log2(2^100) = 100 bits. How many bits of information do I need to send you to tell you the present state of these 100 coins? Answer: 100 bits.
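A minimal sketch of that count in code (the function name is mine; fair, independent coins assumed):

```python
import math
from itertools import product

def bits_to_name_state(n_coins: int) -> float:
    """Bits needed to single out one of 2**n_coins equally likely states."""
    return math.log2(2 ** n_coins)  # equals n_coins exactly

print(bits_to_name_state(1))    # 1.0 -> one coin, one bit

# For three coins the 2**3 = 8 states can be listed outright:
for state in product("HT", repeat=3):
    print("".join(state))

print(bits_to_name_state(100))  # 100.0 -> a hundred coins, a hundred bits
```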


No, Information Theory does NOT provide a different perspective on the meaning of entropy; it is the realization that entropy and information content are the same quantity. Note that E = mc^2 means that matter and energy are the same thing.

Conceptualizing the Second Law of Thermodynamics in the language of Information Theory is not very useful. If I quoted the output of my furnace in bits instead of BTUs, it would be confusing.

It doesn’t include impossible states. Improbable states: yes. Impossible states: no, since log 0 is not defined.


Can the coin stand on its edge, such that the state it is in is neither heads nor tails? Can it be a two-headed coin? Can the experiment of tossing the coin be biased towards heads or tails? Perhaps it is not a “fair” coin toss.

You repeat here the same error you made when talking about a message consisting of only two characters, 0 and 1. In order to measure the “entropy” you have to know the probability distribution.

Can we agree about that?
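To see the dependence on the distribution, here is a minimal sketch (standard Shannon entropy; the function name is mine):

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p) in bits; p = 0 terms contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0    bit: fair coin
print(entropy_bits([0.9, 0.1]))  # ~0.469 bits: biased coin
print(entropy_bits([1.0, 0.0]))  # 0.0    bits: outcome known in advance
```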

No, I won’t allow that. I designed my system so that all the coins are lying flat.

No, I don’t have any two-headed coins.

I am not tossing the coins. They are lying here on my desk, undisturbed for all time.

All fair coins, untossed.

So if I told you that all 100 coins are heads, would you be surprised?

I hope not, because I just told you that the state of the coins was all heads. I can send you a message a hundred bits in length to completely describe the present state of the coins.

How many possible states of these 100 coins? 2^100, which is a huge number. If I picked up these coins and laid them down again, a different state would appear. I would need to send you another 100 bits to tell you the state of all the coins.
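For illustration, here is one such message in a minimal sketch (encoding heads as 1 is my own choice):

```python
import random

# One lay-down of 100 fair coins: a uniformly random state,
# described completely by a 100-bit message.
state = [random.choice("HT") for _ in range(100)]
message = "".join("1" if c == "H" else "0" for c in state)
print(message)  # 100 bits; every re-lay needs a fresh 100 bits
```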


Irrelevant. You told me they were all heads. You told me that there was no coin tossing involved. Probabilities don’t enter into it.


No, I would not be surprised. Why should I be surprised?


I told you the state of the coins at a particular time. Probabilities don’t enter into it. All heads is just one of 2^100 possible states. All heads is no rarer or more improbable than any other state. All states have an equal probability of occurring.
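A quick numerical check (my own arithmetic):

```python
# Every specific configuration of 100 independently laid fair coins,
# all-heads included, has the same probability: 2**-100.
print(0.5 ** 100)  # ~7.89e-31
```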


That is correct, @Patrick.

Though it goes too far to say that Shannon’s paper has nothing to do with thermodynamics. He does note the mathematical equivalence.

Also, if we include the effect of a compression algorithm, we have to consider the expectation of the message size; but in the case you are explaining, the answer is exactly right.
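A rough sketch of that expectation (my own numbers, assuming independent coins; the function name is mine):

```python
import math

def entropy_per_coin(p_heads: float) -> float:
    """Entropy of one coin with P(heads) = p_heads, in bits."""
    q = 1 - p_heads
    return sum(-p * math.log2(p) for p in (p_heads, q) if p > 0)

# Uniform case: 100 fair coins carry 100 bits of entropy, so no lossless
# compression scheme can beat 100 bits on average.
print(100 * entropy_per_coin(0.5))  # 100.0

# Biased case: the expected compressed size can drop below 100 bits,
# even though any single message might still be long.
print(100 * entropy_per_coin(0.9))  # ~46.9
```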

@mung if you can’t understand why he is correct, you are missing something big.

Probability does enter into it. All states are equally probable, by definition of the problem as you laid it out.

Put another way: a shuffled deck of cards has the same thermodynamic entropy as a deck ordered by rank and suit… but there’s a big difference between the two at a poker table.
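Translating that back to the coins, as a sketch (my own illustration): every microstate is equally likely, but macrostates like “about half heads” are not.

```python
from math import comb

p_microstate = 0.5 ** 100       # same for all-heads as for any other exact sequence

# Macrostates differ wildly: exactly one microstate is all-heads,
# while ~1.01e29 microstates have exactly 50 heads.
print(p_microstate)                  # ~7.89e-31
print(comb(100, 50) * p_microstate)  # ~0.0796: that macrostate is common
```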
