What Is Entropy?

Please take a moment to consider what you think entropy is. What is entropy, really?

Please read the following article.

What did you think entropy was before you read the article?

What does the author of the article say entropy is?

Did the article help you understand what entropy is, and how?

Did the article help you understand how information theory is connected to thermodynamic entropy? If so, what in the article helped in that regard?

2 Likes

I found it helpful to understand the connection to physics. I never get to see that side of the math, so the physical interpretation is not intuitive for me.

1 Like

I tend to have a more practical view in my work in molecular biology. I usually lean on the loose “entropy limits how much energy is available to do work” framing, and it gets me by, whether it is cooling agar plates or driving reactions with higher-energy phosphates (e.g. ATP).

1 Like

I liked this comment from the article:

So there you are, that’s my little tour of entropy. I hope you are feeling suitably confused.

2 Likes

Entropy is not relevant to Evolution… let alone to Geneal.Adam.

I imagine most of us have heard the following anecdote. But, still, how can anyone claim to understand entropy better than von Neumann? I mean, the guy has a type of entropy named after him!

FWIW, I understand the article is targeted at people who know little about entropy in its various guises, and with that proviso it does a reasonable job of explaining the various concepts and how they are linked.

Plus it takes a stab at differentiating surprise (AKA surprisal), missing information, and entropy. Though not all might agree with how the author does that.

2 Likes

Of course, I would be one of those people. :slight_smile:

Take the game of craps. I am no more surprised when a seven is rolled than I am when snake-eyes is rolled, in spite of the difference in probability between the two outcomes. Now what would surprise me is if snake-eyes repeatedly came up six times as often as sevens.
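
For what it’s worth, here is a quick sketch (mine, not the article’s) of what the -log p “surprisal” assigns to those two outcomes, assuming fair dice and base-2 logs:

```python
import math

# Probabilities of two totals with a pair of fair dice
p_seven = 6 / 36       # six of the 36 equally likely pairs sum to 7
p_snake_eyes = 1 / 36  # only (1, 1) gives snake-eyes

# Surprisal (self-information): -log2(p), in bits
print(f"seven:      p = {p_seven:.4f}, surprisal = {-math.log2(p_seven):.2f} bits")
print(f"snake-eyes: p = {p_snake_eyes:.4f}, surprisal = {-math.log2(p_snake_eyes):.2f} bits")
```

So -log p says snake-eyes should feel about 2.6 bits more surprising than a seven, which is exactly the claim I am pushing back on.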

You may enjoy this:

Huh. Did you ever hear of Lewis’s principal principle? (Edit: improved philosophical reference)

In any event, it’s a good thing DVD makers and telephone companies were not relying on you to invent a theory of information.

1 Like

@mung I would like to invite you to New Jersey to a private craps party I am hosting in your honor. You can bet on snake eyes and I will bet on sevens all day long. How much money do you have to test your understanding of probability? :sunglasses::rofl: Note that the probability of snake eyes is 1/36 and the probability of a seven is 6/36, so I should clean you out before lunch. But since I am a good sport, I will buy you lunch, as you won’t have any money left.

Silly Patrick :slight_smile:

We each bet a dollar. You give me thirty-seven dollars if snake-eyes comes up. I give you six dollars if seven comes up. I’ll be surprised if I run out of money before you do.
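
Here’s a back-of-the-envelope check of those terms (a sketch, reading them as: you pay me $37 whenever snake-eyes hits, I pay you $6 whenever a seven hits, and every other roll is a push):

```python
from fractions import Fraction

p_snake_eyes = Fraction(1, 36)  # only (1, 1)
p_seven = Fraction(6, 36)       # six of the 36 pairs total 7

# My expected net per roll under the proposed payouts
expected_net = 37 * p_snake_eyes - 6 * p_seven
print(expected_net, float(expected_net))  # 1/36, about +$0.03 per roll in my favor
```

That works out to about a dollar every 36 rolls in my favor, so I would not expect to be the one buying lunch.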

This mixes in an expected value and a (-log) probability concept. Was that intentional? I know, I probably shouldn’t ask. And I guess Patrick sort of started it.

I edited my first post responding to yours because I originally referred to something similar to what Patrick proposed: the Dutch book argument, which says you should construct your subjective probabilities so that they satisfy the probability axioms. Under that constraint, I think it is a consequence that in a series of hands of a game of chance, you should align your probabilities (and bets) with the overall probabilities of each hand.

But if there is just one trial, then that argument would not apply. That’s where the principal principle comes in.

Still, if we use “surprise” to refer to a personal feeling, then of course your personal feeling need not track -log p. I think you are saying that your personal feeling of surprise is a step function, not something continuous like -log p (see the toy sketch at the end of this post).

But then using ‘surprise’ in that way for Shannon’s work is the same as using ‘information’ in its colloquial sense in the context of that work. That’s why surprisal is a better term. Like ‘entropy’, it’s not so easy to anthropomorphize.

Too bad von Neumann didn’t think of that word, too. You might have thought he would have. After all, he was a calculating machine!
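
Here is the contrast I have in mind, as a toy sketch (the 0.05 cutoff is made up purely for illustration, not anything from Shannon or the article):

```python
import math

def surprisal_bits(p):
    """Shannon-style surprisal: continuous in p and unbounded as p -> 0."""
    return -math.log2(p)

def step_surprise(p, cutoff=0.05):
    """A toy 'personal' surprise: flat until p drops below some cutoff."""
    return 0.0 if p >= cutoff else 1.0

for p in (6 / 36, 2 / 36, 1 / 36, 1 / 1000):
    print(f"p = {p:.4f}: surprisal = {surprisal_bits(p):5.2f} bits, step = {step_surprise(p):.0f}")
```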

1 Like

I think he does present it as a personal feeling.

I think my point is that ‘surprisal’ seems like something subjective. And of course “average surprisal” seems like something utterly mysterious. In his dissatisfaction with “disorder” he’s replaced it with something equally unsatisfactory (imo).

It [disorder] has a level of subjectivity that the other physical quantities don’t.

Given some probability of the occurrence of an event, should you and I both be surprised by the same amount if that event takes place?

Note: There are of course, as I am sure you know, those who interpret Shannon entropy as an expected value. Whether they do that with the individual probabilities of the individual events, I’d have to go back and look.

My point to Patrick was that he was proposing that we wager without saying what we would bet or be paid. Is he saying that we should wager based on my lack of surprise at seeing snake-eyes, or based on my surprise if snake-eyes shows up more often than a seven? If the latter, then I would truly be surprised and would gladly pay him to see it, but I’d request that the game take place in a Las Vegas casino with casino dice. :slight_smile:

I couldn’t help but notice the formula -log(p). It looked suspiciously familiar. It was missing an H = on the left-hand side of the equation. Is that because he and @swamidass are talking about two different things and just happened to be using the same notation?

In his Figure 3 can we replace the S on the left with an H? It is, after all, Shannon entropy, and Shannon used an H. Am I mistaken?

So we have H = - p1 log(p1) - p2 log(p2) - … - pn log(pn) = -log p

Do I have that right?

Putting my IT security hat on: entropy is a measure of how long it would take someone to crack your password.
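
A rough sketch of that view (assuming each character is drawn uniformly and independently from the character set, which real human-chosen passwords usually aren’t):

```python
import math

def password_entropy_bits(length, charset_size):
    """Entropy in bits of a password of `length` characters drawn uniformly
    and independently from a charset of `charset_size` symbols."""
    return length * math.log2(charset_size)

print(password_entropy_bits(8, 26))   # lowercase letters only: ~37.6 bits
print(password_entropy_bits(8, 62))   # upper + lower + digits: ~47.6 bits
print(password_entropy_bits(12, 95))  # printable ASCII: ~78.8 bits
```

Each extra bit roughly doubles the expected brute-force effort, which is the sense in which entropy tracks cracking time.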

I agree that the -log p is motivated by appealing to subjective/personal surprise. I think it is a fair approach for the intended audience, but it can be misleading in a formal setting. That’s why I’m glad the term ‘surprisal’ comes across as less subjective.

Yes, Shannon used H in his 1948 paper, and it’s the same formula as the average-surprise equation in the article. But the -log p is the surprise, which is also called the ‘information gain’.
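
A small sketch of how the two fit together (my own numbers, base-2 logs, using the two-dice totals from earlier in the thread): H is the probability-weighted average of the per-outcome surprisals, and it collapses to a single -log p only when every outcome is equally likely.

```python
import math
from collections import Counter

# Distribution of the total of two fair dice
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
probs = {total: n / 36 for total, n in counts.items()}

# Per-outcome surprisal, i.e. the -log p "information gain" of Figure 1
surprisal = {total: -math.log2(p) for total, p in probs.items()}

# Shannon entropy H: the probability-weighted average surprisal
H = sum(p * surprisal[total] for total, p in probs.items())
print(f"H(two-dice total) = {H:.3f} bits")          # ~3.27 bits

# Only for a uniform distribution does H reduce to a single -log p
print(f"H(uniform over 36 pairs) = {-math.log2(1/36):.3f} bits")  # ~5.17 bits
```

In other words, the sum is H (the average), while -log p is the surprisal of a single outcome; writing them as equal only works in the uniform case.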

The article is imprecise in how it uses “information gain”. Figure 1 says information gain is -log p (i.e., surprisal). Then in the section “Average Surprise”, the “Average Information Gain” (emphasis added) is defined as the usual Shannon entropy sum. But in the sentence following Figure 2, the article says “what we are talking about is how much information we have before and after the experiment.” Does that ‘information’ refer to a single experiment, as in the initial definition of information gain as surprisal? Or does it refer to the average information gain, as in the immediately preceding figure and the definition of Shannon entropy? Or both?

Some people may be disturbed by that seeming imprecision. In fact, given how emotional some people get when they think something on the internet is wrong, the existence of a thread devoted to arguing about that imprecision, perhaps on this very forum, would not be a surprise.

2 Likes

Given which character set?

@Mung

There are two errors in the entropy article:

  1. Evolution can be in any direction… and still be evolution. Evolution does NOT require an increase in complexity. In fact, except for the obvious one-cell vs. multi-cell kind of distinctions, there isn’t even a way to reliably MEASURE complexity.

  2. When the sun shines energy down on a planet for 5 billion years (for our purposes, an eternal SOURCE of energy), entropy is not a relevant objection to Evolution.

1 Like