The Information Paradox of Black Holes

The point of my previous argument with the box is that black holes are indeed just a “boundary condition” in the sense that we cannot measure the whole system. However, we know exactly what happens to entropy given such a boundary.

No. Sabine is talking about the information paradox. As I mentioned before:

We are not there yet! First we have to agree on the 2nd law generalized to black holes before we can say anything about the information paradox. Also, I will repeat: the thermodynamic formulation of horizons is not unique to black holes; it applies to all causal horizons in general relativity. Not every causal horizon is equipped with a singularity (and neither is every singularity equipped with a horizon, but that’s a different matter).

By the way, I also have a lot of things to say about the Maudlin paper that Sabine was complaining about in the post. My complaints are different from Sabine’s, but I think it is safe to say that many physicists do not like that paper.

Before I waste any more of your time, I need to read up some. Do you think this is a good source? If not, what can you recommend?

I have not read this particular post, but Prof. Strassler is usually a good source.

“The equation S = k log W + const appears without an elementary theory — or however one wants to say it — devoid of any meaning from a phenomenological point of view.”

— Albert Einstein (1910)

Do you have any comment on the position Einstein shared regarding Boltzmann’s entropy equation?

Is Einstein’s view in 1910 authoritative for our current understanding? He never came around to accepting quantum mechanics, because “God does not play dice.” Should we now reject QM?

Of course not, and I’m hoping that is not what you are arguing here.

In classical statistical mechanics, the important quantity is not the entropy S, but the number of states W. All of statistical mechanics can be rewritten with W instead of S.

Entropy is just a change of variable to make W easier to work with. All changes of variables are mere mathematical processes that are “devoid of any meaning”. This makes perfect sense if the quote is truly Einstein’s: in his theory of general relativity, the idea that changes of variables are devoid of physical meaning plays an important part.

This does not mean that W and S are not real things. The nature and ontological reality of W is pretty clear. What is devoid of any (physical) meaning is the switch to S. Whenever you see S in classical statistical mechanics, feel free to think of W instead. This does nothing to change our current understanding of statistical mechanics.
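
Concretely, the switch is an invertible change of variable, so nothing is added or lost in translation:

$$ S = k \log W \quad \Longleftrightarrow \quad W = e^{S/k} . $$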

As to why entropy, S, is used instead of W, it is just because the equations are easier to work with if one works with extensive properties. In order to turn W into the extensive S, observe:

Two systems with W1 and W2 states, when combined, have W1 times W2 states. However, extensive properties add, rather than multiply, when two systems are combined. Taking the logarithm gives S this desired additivity, which motivates the logarithm as the change of variable, as the one-line check below shows.
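
Written out with Boltzmann’s relation, the check is:

$$ S_{12} = k \log(W_1 W_2) = k \log W_1 + k \log W_2 = S_1 + S_2 . $$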

That helps. How is this practically used and measured?

For simple systems, such as the quantum systems used in producing lasers, the number of states is simply related to the number of quantum states of the atoms that produce the laser.

An even simpler system: two coins that can be either face up or face down have 4 states, (up up), (down down), (up down), and (down up).

For simple systems such as these, the number of states can be counted explicitly using combinatorial techniques (or, as with the two-coin example, by simply listing every possible state).
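
As a concrete illustration (a minimal sketch of my own, not something from the thread; the function name count_states is just illustrative), the brute-force listing can be done in a few lines of Python, with Boltzmann’s S = k log W as the only physics:

```python
# Brute-force microstate counting for n two-sided coins,
# then the entropy via Boltzmann's S = k ln W.
from itertools import product
from math import log

k_B = 1.380649e-23  # Boltzmann constant, in J/K

def count_states(n_coins):
    """Enumerate and count every (up/down) configuration explicitly."""
    return sum(1 for _ in product(("up", "down"), repeat=n_coins))

W = count_states(2)   # the two-coin example above: 4 states
S = k_B * log(W)      # k ln 4, roughly 1.9e-23 J/K
print(W, S)
```

For n coins this reproduces W = 2^n, which is exactly the combinatorial shortcut that replaces the explicit listing for larger systems.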

For more complicated systems such as the ideal gas, the idea is the same, but the techniques used to explicitly count the states are more difficult. For the ideal gas, this results in the Sackur-Tetrode equation.
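
For reference (quoting the known result rather than deriving it here), the Sackur-Tetrode entropy of a monatomic ideal gas of N particles of mass m, with internal energy U in a volume V, is

$$ S = k N \left[ \ln\!\left( \frac{V}{N} \left( \frac{4 \pi m U}{3 N h^{2}} \right)^{3/2} \right) + \frac{5}{2} \right] , $$

which is again just k log W, with the states counted in phase-space cells whose size is set by Planck’s constant h.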
