9i. Gluten-free hosts have a metaphysical determinism problem through transubstantiation.
@PdotdQ, can you elaborate on what a “predictability” problem is?
I think it is most clear if I contrast a predictability problem and a determinism problem.
First, in physics one typically has some initial conditions (a misnomer; they don’t have to be initial), and one evolves them in time through an evolution equation.
In a sort of handwavy language:
-) A determinism problem is when the evolution equation itself does not prescribe what happens next.
-) A predictability problem is when in principle the evolution equation will prescribe what happens next, but using it to predict phenomena is frustrated in practice. For example, in special relativity, it is impossible to get the initial conditions required to evolve the system.
Also, I need to give a clarification (will edit above post) that QM can be made deterministic, but you have to give up locality.
Edit: I thought of perhaps a better description of the difference between a determinism and predictability problem:
-) Determinism problem - the evolution equation cannot evolve the initial condition in time
-) Predictability problem - the evolution equation will happily evolve the initial condition in time, but one cannot predict what the result of this evolution is
Thanks, this is very insightful. Can you explain a little bit more on the following:
Can you give examples of such systems with indeterminable initial conditions? When learning SR, we often solve standard kinematics problems that give us the relative velocities and positions of the objects in the system, which are enough to fully determine their trajectories in space-time. In that way I can’t see anything philosophically interesting about SR compared to classical physics with regards to determinism. So I suppose you are referring to something more complicated than this?
Can you expand on this as well? In the traditional Newtonian, machine-like picture of the world as I understood it, in principle everything ultimately proceeds from simple kinematics. Laplace’s demon is an illustration of this principle.
By the way, I came across this paper by John Earman (Aspects of Determinism in Modern Physics) which seems very relevant to the discussion at hand. Will be on my reading list soon.
@swamidass Thanks for moving the discussion to a separate thread, but after we sort this out I’d like to tie this back in to Eric’s original deterministic definition of teleology. From the fact that determinism in modern physics is far from given, I think there are significant problems for that definition.
This is a good introduction to this topic, and I would recommend this paper for every physicist’s reading list. It is written for a more general philosophy audience, so non-physicists should be able to follow the arguments as well.
This is much easier to explain with pictures, but I will attempt it.
The idea is simple - one does not have access to a large enough slice of the initial data surface to predict an event in SR. Here is why:
Imagine a spacetime diagram in flat spacetime. To predict the event at a point P, one has to collect initial data on a surface that spans the past light cone of P. However, this is impossible anytime before P! You can convince yourself by drawing on a flat-spacetime diagram a null cone emanating to the past from P. Now, imagine you are at some event Q in the past of P. The slice of the initial data surface that Q has access to (Q’s own past light cone) is strictly smaller than the one required to predict P (the past light cone of P).
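A toy calculation may make the geometry concrete. The numbers below are my own illustration (not from the thread): with units where c = 1, predicting event P at time T requires initial data on a slice at t0 out to radius T − t0, while an observer at an earlier time t_q has only received signals from radius t_q − t0 of that slice.

```python
# Toy sketch (my own numbers): units with c = 1, everything on one spatial axis.
# To predict event P = (T, x=0), one needs initial data on the slice t = t0
# covering every point inside P's past light cone: radius T - t0.
# An observer at Q = (t_q, x=0), t_q < T, has only received signals emitted
# on that slice from within radius t_q - t0 (Q's own past light cone).

def required_radius(T, t0):
    """Radius of the initial-data region needed to predict P = (T, 0)."""
    return T - t0

def available_radius(t_q, t0):
    """Radius of the slice actually accessible to an observer at (t_q, 0)."""
    return t_q - t0

T, t0 = 10.0, 0.0
for t_q in (2.0, 5.0, 9.9):
    gap = required_radius(T, t0) - available_radius(t_q, t0)
    print(f"t_q = {t_q}: missing a shell of width {gap:.1f}")
# The gap only closes as t_q -> T: the prediction is ready no sooner
# than the event itself occurs.
```

The point of the sketch is just that the gap `T − t_q` is positive for every observer strictly before P, which is the predictability failure described above.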
Classical mechanics without special relativity has been suspected to be non-deterministic since at least the 1980s. Indeed, Jeff Xia proved in 1992 that, through the gravitational interaction of 5 bodies, one can produce an acceleration in which an object is flung to infinity within a finite time.
Consider the inverse process: bodies suddenly appear from infinity, and there is no way to predict when they will appear. In the Earman paper that you found, this is known as a “space invader”.
@dga471, you can find the explanation of how predictability in SR breaks down in the Earman paper on page 1396, section 4.3. He has pictures, which makes it much easier to understand than the paragraph I wrote.
The space-invader solution is described on page 1385, section 3.6.
Overall, you’ve raised some very good points about problems of determinism and predictability even in classical physics. However, to get back to the original question of teleology: first, it seems to me that even if there are cases in classical physics (space invaders, Norton’s dome, etc.) which are not predictable, most other situations we encounter in daily life are in principle predictable and deterministic. (Unless you can show me that these problems are also present even in regular phenomena such as heat-seeking missiles.)
In contrast, indeterminism seems to be at the heart of quantum mechanics, such that literally nothing is 100% determined.
In any case, going back to @EricMH’s definition of teleology:
In the case of classical mechanics, most things have P(X)=1 (as in the Laplace demon picture of the world), except for the predictability edge cases listed by @PdotdQ. Thus, according to this definition, most things are intrinsically teleological.
In the case of quantum mechanics, nothing has P(X) = 1. Thus, nothing is teleological. And as @PdotdQ pointed out, this touches every model of the universe which includes QM, including QFT, quantum gravity, and combinations thereof.
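To make the “nothing has P(X) = 1” claim concrete, here is a minimal sketch of the Born rule for a single qubit. The example and numbers are my own toy illustration, not from the thread: for any genuine superposition a|0⟩ + b|1⟩ with both amplitudes nonzero, no measurement outcome has probability exactly 1.

```python
# Minimal sketch (my own toy example): Born-rule probabilities for a qubit
# state a|0> + b|1>. If both a and b are nonzero, neither outcome has
# probability 1 -- the outcome is genuinely undetermined.

import math

def born_probs(a: complex, b: complex):
    """Return (P(0), P(1)) for the state a|0> + b|1>, normalizing first."""
    norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
    a, b = a / norm, b / norm
    return abs(a) ** 2, abs(b) ** 2

p0, p1 = born_probs(1, 1)    # equal superposition: P(0) = P(1) = 0.5
print(p0, p1)

p0, p1 = born_probs(3, 4j)   # unequal superposition: roughly 0.36 vs 0.64
print(p0, p1)
```

Only the degenerate states (b = 0 or a = 0) give an outcome with probability 1, which is exactly why a definition of teleology requiring P(X) = 1 picks out nothing in QM.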
So, @EricMH’s definition of intrinsic teleology is either trivial (as it applies to most objects), or inapplicable (as it applies to no objects). In either case, it seems the definition is flawed.
I dispute this. Weather is not predictable. People are not predictable. Stocks are not predictable. All these are chaotic systems too. A large proportion of what we deal with in life is neither predictable nor deterministic from a human point of view.
Chaotic systems are an important example. They are, in principle, predictable. However not practically so, because we do not have perfect knowledge.
I think we should separate predictability and determinism. In the classical picture, in principle most things exactly follow Newtonian mechanics and we would be able to predict it if we had enough computing power. Thus, they are deterministic - they follow certain rules exactly. Chaotic systems in classical physics are also in principle deterministic in this way, even if they are not predictable.
Now, one could argue that if one can’t produce an equation that accurately predicts the behavior of a system, one has not shown that the system is actually deterministic. In other words, even if Newtonian mechanics assumes determinism (except for the edge cases), we have not proven that nature follows it exactly. Nancy Cartwright has argued against this sort of “physics fundamentalism” in her paper Fundamentalism and the Patchwork of Laws. I appreciate her argument, but it is an argument about nature and whether classical physics applies to it, not about classical physics itself. Here, however, we are assuming that classical physics applies to all of nature. (Even though we know that it doesn’t, because of QM and so on.) We have to use this assumption because otherwise there is no real way to calculate P(X); to calculate P(X) we have to assume some theory of nature.
Okay, so you are saying the weather is deterministic, but not predictable. Okay. However, I do not think we know whether people are deterministic or not. Even from a purely scientific point of view, it is possible quantum noise might be important in neural circuits:
Considering that we are social animals, just about our entire world is not predictable, and possibly not even deterministic.
You would also need the initial conditions for everything, or conditions at some starting point from which we could begin computing.
I’m inclined to disagree with that.
Newton’s mechanics is itself deterministic. But I don’t think there’s a required assumption of determinism.
I remember struggling with this several decades ago. The world does not look deterministic, even if only because human behavior does not look deterministic. I eventually came to a way of understanding Newton’s mechanics which did not imply determinism.
I agree that that is a tantalizing possibility. After all, once we put quantum mechanics in the picture, determinism goes out the door entirely. (Although, yes, we do not know if quantum indeterminacy at the micro (particle) level has non-trivial macro-level consequences.)
I have no idea whether people are deterministic or not from a theistic point of view. We would have to get into the issue of God’s foreknowledge vs. determinism, and libertarian vs. compatibilist free will. But my original intention of responding to this issue was regarding Eric’s probabilistic formulation of teleology. In that case, I believe he thinks it is also applicable to teleology in the things of nature, such as heat-seeking missiles. So people are not what we’re talking about.
I think we are saying the same thing. Newtonian mechanics is deterministic. We don’t know if nature is deterministic. Thus we are also saying that we don’t know if nature follows Newtonian mechanics exactly. (This is even without bringing QM into the picture.)
Can you explain that a little more?
I’ll disagree with that. We do know that there are non-trivial macro-level consequences.
There are numerous research papers on quantum indeterminacy. A published research paper is a non-trivial macro-level consequence.
That’s an interesting way to look at it. But those macro-level consequences involve human agents. And human agents, with free will and all the uncertainty and unpredictability that comes with that, are difficult to incorporate into physics as part of the theory itself. So I guess what I’m saying is that we don’t know if there are entirely natural, mechanistic macro-level consequences.
Let me add a quick comment on this:
Norton’s dome is indeed not an issue, because it requires very specific configurations. However, space invaders are problematic because one does not know when or whether the invader appears prior to it appearing. In other words, in a world with only classical mechanics without SR, there is no way to know that we are in a daily-life situation that is in principle deterministic.
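For readers who haven’t seen it, the standard presentation of Norton’s dome makes the “very specific configurations” point concrete. In suitable units, a point mass starting exactly at rest exactly at the apex obeys:

```latex
% Norton's dome: radial equation of motion for a mass at rest at the apex
% (units chosen so all constants drop out).
\ddot{r} = \sqrt{r}, \qquad r(0) = 0, \quad \dot{r}(0) = 0.
% This initial-value problem has infinitely many solutions: for any T \geq 0,
%   r(t) = 0                      for  t \leq T,
%   r(t) = \frac{(t - T)^4}{144}  for  t > T.
% (Check: \ddot{r} = (t-T)^2/12 = \sqrt{(t-T)^4/144} = \sqrt{r}.)
% The mass may sit at the apex for an arbitrary time T and then spontaneously
% slide off -- the evolution equation does not fix T.
```

Because this requires the mass to be placed exactly at rest at the apex of this particular dome shape, generic initial conditions never trigger it, which is why it is less worrying than the space invaders.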
As I mentioned in a previous post, if one gives up locality, it is possible to have a deterministic QM. One can also have an (in my opinion inelegant) deterministic relativistic QM. I believe the jury is still out on whether there is a field-theory extension of deterministic QM. However, giving up locality might run afoul of how one would evaluate Eric’s P(X)'s in practice. I am not sure on this point.
Chaos is an example where predictability can fail in the following sense: given any computational resolution, for a chaotic system I can come up with a configuration whose time evolution a computer with that resolution will get wrong.
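The claim above can be demonstrated with a standard toy chaotic system; the construction below is my own illustration, not from the thread. The logistic map x → 4x(1 − x) is fully deterministic, yet rounding the initial condition to any finite resolution eventually yields a completely wrong trajectory.

```python
# Toy demonstration (my own construction): the logistic map is deterministic,
# but any finite-resolution representation of the initial condition eventually
# gives the wrong time evolution, because small errors roughly double each step.

def logistic_orbit(x0, steps):
    """Iterate x -> 4x(1-x), returning the whole orbit including x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(4 * xs[-1] * (1 - xs[-1]))
    return xs

x0 = 0.123456789
x0_rounded = round(x0, 6)   # pretend our computer only stores 6 decimal digits

exact = logistic_orbit(x0, 60)
coarse = logistic_orbit(x0_rounded, 60)

for n in (0, 10, 30, 60):
    print(f"step {n:2d}: |difference| = {abs(exact[n] - coarse[n]):.6f}")
# The initial difference is below 1e-6, yet after a few dozen iterations the
# two trajectories are as far apart as two unrelated points in [0, 1].
```

Sharpening the resolution only delays the divergence: the error grows roughly like 2^n, so each extra digit of precision buys only a few more reliable steps.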
I agree with @dga471 that we should separate determinism and predictability. However, I believe that if either determinism or predictability fails, Eric’s P(X) program is not tenable. If determinism fails, P(X) is never 1, while if predictability fails, it is not possible to evaluate P(X) to test the theory.
Yes, it does. But they chop down a bunch of trees to make paper for printing those research papers. And that’s a macro-level consequence that physics can observe, though it still involves human agents.
Yes. Besides non-locality, one could ensure determinism with QM via the superdeterministic loophole. Basically, our choices are not truly free, even when conducting scientific experiments. Sort of like the quantum version of the Omphalos hypothesis. While this is extremely unpalatable from a purely scientific perspective (as it is unfalsifiable), it occurred to me that it is possible that from God’s POV something like this is the case.
If predictability fails for some cases but not others (while determinism still holds), one could just say that we can prove teleology exists for some systems but not others. I think that would be good enough for a lot of people.