Neil, I think you never really responded adequately to the example I brought up in the thread about Newtonian mechanics: the g-factor of the electron. Your argument only works in cases where physics didn’t match up well with reality, and theorists added an extra term or concept to account for the discrepancy. But the g-factor is far more special. It is computed from quantum electrodynamics (QED), a theory that unifies special relativity and quantum mechanics to describe electromagnetism. So the ingredients for this theory were already present in the early 20th century. And sure, there are some parts of QED (such as renormalization) which you can chalk up to “physicists inventing new concepts to redefine reality”, and it took some time for physicists to develop QED. But apart from that, the theory was basically finished by 1948. At that point, Schwinger and Feynman had calculated the g-factor only to be
g_e/2 = 1.00116.
The calculation for this is (relatively) simple and elegant. We are taught it in a first quantum field theory course, so that even a mathematically limited experimentalist like me can redo it.
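In fact, that 1948 number is just Schwinger’s one-loop correction, g/2 = 1 + α/2π. You can check it in a couple of lines (I’m plugging in the CODATA recommended value of the fine-structure constant here, which is of course a modern value — any reasonable α gives the same five decimal places):

```python
import math

# Fine-structure constant (CODATA 2018 recommended value; any modern
# value of alpha reproduces the same 5 decimal places here)
alpha = 1 / 137.035999084

# Schwinger's 1948 one-loop result: g/2 = 1 + alpha / (2*pi)
g_half = 1 + alpha / (2 * math.pi)

print(f"{g_half:.5f}")  # 1.00116
```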
Fast-forward to 2008. A group of Japanese theorists calculated g_e with the same physics Feynman used in 1948, but with much more computing power, which allowed them to compute more terms in the infinite series of Feynman diagrams. They obtained
g_e/2 = 1.001 159 652 181 88 (78).
(The number in parentheses is the uncertainty in the last digits.) Then the electron g-factor was measured experimentally with incredible precision a few meters away from my office, by trapping a single electron in a Penning trap. (The physics of this experiment is also breathtakingly beautiful - they basically created a giant artificial atom that the single electron “orbited”.) The result was:
g_e/2 = 1.001 159 652 180 85 (76).
In other words, physics that was “invented” to explain the g-factor to 6 decimal places was found 60 years later to be accurate to 12 decimal places! I believe this is a prime example of the surprising ability of mathematical methods in physics to predict the natural world. Nobody needed to add additional terms or concepts to get from 6 to 12 decimals. I don’t know if it points to theism, but I find it difficult to explain this away as simply “physicists giving the right names to parts of nature.” Yes, they did that - but they did so using fairly simple, elegant mathematics, and they were immensely successful at it. I could imagine a world where we could only predict g_e to within an order of magnitude, even with centuries of trying. And sure, not all of physics is as precise as this. But even the existence of a single such case is close to miraculous to me.
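To make concrete just how few ingredients are involved: the whole calculation has the form of a power series, g/2 = 1 + C1(α/π) + C2(α/π)² + C3(α/π)³ + …, and merely the first three coefficients already land within about 10⁻¹⁰ of the measured value. A rough sketch (I’m quoting the literature values of C2 and C3 from memory, so treat them as approximate; the full 2008 calculation of course includes higher orders and more digits):

```python
import math

# Fine-structure constant (CODATA 2018 recommended value; an assumption
# on my part -- the precise choice barely matters at this order)
alpha = 1 / 137.035999084
x = alpha / math.pi

# First three coefficients of the QED series g/2 = 1 + sum_n C_n * x^n.
# C1 = 1/2 is Schwinger's exact one-loop result; C2 and C3 are analytic
# literature values quoted from memory -- treat them as approximate.
coeffs = [0.5, -0.328478965579, 1.181241456]

g_half = 1 + sum(c * x ** (n + 1) for n, c in enumerate(coeffs))

print(f"series through 3rd order: {g_half:.12f}")
print(f"difference from experiment: {g_half - 1.00115965218085:.1e}")
```

Three numbers, one constant, and you are already ten decimal places deep into agreement with a single electron in a trap.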