We Are Mystified by Eric Holloway

Yes, this is also correct. Due to the law of information non-growth, all order is evidence of ID, so any investigation of order is not consistently MN. Hence Hume's argument against induction and, more recently, Wolpert's No Free Lunch theorems, which are consistent applications of MN.

But, you are right, it is somewhat scientifically dissatisfying if this is all that ID claims, although such assumptions have motivated many of the major scientific breakthroughs (e.g. Newton's). Science requires some empirical way to differentiate hypotheses, and that is where the information non-growth theorem comes in. Insofar as you can quantify the existing mutual information in a system, the theorem says a naturalistic (chance + determinism) system cannot produce more mutual information than it already contains. On the other hand, if the system contains intelligent agents, then there can be a net gain in mutual information.
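As a rough illustration of the "no net gain" claim, the Shannon version of information non-growth is the data processing inequality: no processing of Y alone, whether deterministic or random, can carry more information about X than Y itself already does. Here is a minimal sketch of that inequality; the channel parameters and the `mutual_information` helper are my own illustration, not something from the post:

```python
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of Shannon mutual information I(X;Y) in bits
    from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts converted to frequencies
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

random.seed(0)
# X is a fair coin; Y copies X through a 10%-noise channel, so I(X;Y) > 0.
xs = [random.randint(0, 1) for _ in range(100_000)]
ys = [x if random.random() < 0.9 else 1 - x for x in xs]
# Z is produced from Y alone by chance + determinism: a further 20%-noise flip.
zs = [y if random.random() < 0.8 else 1 - y for y in ys]

i_xy = mutual_information(list(zip(xs, ys)))
i_xz = mutual_information(list(zip(xs, zs)))
print(f"I(X;Y) = {i_xy:.3f} bits, I(X;Z) = {i_xz:.3f} bits")
# Data processing inequality: downstream processing cannot add mutual information.
assert i_xz <= i_xy
```

The same cascade with any function of Y, however clever, gives the same result: I(X;Z) never exceeds I(X;Y), which is the sense in which chance plus determinism cannot manufacture new mutual information about X.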

While it is clearly not straightforward to test, there is at least a notion of empirical testability here, and the information theory it is based on has been applied in many practical scenarios (though the Shannon form is more widely applied than the Kolmogorov form).

To respond to another comment: when I say "information" I mean "mutual information". Yes, it is confusing. I could use the acronym MI throughout if that would reduce confusion.
