swamidass
(S. Joshua Swamidass)
November 10, 2018, 8:50pm
Dan_Eastwood:
In Dembski’s 2005 paper, he defines CSI in a way that is opposite to Kolmogorov Complexity. It’s a direct function of KC/KI, but what Dembski defines as MORE CSI implies LESS KC. What Dembski calls information is lack-of-information to the rest of the mathematical world.
The first part of this blog post is my summary of this version of CSI. Dembski has several definitions for it; this is the one I have studied in the most detail.
That is exactly right. It is now set up as a catch-22: if it is independent entropy, then it is design; if it is mutual entropy, then it is design too.
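Dan's point about CSI running opposite to Kolmogorov complexity can be seen with a toy sketch. This is my own illustration, not Dembski's calculation: it uses zlib compressed length as a crude upper-bound stand-in for Kolmogorov complexity. A highly patterned string, the kind a specification would pick out, compresses to almost nothing (low KC), while an incompressible random string carries far more algorithmic information.

```python
import random
import zlib


def kc_proxy(s: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity: zlib-compressed length."""
    return len(zlib.compress(s, 9))


# A highly patterned, "specified"-looking string: low algorithmic information.
patterned = b"ab" * 500

# An incompressible pseudo-random string of the same length (seeded so the
# sketch is repeatable): high algorithmic information.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))

print("patterned KC proxy:", kc_proxy(patterned))
print("random KC proxy:   ", kc_proxy(noisy))
```

The patterned string's proxy comes out a tiny fraction of the random string's, even though both are 1000 bytes long. In the standard algorithmic-information sense, the random string is the information-rich one; the "specified" string is information-poor.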
BruceS:
Thanks Dan. Do you know of an answer to the question I posed earlier in this thread to Dr. S, asking whether anyone has ever written a summary of all the information arguments put forward by the ID community, starting with the math, then moving on to the models and words used, the supposed biological consequences, and the replies from the consensus scientific community?
The closest we have now is here: Information = Entropy and Chance = Choice. Be sure to follow the link in the OP to BioLogos.
@BruceS, it seems you have some knowledge here. Perhaps we should start building that resource here on the forum. Is this something you and @Dan_Eastwood would work on with me?