Dembski: Building a Better Definition of Intelligent Design

I asked Perplexity AI the following question: Did Shannon use the concept of complexity in his theory of information? Below is the answer I got:

Shannon did not explicitly use the concept of complexity in his original theory of information. However, there is a deep connection between probability and complexity that is evident in Shannon’s information theory.
Shannon’s theory primarily focused on the transmission and communication of information, using concepts like entropy and channel capacity. He converted probabilities to bits, establishing a relationship between probability and information content. For example, an event with probability 1 in 2^100 corresponds to 100 bits of information.
The complexity aspect in Shannon’s work is related to the length of the bit sequences being transmitted. Longer sequences carry more information content and complexity, which affects how reliably they can be transmitted across noisy channels.
It’s important to note that the concept of complexity in information theory was more fully developed later by Andrei Kolmogorov. Kolmogorov complexity, the central measure of algorithmic information theory, quantifies the complexity of individual strings rather than of probability distributions, but it shares many formal properties with Shannon entropy.
While Shannon laid the groundwork, the explicit incorporation of complexity into information theory came after his seminal work, with later developments building upon and extending his ideas.
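
To make the probability-to-bits conversion concrete, here is a minimal Python sketch of Shannon’s information content, sometimes called surprisal, I(p) = -log2(p); the function name is mine, chosen for illustration:

```python
import math

def information_content_bits(p: float) -> float:
    """Shannon information content (surprisal), in bits, of an event with probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must be in (0, 1]")
    return -math.log2(p)

# The example from the answer above: probability 1 in 2^100 -> 100 bits.
print(information_content_bits(2 ** -100))  # 100.0
```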
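Kolmogorov complexity itself is uncomputable, but the length of a compressed encoding of a string gives a rough upper bound on it. A small sketch of that idea, using Python’s standard zlib module (the helper name is mine):

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """zlib-compressed length of data: a crude upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

# A highly repetitive string has a short description, so it compresses well...
print(compressed_length(b"ab" * 500))       # far fewer than 1000 bytes
# ...while typical random bytes have no shorter description and barely compress.
print(compressed_length(os.urandom(1000)))  # close to (or above) 1000 bytes
```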

Bottom line: Dembski is correct to connect complexity with probability. In Shannon’s framework, the lower an event’s probability, the more bits of information it carries.