What you could prove is that there is some probability of generating mutual information from a random source. But this probability drops off exponentially with each bit added. Beyond that, nothing more can be proven.
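To make the drop-off concrete, here is a minimal counting sketch in Python (illustrative only; `n` and `k` are hypothetical parameters, not anything from the discussion above). There are at most 2^(n-k) - 1 descriptions shorter than n - k bits but 2^n strings of length n, so the fraction of n-bit strings compressible by k or more bits (the ones that could carry k bits of "free" mutual information) is strictly below 2^-k:

```python
# Counting argument behind the exponential drop-off (sketch).
# Descriptions shorter than n - k bits number at most 2**(n - k) - 1,
# while there are 2**n strings of length n, so the fraction of n-bit
# strings compressible by k or more bits is strictly below 2**-k.

def max_fraction_compressible(n: int, k: int) -> float:
    """Upper bound on the fraction of n-bit strings that have
    any description shorter than n - k bits."""
    return (2 ** (n - k) - 1) / 2 ** n

if __name__ == "__main__":
    n = 64  # hypothetical string length
    for k in range(1, 9):
        bound = max_fraction_compressible(n, k)
        print(f"compressible by {k}+ bits: at most {bound:.6f} (< 2**-{k} = {2**-k:.6f})")
```

Each extra bit of required mutual information halves the bound, which is the exponential decay referred to above.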
In my argument I'm referring to what you call theoretic algorithmic information. My formulation may be novel in its reliance on non-novel definitions of information, but in essence it is no different from the standard ID argument.
Regarding the definition of complexity, I define it as the Kolmogorov minimal sufficient statistic (KMSS). Only compressible bitstrings have a non-trivial KMSS; most bitstrings are incompressible, so their KMSS is trivial. And due to the law of information non-growth, I(X:Y) \geq I(f(X):Y), combining randomness with determinism doesn't help generate a KMSS any more than flipping a coin does.
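For reference, the two-part picture behind the KMSS and the non-growth law can be written out as follows (a sketch in standard notation; the probabilistic form of non-growth, which covers the "randomness plus determinism" case, is only stated here, with its usual exponential tail, not derived):

```latex
% Two-part description: a finite model S containing x plus x's index in S.
% S is a sufficient statistic when the two parts together are about as
% short as a shortest description of x; the KMSS is such an S of minimal
% K(S). An incompressible n-bit x is already captured by the trivial
% model S = \{0,1\}^n, so its KMSS carries no structure.
\[
  K(x) \;\approx\; \underbrace{K(S)}_{\text{model part}}
              \;+\; \underbrace{\log_2 |S|}_{\text{index of } x \text{ in } S},
  \qquad x \in S .
\]
% Non-growth: deterministic post-processing by a fixed computable f
% cannot increase mutual information, and adding randomness Z independent
% of Y can only do so with exponentially small probability (the bound
% below holds up to additive logarithmic terms in the exponent).
\[
  I\bigl(f(X):Y\bigr) \;\le\; I(X:Y) + O(1),
  \qquad
  \Pr\Bigl[\, I\bigl((X,Z):Y\bigr) > I(X:Y) + m \,\Bigr] \;\lesssim\; 2^{-m} .
\]
```

This is why the coin-flip comparison holds: any pipeline of computable steps and independent random inputs is covered by the two bounds above, so it gains a non-trivial KMSS only with the same exponentially small probability as pure chance.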