The Argument Clinic

Yes, if “design” were used as a verb not a noun, it would need to be modified by an adverb not an adjective. Thus, you can say “I’m going to commonly design a toaster” – though even that sounds clumsy – and I’m fairly sure @Meerkat_SK5 didn’t mean “It is how I define commonly design, which is…”

All of which goes to show (i) that @Meerkat_SK5 has no idea what they’re talking about, and (ii) that they will quibble about every criticism – which in turn means that they will learn nothing from it.

I’m wondering if we should keep a running count of how many times you have to say variations of “you misunderstand”/“you’re equating things that have nothing to do with each other”/“that is not a consequence of …”/“that is irrelevant”/“that is incoherent”/“that’s not what the source is talking about”/etc in a single response to them.

2 Likes

Lest anyone miss these:

Hmmm.

Here’s a bunch of assertions without any substance:

So if we measure particle momenta, or even if we don’t track particle positions, it renders advanced life impossible?

There are billions of billions of particles of whose position I’m uncertain. I must not exist.

Note that @Meerkat_SK5 wrote “greater or smaller than half a mile” so locating those particles wouldn’t help - I’d still not exist.

2 Likes

I have done this already but I will go about it differently.

My overall point is that reproduction isn’t necessarily a quality that distinguishes machines from biological systems.

For instance, I have shown before that there is a strong analogy between machines produced by human designers and biomolecular machines. However, this analogy supposedly breaks down because biological systems replicate, whereas devices and artefacts made by human beings don’t.

This fundamental difference is supposedly so significant that it invalidates the conclusion that design can produce phylogenetic patterns as well. Nevertheless, this objection does not take into account manmade machines that do, indeed, replicate.

Although Von Neumann’s universal constructor is, for now, only a conceptual apparatus, researchers are actively trying to design and build self-replicating machines, as I showed you.

This means that the reproduction objection to the Watchmaker argument cannot be the basis for viewing biomolecular machines as fundamentally different to machines created by human designers. Instead, self-replication is just one more machine-like attribute of biochemical systems, as the researchers from Rockefeller University have shown.

Well, you are assuming that only common descent can produce those phylogenetic patterns. But, as I just explained, human designs can potentially do the same.

Now, you have decided to nitpick what I have said. It does not matter whether the quote itself was accurate or not because Sober made it very clear in his article that one phenomenon does not automatically entail the other. They are mutually exclusive concepts. Of course, there are strong correlations between the two, but correlation does not prove causation. More importantly, you can easily replace common descent with common design as a working assumption without recourse, as I just mentioned.

Because the common design model views convergence as resulting from a universal common designer who employs a single, optimal solution to address a common set of problems faced by organisms possessing different characteristics and living in different habitats.

Sorry, I meant to say that the example supports his concepts of analogy and homology, which the quote was referring to.

Sure, I can omit that adjective from our discourse and the model. Point well taken.

If trilobites, allosaurs, rauisuchians, lithornithids, etc. are found to be basic types, then those examples would conflict with the theory. If they are created kinds or species, then it would not conflict with it.

What is incoherent about this?

Yes, I agree. The genome is the blueprint.

The genetic code is the abstract information/instructions plastered on the blueprint/genome.

I know. That is what I am trying to argue instead. I did not mean to suggest the study claimed that every animal group came from slime molds.

I was just trying to help you understand what I mean by separate creation because it seems like we were talking past each other on this point.

Now, do you see how Owen’s theory involves separate creation, which you refer to as a weird form of common descent?

Oh shoot, John, I think we have been talking past each other on this point as well.

While the fossil record does reveal the small-scale variation of the same species, it does not indicate transmutation. Instead, we see stasis, where an animal at the family level remains within the same family. But these animals and organisms have never been observed to change or transmute into new distinct life forms.

To be clear, I am arguing that orders and families within animal clades emerged from slime molds rather than transmuted from animals. The fossil record does NOT reveal transmutation of species. There is no proof:

Transmutation of species and transformism are unproven 18th and 19th-century evolutionary ideas about the change of one species into another that preceded Charles Darwin’s theory of natural selection.

If that’s your point, you haven’t been making it, and I don’t think anyone would disagree with you. Still, there seems to be no connection to creationism.

That’s not an instance, and it isn’t true either.

It doesn’t, actually. As usual, your argument consists entirely of unconnected statements. And this does nothing to show why Von Neumann machines have anything to do with common design/archetypes.

But your explanation only makes sense if these human designs — only the initial ancestor of which is actually designed — are related by common descent. That’s nested hierarchy produced by phylogeny. Your argument defeats itself.

Saying that you’re completely wrong and have quoted irrelevancies isn’t a nitpick. It’s a wholesale rejection of your incoherent argument. And you just add more incoherence to it, as when you say:

Priceless.

Ah, but the solutions are seldom identical, and similarities are often explicable by similar physical requirements, as for example streamlining in nektonic species. Again we have to ask why cephalopod and vertebrate eyes are so similar in outline but so different in detail.

Try saying what you mean the first time. But this is no improvement.

That wasn’t the point. The point was that you are incoherent.

Everything. But start with the fact that you have private definitions for everything, and you use them inconsistently at that. Basic types and created kinds are generally considered synonymous, for example. But how would you characterize trilobites, allosaurs, rauisuchians, and lithornithids? Are they extinct?

Then you should stop claiming otherwise.

That too is incoherent. And no, the genetic code is in fact the mapping between codons and amino acids. You have no clue.

No. Please stop using your personal and obscure definitions for everything. Impenetrability, that’s what I say.

More gibberish. There is no help for you. Transitions between families are in fact plentiful in the fossil record. It’s the transitions between species (and here I use the term in its standard meaning, not whatever yours may be) that are rare. Did you ever actually read anything by Gould?

Where is that quote from? What does it mean by “species”?

4 Likes

Let’s discuss a limited topic: your definitions of these terms:

How do these three things differ? Are you saying that species are the extant representatives of basic types, while created kinds are extinct ones? But then created kinds produce species, and so, apparently, do basic types. Very confusing.

Assertion without evidence.

That has nothing to do with HGT.

Again, we can watch HGT occurring in real time. It is a natural process without any common designer in sight.

But why is HGT common design but VGT is not? We can observe both occurring naturally.

Assertion without evidence. RNA viruses require living organisms in order to reproduce. They lack the genes necessary to replicate their own genomes.

That’s ridiculous. The amount of junk DNA differs greatly between species. There is absolutely no reason why the amount of junk DNA in bacteria can be used to determine the amount of junk DNA in complex eukaryotes.

Then how do you explain the nested hierarchy connecting the orders and families to basal groups that aren’t slime molds?

It is a simulation of vertical inheritance and natural selection.

When humans design organisms we regularly violate a nested hierarchy. So why don’t we see these violations in extant or extinct species?

Do you even know what a nested hierarchy is, because nothing you have written even addresses them.

You didn’t answer my questions.

How many ERVs does p53 interact with, and how many of those interactions are functional?

p53 interacts with many human sequences that aren’t ERV sequences, so why would a sequence need to mimic ERV sequence in order to have function?

3 Likes

You need to have faith to see the fairies carry out HGT.

2 Likes

Apparently. Just ignore all of those natural processes that are swapping DNA between organisms.

[image: bacteria connected by sex pili]

No sex pili there. Nope.

2 Likes

Alright, I will show you why there is a strong analogy.

DNA contains two types of information: digital, represented by genetic information, and analog, represented by the genome; both are present in DNA and have many properties identical to those in man-made computers and linguistic texts.

For example, in modern computers, logic gates convert input signals into an output. Typical gates include AND, OR, and YES. By using the output of one logic gate as the input to another, computer experts can develop sophisticated networks of gates to compute answers to complex questions.

Researchers noticed that the beginning of DNA replication also operated as a logic gate at the molecular level. They were able to modify the biochemical systems involved in DNA replication to implement AND, OR, and YES logic gates that could respond to chemical inputs provided by scientists in a laboratory setting. [30] Another example is the genome of the bacterium E. coli, which functions like the Linux computer operating system (OS), as revealed by other researchers. [31]
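
As a purely illustrative aside (ordinary Python, and nothing to do with the actual biochemical implementation in [30]), “using the output of one logic gate as the input to another” just means wiring small functions together:

```python
# Purely illustrative sketch of composing logic gates in a conventional computer;
# this is NOT the DNA-based implementation described in [30].

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def YES(a: bool) -> bool:   # identity ("buffer") gate: the output follows the input
    return a

# Feeding one gate's output into another builds a small network that answers
# a compound question: is (a AND b) OR c true?
def network(a: bool, b: bool, c: bool) -> bool:
    return OR(AND(a, b), YES(c))

print(network(True, False, True))    # True
print(network(True, False, False))   # False
```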

Moreover, the information content of proteins bears almost the same mathematical structure as human language. This involves discrete mathematics or statements of logic and language that humans use to communicate with each other every day, such as phrases, signs, and symbols that are meaningful and personal. [32, 33]

Of course, some scientists insist that the comparisons between DNA information and human information are merely used in a metaphorical sense. [34] This is contradicted by biotechnologists, who have shown that the protein machines that operate on DNA during processes, such as transcription, replication, and repair, operate very similarly to a computer system. In fact, the similarity is so striking that this insight has brought forth a new area of nanotechnology called DNA computing. [35]

[30] DNA as a logic operator | Nature

[31] Comparing genomes to computer operating systems in terms of the topology and evolution of their regulatory control networks | PNAS

[32] Two genetic codes: Repetitive syntax for active non-coding RNAs; non-repetitive syntax for the DNA archives - PMC (nih.gov)

[33] Grammar of protein domain architectures | PNAS

[34] Why Machine-Information Metaphors are Bad for Science and Science Education | SpringerLink

[35] Leonard M. Adleman, 1998. Computing with DNA. Scientific American, pp. 54–61.

I have already explained how. I will try to be more clear this time.

A complete self-replicating automaton must consist of four components: a universal constructor (UC), an (instructional) blueprint, a controller, and a supervisory unit. These functional components are required to produce successive generations of artificial life.

DNA and HGT or the ribosome would be the biological version of Von Neumann’s universal constructor (UC) and blueprint:

"To a rough approximation, all known life contains these three components, which is particularly remarkable, given that von Neumann formulated his ideas before the discoveries of modern molecular biology, including the structure of DNA and the ribosome. From the insights provided by molecular biology over the past 50 years, we can now identify that all known life functions in a manner akin to von Neumann automata, where DNA provides an (partial) algorithm, ribosomes act as the core of the UC and DNA polymerases (along with a suite of other molecular machinery) play the role of a supervisory unit [60,61].7

In spite of the striking similarities between an UC and modern life, there are some important differences. DNA does not contain a blueprint for building the entire cell, but instead contains only small parts of a much larger biological algorithm, which may be roughly described as the distributed ‘top-down’ control of an organism.

The UC from Von Neumann’s theory needs a programmer for both the UC and the supervisory unit:

The UC forms the foundation of von Neumann’s theory on self-replicating automata. However, an UC is a mindless robot, and must be told very specifically exactly what to do in order to build the correct object(s). It must therefore be programmed to construct specific things, and if it is to replicate then it must also be provided with a blueprint of itself.
…This dual hardware/software role mirrors precisely that played by DNA, where genes act both passively as physical structures to be copied, and are actively read-out as a source of algorithmic instructions. To implement this dualistic role, von Neumann appended a ‘supervisory unit’ to his automata whose task is to supervise which of these two roles the blueprint must play at a given time, thereby ensuring that the blueprint is treated both as an algorithm to be read-out and as a structure to be copied, depending on the context. The algorithmic origins of life - PMC (nih.gov)

Von Neumann’s universal constructor theory:

  1. Blueprint
  2. Universal constructor
  3. Programmer or controller
  4. Supervisory unit

Universal common designer theory

  1. DNA blueprint
  2. HGT or ribosome
  3. Common designer
  4. DNA replication
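
For whatever it is worth as an illustration, here is a minimal, purely schematic sketch (Python, with hypothetical part names invented for the example) of the four-part von Neumann scheme the two lists above refer to. It only illustrates the blueprint’s dual read-out/copy role; it is not an endorsement of the biological mapping in the second list:

```python
# Toy sketch of von Neumann's four-part self-replicator scheme (illustration only;
# the part names are hypothetical and carry no biological meaning).

from dataclasses import dataclass, field
from typing import List

Blueprint = List[str]   # 1. blueprint: a passive description of the machine's parts


@dataclass
class Automaton:
    blueprint: Blueprint
    parts: List[str] = field(default_factory=list)

    def construct(self, description: Blueprint) -> "Automaton":
        # 2. universal constructor: builds whatever the description says to build
        return Automaton(blueprint=[], parts=list(description))

    def replicate(self) -> "Automaton":
        # 4. supervisory unit: switches the blueprint between its two roles,
        #    first read out as instructions, then copied as data.
        offspring = self.construct(self.blueprint)   # blueprint as instructions
        offspring.blueprint = list(self.blueprint)   # blueprint as copied data
        return offspring


# 3. the programmer/controller sits outside the automaton: someone has to write
#    the initial blueprint before the first round of replication can occur.
parent = Automaton(blueprint=["arm", "welder", "tape reader"],
                   parts=["arm", "welder", "tape reader"])
child = parent.replicate()
print(child.parts == parent.parts and child.blueprint == parent.blueprint)  # True
```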

This is a fundamental error that you can’t stop making here. There is no evidence of common descent in living or fossil organisms. There is nothing to infer from nature that would compel anybody to make this conclusion.

The simulation of common descent is just in your mind and you guys are imposing it onto the data just like Darwin did in his day. All he did was modify Owen’s theory from common archetype to common ancestor and then assumed (like you guys do) that living organisms evolved from these ancestors. Even two of the architects of the modern synthesis recognized this:

One would expect a priori that such a complete change of the philosophical basis of classification would result in a radical change of classification, but this was by no means the case. There was hardly any change even in method before and after Darwin, except that the “archetype” was replaced by the common ancestor- Ernst Mayr

From their classifications alone, it is practically impossible to tell whether zoologists of the middle decades of the nineteenth century were evolutionists or not. The common ancestor was at first, and in most cases, just as hypothetical as the archetype, and the methods of inference were much the same for both, so that classification continued to develop with no immediate evidence of the revolution in principles. …the hierarchy looked the same as before even if it meant something totally different.- Gaylord Simpson

Common design was first. They stole the evidence and used it to support their claims.

If you don’t disagree, then why are you claiming common descent is the only process that could produce nested hierarchies?

Also, I am not presenting a YEC model of evolution or creation. So why would you expect there to be a connection?

Although all living things have the same basic structure, everyone has a different sequence of nitrogenous bases in their DNA, which makes them unique from everyone else. The sequence of base pairs is our “genetic code”.

No, it is more than just that:

“the key distinction between the origin of life and other ‘emergent’ transitions is the onset of distributed information control, enabling context-dependent causation, where an abstract and non-physical systemic entity (algorithmic information) effectively becomes a causal agent capable of manipulating its material substrate
The algorithmic origins of life - PMC (nih.gov)

“In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation.[1] Algorithms are used as specifications for performing calculations and data processing.”

Algorithm - Wikipedia
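
For concreteness, the textbook instance of that definition is Euclid’s algorithm for greatest common divisors; a minimal sketch, added here only to illustrate what “a finite sequence of rigorous instructions” means:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite sequence of rigorous instructions that
    solves a whole class of problems (finding greatest common divisors)."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))   # 21
```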

Are you saying that transmutation of families and orders exists in the fossil record? If so, what source do you have to substantiate this claim?

I gave examples already but I will give you another.

Both extant and extinct horses would represent kinds that reproduce after their kind.

Both extant and extinct donkeys and zebras would be the species from the horse kind that changed over time (i.e. divergent evolution).

The Equidae would be the basic type, which comprises all three, including past and present generations.

Rhinos and tapirs would each be created kinds. However, they can also be considered basic types because it is unlikely that they have populations that diverged and changed over time.

Either a created kind or a species but NOT a basic type, which encompasses both.

Quantum physics experiments have revealed that there is no concrete physical reality made of classical space-time constituents. Instead, the so-called material realm actually exists in a super-positional state of all quantum possibilities (i.e., wave functions) that are mathematical in nature.

The intangible phenomenon of conscious observership is the only means capable of “collapsing” any given combination of quantum wave functions, which imparts a concrete and physical reality to them: “No naive realistic picture is compatible with our results because whether a quantum could be seen as showing particle-like or wave-like behavior would depend on a causally disconnected choice.” [21]
https://www.pnas.org/doi/10.1073/pnas.1213201110

This shows how the right fine-tuning values were carefully chosen to allow advanced life to exist from the beginning leading up to the present.

It actually has everything to do with it because HGT or the genetic code would represent the particle-like or wave-like behavior in quantum systems that depend on a causally disconnected choice to function and exist.

Because VGT is primarily a materialistic process that is primarily relying on the rules of classical physics and analogue information. In contrast, HGT relies on the rules of quantum physics and digital information.

No, I provided evidence for my claim already. You just did not read the source:

Are viruses our oldest ancestors? | EMBO reports (embopress.org)

Of course it can, because complex eukaryotes represent millions of bacteria. Or am I missing something here?

My overall point is that reproduction isn’t necessarily a quality that distinguishes machines from biological systems.

For instance, I have shown before that there is a strong analogy between machines produced by human designers and biomolecular machines. However, this analogy supposedly breaks down because biological systems replicate, whereas devices and artefacts made by human beings don’t.

This fundamental difference is supposedly so significant that it invalidates the conclusion that design can produce phylogenetic patterns as well. Nevertheless, this objection does not take into account manmade machines that do, indeed, replicate.

Although Von Neumann’s universal constructor is, for now, only a conceptual apparatus, researchers are actively trying to design and build self-replicating machines, as I showed you.

This means that the reproduction objection to the Watchmaker argument cannot be the basis for viewing biomolecular machines as fundamentally different to machines created by human designers. Instead, self-replication is just one more machine-like attribute of biochemical systems, as the researchers from Rockefeller University have shown.

No, I did. My answer is we don’t know yet. Further research should reveal that these sequence elements function as part of the innate immune system, helping to ward off retroviral infections through a variety of mechanisms that life scientists are just beginning to understand.

Only by appearance though. Let me elaborate.

A complete self-replicating automaton must consist of four components: a universal constructor (UC), an (instructional) blueprint, a controller, and a supervisory unit. These functional components are required to produce successive generations of artificial life.

DNA and HGT or the ribosome would be the biological version of Von Neumann’s universal constructor (UC) and blueprint:

"To a rough approximation, all known life contains these three components, which is particularly remarkable, given that von Neumann formulated his ideas before the discoveries of modern molecular biology, including the structure of DNA and the ribosome. From the insights provided by molecular biology over the past 50 years, we can now identify that all known life functions in a manner akin to von Neumann automata, where DNA provides an (partial) algorithm, ribosomes act as the core of the UC and DNA polymerases (along with a suite of other molecular machinery) play the role of a supervisory unit [60,61].7

In spite of the striking similarities between an UC and modern life, there are some important differences. DNA does not contain a blueprint for building the entire cell, but instead contains only small parts of a much larger biological algorithm, which may be roughly described as the distributed ‘top-down’ control of an organism.

The UC from Von Neumann’s theory needs a programmer for both the UC and the supervisory unit:

The UC forms the foundation of von Neumann’s theory on self-replicating automata. However, an UC is a mindless robot, and must be told very specifically exactly what to do in order to build the correct object(s). It must therefore be programmed to construct specific things, and if it is to replicate then it must also be provided with a blueprint of itself.
…This dual hardware/software role mirrors precisely that played by DNA, where genes act both passively as physical structures to be copied, and are actively read-out as a source of algorithmic instructions. To implement this dualistic role, von Neumann appended a ‘supervisory unit’ to his automata whose task is to supervise which of these two roles the blueprint must play at a given time, thereby ensuring that the blueprint is treated both as an algorithm to be read-out and as a structure to be copied, depending on the context.

Von Neumann’s universal constructor theory:

  1. Blueprint
  2. Universal constructor
  3. Programmer or controller
  4. Supervisory unit

Universal common designer theory

  1. DNA blueprint
  2. HGT or ribosome
  3. Common designer
  4. DNA replication

Well, you are assuming that only common descent can produce those phylogenetic patterns. But, as I just explained, human designs can potentially do the same.
Even two of the architects of the modern synthesis recognized this:

One would expect a priori that such a complete change of the philosophical basis of classification would result in a radical change of classification, but this was by no means the case. There was hardly any change even in method before and after Darwin, except that the “archetype” was replaced by the common ancestor- Ernst Mayr

From their classifications alone, it is practically impossible to tell whether zoologists of the middle decades of the nineteenth century were evolutionists or not. The common ancestor was at first, and in most cases, just as hypothetical as the archetype, and the methods of inference were much the same for both, so that classification continued to develop with no immediate evidence of the revolution in principles. …the hierarchy looked the same as before even if it meant something totally different.- Gaylord Simpson

Common design was first. They stole the evidence and used it to support their claims.

“Look at the hippos, they’re wriggling their ears; somebody shoot me for I’m bored to tears”

5 Likes

That’s just word salad. How is digital information “represented” by genetic information? How is analog information “represented” by the genome?

The reference is paywalled, but I don’t think that claim is true, and it probably isn’t in the paywalled reference.

Not quite: “We show that both networks have a fundamentally hierarchical layout, but there is a key difference”. Even the abstract refutes you.

Sort of, in some respects. None of that, even if true, is relevant to your claim about this supposed strong analogy. The genome is not a biomolecular machine, as the term is commonly understood. Machines do work.

If you’re trying to be more clear, why are you just repeating previous statements, word for word?

Nobody says that, and your quote certainly doesn’t say that. Your analogy is unsupported by anything except your own blatant assertion.

Of course there is plenty of evidence for such things, but all that is irrelevant to the point under discussion, whether von Neumann machines (hypothetically, as no actual machines exist) display common descent.

Bet you never read the actual sources those quotes came from. This is suggested by your attribution of the second to somebody named “Gaylord Simpson”. Neither of them supports your claim. Now it’s true that one can discern a nested hierarchy without supposing common descent. Common descent provides the explanation for the existence of that hierarchy. You have the whole thing backward. Prior to the theory of common descent, there was no viable explanation for nested hierarchy. Contrary to your unsupported claim, nested hierarchy is not an expectation of the archetype theory.

Because what I’m not disagreeing with is irrelevant to the question. Reproduction, whether biological or machine, is common descent.

Nobody accused you of YEC. You’re a sort of OEC. Still creationism.

No it isn’t. That’s not what the term means. And your claim is again irrelevant to the point. Reasoned discussion is clearly not one of your skills.

And once again your quotes are irrelevant to the question at hand. Please try to understand that you are grossly incompetent. This sort of thing — coherent argument — is just beyond your cababilities.

Well, one well-attested transition is between more primitive synapsids and mammals, as viewed through the evolution of the middle ear bones, for which the fossil record is particularly good. Here is one review of the topic:

Incoherent. You can’t expect anyone to respond to that.

What’s the difference between a created kind and a species? How would you tell which they are? Your incoherence is fundamental.

2 Likes

That is not a widely accepted conclusion in physics.

Why wouldn’t vertical inheritance also qualify?

No, it doesn’t. HGT is just as much a materialistic process as VGT. For example, here are two bacteria participating in HGT through a sex pilus:

[image: two bacteria joined by a sex pilus]

How is that not materialistic?

Did you read it?

Machines that evolved from a common ancestor through vertical inheritance will produce a nested hierarchy. Your alleged archetypes fall into a nested hierarchy. This is evidence that your archetypes evolved from a common ancestor.

p53 interacts with non-retroviral sequence, and it does so in a way that is functional. So why would you need retroviral sequence?

Whether or not the last universal common ancestor was created in such a way does not affect the conclusion that all life since that last universal common ancestor has evolved and shares a common ancestor.

Human designs can also produce massive and numerous violations of a nested hierarchy. Common descent with vertical inheritance cannot produce anything other than a nested hierarchy. There is no reason why we would expect a nested hierarchy if common design is true because there are billions and billions of other patterns that common design could produce.
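
A minimal sketch (hypothetical and purely illustrative) of the point about vertical inheritance: when each branch of a copying process adds a heritable trait that all of its descendants inherit, the sets of descendants carrying any two traits are always nested or disjoint, which is the defining property of a nested hierarchy.

```python
# Illustrative only: copying-with-modification produces a nested hierarchy.
# Each branch adds one new heritable trait, inherited by all of its descendants.

def descend(traits, depth, label="root"):
    if depth == 0:
        return {label: traits}              # a "tip" and the traits it carries
    tips = {}
    for i in range(2):                      # two daughter lineages per split
        new_trait = f"{label}.{i}"          # a modification unique to this branch
        tips.update(descend(traits | {new_trait}, depth - 1, new_trait))
    return tips

tips = descend({"root"}, depth=3)           # 8 tips after three rounds of splitting

# For every trait, collect the set of tips that carry it.
carriers = {}
for tip, traits in tips.items():
    for trait in traits:
        carriers.setdefault(trait, set()).add(tip)

# Nested-hierarchy property: any two carrier sets are nested or disjoint.
for s1 in carriers.values():
    for s2 in carriers.values():
        assert s1 <= s2 or s2 <= s1 or s1.isdisjoint(s2)
print(len(tips), "tips;", len(carriers), "traits; all carrier sets nest cleanly")
```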

4 Likes

Correct. “Digital” has no special meaning for information. Sometimes “bits” are the convenient unit, which indicates a logarithm base-2 is used.

The only meaningful coding for genetic information is the laws of chemistry. Even then, “coding” is only needed as a description for human understanding.

5 Likes

Why do you say that wave-function collapse occurs specifically because of “conscious observership”? What even is “conscious observership”, and what physical experiment could we hypothetically perform in the hopes of testing whether or not it (whatever it is) has an impact on the persistence or collapse of wave-functions?

1 Like

Are you suggesting that @Meerkat_SK5 might be lying about his sources?

It may be purely coincidence that ‘ET’ posted exactly the same two quotes at uncommon descent (twice), in the same order, with the same ellipsis, the same punctuation, the same lack of citation and the same missing forename; and that the same two quotes can be found juxtaposed (and with the same ellipsis) in Denton’s abysmal Evolution: A Theory in Crisis.

Such similarities could merely be a result of comparable thought processes, carelessness and ignorance. To justify such an accusation on such ‘flimsy’ evidence would require a long history of previous instances of @Meerkat_SK5 lying about his sources, and that would be extremely easy to find.

No, since he never says what the sources are. Lying by omission, perhaps.

More to the point, you appear to have uncovered a phylogeny of quote mining, though a very simple one, based on shared derived characteristics. The same method is used to produce phylogenetic trees of illuminated manuscripts. Once again, nested hierarchy is due to common descent.

Howe C.J., Barbrook A.C., Spencer M., Robinson P., Bordalejo B., Mooney L.R. Manuscript evolution. Trends in Genetics 2001; 17:147-152.

Digital information is composed of abstract entities involving discrete mathematics or statements of logic that apply to and must exist in all possible worlds. It involves language that humans use to communicate with each other every day, such as phrases, signs, and symbols that are meaningful and personal. Analog information refers to continuous or redundant but orderly complex patterns of information reflected within the laws of nature.

Furthermore, the nucleotide sequence both specifies the digital information of the gene and the higher order architectures of the genome, which have an impact on the expression of the digital information found in the gene.

How does this difference make the analogy breakdown?

Because you are not responding to it adequately or at all. Instead, you are just making a bunch of claims without any substance behind them, such as this…

What are you referring to? Nobody says what? Again, the quote specifically said,

…we can now identify that all known life functions in a manner akin to von Neumann automata, where DNA provides an (partial) algorithm, ribosomes act as the core of the UC and DNA polymerases (along with a suite of other molecular machinery) play the role of a supervisory unit [60,61].

Von Neumann’s universal constructor theory:

  1. Blueprint
  2. Universal constructor
  3. Programmer or controller
  4. Supervisory unit

Universal common designer theory

  1. DNA blueprint
  2. Ribosome
  3. Common designer or consciousness
  4. DNA replication

How come the quote does not support my analogy? Explain.

Von Neumann machines give the appearance of common descent because the motives and mechanisms being used naturally produce that effect. But, in reality, it is common design.

Just like biology is considered the study of complicated things that have the appearance of having been designed with a purpose. But, in reality, it is just common descent according to secular scientists.

I want you to tell me why you think Von Neumann’s machines are VGT in reality rather than just in appearance. Here is a list of differences between VGT and HGT, which should help you in your determination:

Common descent (VGT)

  1. Classical physics
  2. Analogue information
  3. Bottom-up processes
  4. Matter and energy are fundamental
  5. Point mutations and gene duplication
  6. Undirected process
  7. Appearance of design

Common Design (HGT)

  1. Quantum physics
  2. Digital information
  3. Top-down processes
  4. Information is fundamental
  5. Large-scale mutations
  6. Directed process
  7. Appearance of descent

That can’t be true. God was the explanation for those patterns before Darwin:

Now, since the days of Linnæus this principle has been carefully followed, and it is by its aid that the tree-like system of classification has been established. No one, even long before Darwin’s days, ever dreamed of doubting that this system is in reality, what it always has been in name, a natural system. What, then, is the inference we are to draw from it?

An evolutionist answers, that it is just such a system as his theory of descent would lead him to expect as a natural system. For this tree-like system is as clear an expression as anything could be of the fact that all species are bound together by the ties of genetic relationship. If all species were separately created, it is almost incredible that we should everywhere observe this progressive shading off of characters common to larger groups, into more and more specialized characters distinctive only of smaller and smaller groups. At any rate, to say the least, the law of parsimony forbids us to ascribe such effects to a supernatural cause, acting in so whimsical a manner, when the effects are precisely what we should expect to follow from the action of a highly probable natural cause.

… Now what should we think of a philologist who should maintain that English, French, Spanish, and Italian were all specially created languages—or languages separately constructed by the Deity, and by as many separate acts of inspiration communicated to these several nations—and that their resemblance to the fossil form, Latin, is to be attributed to special design? Yet the evidence of the natural transmutation of species, is, in one respect, much stronger than that of the natural transmutation of languages—in respect, namely, of there being a vastly greater number of cases all bearing testimony to the fact of genetic relationship.
–George Romanes, “Scientific Evidences of Organic Evolution”, 1882
The Project Gutenberg eBook of The Scientific Evidences of Organic Evolution, by George J. Romanes, M.A., LL.D., F.R.S.

Like I said, God was the explanation for those nested hierarchies. However, it was not a viable scientific explanation until Owen’s theory came along, which invoked separate creation from archetypical laws of nature:

Owen was enamored with the new order of nature that he had “proven,” and, extended from its scientific foundations, it became a source of aesthetic as well as scientific value for him. He felt that the discovery of archetypal relationships and the contemplation of such patterns as they continually reappear both within an organism and between species were great sources of joy for the civilized man.

He expounded upon the “satisfaction felt by the rightly constituted mind” when it discovers the “harmonious concord with a common type” [4, p. 38], and he exclaimed “with what new interest must the human anatomist view the little ossicles of the carpus and tarsus when their homologies have been determined!” [4, p. 38]. Indeed, there is a grand beauty in the order of nature which opens at the touch of scientific contemplation: “A perfect and beautiful parallelism reigns in the order in which the toes successively disappear in the hindfoot with that of the forefoot . . .” [4, p. 33]. “Consider the beautiful and numerous evidences of unity of plan which the structures of the locomotor members have disclosed . . .” [4, p. 39].

…Upon this structure, Owen was able to superimpose his theory of archetypes and other modifications which had been formulated to explain the lacunae in the chain’s continuity [7].
…Owen explained that each section of the chain had its own archetype and does not have to be temporally complete.

https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.923.6553&rep=rep1&type=pdf

Owen’s theory explained the gaps in the fossil record and nested hierarchies.

It sounds like you are making a semantics argument for common descent that does not make any sense.

Do you consider Sudan cars to be the modified form of vehicles with four wheels that descended from a common vehicle ancestor?

I love how you misspelled “capabilities” as you were telling me how incompetent I am. :joy:

Yes, I did notice the lack of critiques from creationist literature regarding this transition, especially compared to other so-called transitions (i.e. reptiles to birds). So I can grant this as a possible exception. However, because it is just one example, we have to question that one exception.

Created kinds: A recognizable base form and structure that does not change over time (i.e. horses)

Species: A similar base form and similar surface features that do change over time (i.e. donkeys and Zebras)

When you combine the 34 with the 47 respondents in this much larger poll, it shows that this is what the majority of physicists believe is the case:

Question 5: In your opinion the observer

  a. Is a complex (quantum) system: 57
  b. Should play no fundamental role whatsoever: 15
  c. Plays a fundamental role in the application of the formalism but plays no distinguished physical role: 47
  d. Plays a distinguished physical role (e.g., wave-function collapse by consciousness): 34

b2237_Ch-14.indd (arizona.edu)

On the other hand, the evidence supporting quantum mind theory goes a step further in establishing that the consciousness of the observer also has the distinguished role of collapsing the wave function because consciousness under Orch-OR is quantum mechanical in nature. This is what they mean by having a distinguished physical role from the measurement apparatus:

“…The violation of the classical weight structure is similar to the violation of the well-known Bell inequalities studied in quantum mechanics, and hence suggests that the quantum formalism and hence the modeling by quantum membership weights, as for example in [18], can accomplish what classical membership weights cannot do.”

Experimental Evidence for Quantum Structure in Cognition | SpringerLink

No, I did not quite say that. Instead, I said VGT is primarily a materialistic process that is primarily relying on the rules of classical physics and analogue information.

So I grant that they are both materialistic processes, but one relies on classical physics and bottom-up processes, which include matter and energy. The other uses quantum physics and top-down processes. Let me try to elaborate on the differences…

The bottom-up approach gathers data from the environment to form a perception. In practice, it is the piecing together of very simple systems to give rise to a more complex system, thus making the original systems subsystems of the emergent system. Under VGT, information either emerges from matter or is a useful fiction used to describe matter.

In the top-down approach, one starts with the big picture or a complete knowledge of the system and breaks it down into smaller segments. Then, the smaller segments are reassembled from scratch to represent the original big picture. Under HGT, information comes first, and then changes in material things are consciously pursued in accordance with that information.

Let me summarize the differences between the two…

VGT

  1. Classical physics
  2. Analogue information
  3. Bottom-up processes
  4. Matter and energy are fundamental
  5. Point mutations and gene duplication
  6. Undirected process
  7. Appearance of design

HGT

  1. Quantum physics
  2. Digital information
  3. Top-down processes
  4. Information is fundamental
  5. Large-scale mutations
  6. Directed process
  7. Appearance of descent

Yes, and you apparently did not read the whole thing…

Chemical reactions in the primordial soup created increasingly complex RNA molecules. This eventually gave rise to ribozymes, catalytically active molecules that have been demonstrated to replicate and evolve in a test tube [2]. Ribozymes are still with us today as viroids in plants: hairpin loop-structured catalytic RNAs that do not code for proteins and lack a protein coat.

Viroids are reportedly free-living organisms.

It sounds like you are making a semantics argument for common descent that does not make any sense.

Do you consider Sudan cars to be the modified form of vehicles with four wheels that descended from a common vehicle ancestor?

Like I said, God was the explanation for those nested hierarchies. However, it was not a viable scientific explanation until Owen’s theory came along, which invoked separate creation from archetypical laws of nature:

Owen was enamored with the new order of nature that he had “proven,” and, extended from its scientific foundations, it became a source of aesthetic as well as scientific value for him. He felt that the discovery of archetypal relationships and the contemplation of such patterns as they continually reappear both within an organism and between species were great sources of joy for the civilized man.

He expounded upon the “satisfaction felt by the rightly constituted mind” when it discovers the “harmonious concord with a common type” [4, p. 38], and he exclaimed “with what new interest must the human anatomist view the little ossicles of the carpus and tarsus when their homologies have been determined!” [4, p. 38]. Indeed, there is a grand beauty in the order of nature which opens at the touch of scientific contemplation: “A perfect and beautiful parallelism reigns in the order in which the toes successively disappear in the hindfoot with that of the forefoot . . .” [4, p. 33]. “Consider the beautiful and numerous evidences of unity of plan which the structures of the locomotor members have disclosed . . .” [4, p. 39].

…Upon this structure, Owen was able to superimpose his theory of archetypes and other modifications which had been formulated to explain the lacunae in the chain’s continuity [7].
…Owen explained that each section of the chain had its own archetype and does not have to be temporally complete.

https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.923.6553&rep=rep1&type=pdf

Owen’s theory explained the gaps in the fossil record and nested hierarchies.

I agree, but God is both human and divine. So we would expect God to be consistent with his nature by necessity.

Fuz Rana will explain it better than me:

Because of their capacity to damage the genome, ERVs typically are framed by biologists as the bad guys in our tale, threatening the integrity of the genome. But is that really the case?

While pursuing an anticancer therapy, a research team from Sweden discovered something quite unexpected when it comes to p53’s role in protecting the genome: it actually teams up with ERVs to activate the immune system, causing the immune system to attack villainous tumor cells that are disguised as ordinary cells.1 The researchers exposed three different types of cancer cells (melanoma, osteosarcoma, and breast cancer) in vitro to compounds that inhibit the proteins MDMX and MDM2. Normally, these proteins inhibit p53. The team reasoned that if these two proteins were inhibited, p53 could be activated to trigger an anticancer response. To the researchers’ surprise, they discovered that activation of p53 led to the expression of ERV sequences. In turn, the accumulation of ERV RNA in the cancer cells triggered the interferon response in these cells. This response regulates immune responses to pathogens and tumor cells.

Triggering the interferon pathway made it appear as if the cancer cells were infected with a virus even though they weren’t—a mechanism dubbed viral mimicry. Presumably, the process of viral mimicry would flag the otherwise “invisible” tumor cells for destruction by the immune system. The researchers discovered that biopsies taken from patients treated with the MDM2 and MDMX inhibitors showed evidence that cytotoxic CD8+ cells had, indeed, infiltrated the tumor.

The researchers propose a model to explain how p53 can paradoxically both repress and enhance the expression of ERV sequences. They argue that when cells are unstressed, p53 binds to ERV sequences in the genome and, working in conjunction with LSD1 and DNMT1, represses these sequences. In stressed cells, the levels of LSD1 and DNMT1 fall, leading to the expression of ERV sequences by p53 binding.

As I told you before, Dan, the laws of quantum physics are more fundamental than chemistry. More importantly, an experiment has revealed that this quantum search algorithm is itself a fundamental property of nature. This shows that there is a quantum basis for genetic information, which means it is not merely a description of nature but a prescription of it.

[1908.11213] The Grover search as a naturally occurring phenomenon (arxiv.org)
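
For readers who have not met it, the quantum search algorithm referred to here is Grover’s algorithm; a minimal classical simulation of its statevector (illustrative only, and it says nothing about whether such a process occurs in nature) looks like this:

```python
import numpy as np

# Minimal classical simulation of Grover's quantum search (illustration only;
# the qubit count and target index are arbitrary choices for the example).
n = 3                       # qubits
N = 2 ** n                  # size of the search space
target = 5                  # the "marked" item being searched for

state = np.full(N, 1 / np.sqrt(N))                    # uniform superposition
oracle = np.eye(N)
oracle[target, target] = -1                           # flip the marked amplitude's sign
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)    # inversion about the mean

for _ in range(int(np.pi / 4 * np.sqrt(N))):          # ~optimal number of iterations
    state = diffusion @ (oracle @ state)

print(np.round(state ** 2, 3))   # probability mass concentrates on index 5 (~0.95)
```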

According to Roger Penrose, the action of consciousness proceeds in a way that cannot be described by algorithmic processes. [8] For instance, conscious contemplation can ascertain the truth of a statement and freely make intellectual and moral judgments. This involves distinguishing between true and false statements or what is morally “right” versus “wrong.”

The only thing in nature that does this is a wave-function collapse. For instance, at small scales, quantum particles simultaneously exist in the superposition of multiple states or locations, described by a quantum wave function. However, these superpositions are not seen in our everyday world because efforts to measure or observe them seemingly result in their collapse to definite states. [5] Why quantum superpositions are not seen is a mystery known as the measurement problem, which seems somewhat related to consciousness. Experiments from the early 20th century indicated that conscious observation caused superposition wave functions to collapse to definite states, choosing a particular reality. Consciousness was said to collapse the wave function under this view. [5]

Moreover, Diederik Aerts [9] demonstrated how these two phenomena are identical by applying the quantum theory to model cognitive processes, such as information processing by the human brain, language, decision-making, human memory, concepts and conceptual reasoning, human judgment, and perception. Owing to its increasing empirical success, quantum cognition theory has been shown to imply that we have quantum minds.

Other empirical data have shown that the brain is a quantum computer that uses quantum mechanical processes, such as quantum tunneling and superposition, [10, 11] explicitly suggesting that we have quantum minds, as the Orch-OR theory predicted (Read section 4.5 OR and Orch-OR of “Consciousness in the universe” by Hameroff and Penrose for more details). [12]

Lastly, observations and experiments on the fine-tuning constants seem to support an aspect of quantum mind theory called the universal proto-consciousness field theory. This field theory has also been referred to, by Penrose, as objective reduction (OR) and incorporated in his Orch-OR model to explain why humans have consciousness and these fine-tuning constants.

To be clear, quantum mind theory does not advocate for dualism or an additional supernatural force/substance that would operate outside the rules of science. Instead, it advocates for consciousness as an essential ingredient of physical laws that science has not yet fully understood. For more details, please refer to the introduction of “Consciousness in the universe” by Hameroff and Penrose. [12]

Definition: universal proto-consciousness, the universal self-collapsing wave function.

[8] Penrose, R., 1989. The Emperor’s New Mind: Concerning Computers, Minds, and the Laws of Physics. Oxford University Press, Oxford.

[9] [2208.03726] Human Perception as a Phenomenon of Quantization (arxiv.org)

[10] Nuclear Spin Attenuates the Anesthetic Potency of Xenon Isotopes in Mice | Anesthesiology | American Society of Anesthesiologists (asahq.org)

[11] Live visualizations of single isolated tubulin protein self-assembly via tunneling current: effect of electromagnetic pumping during spontaneous growth of microtubule | Scientific Reports (nature.com)

[12] Consciousness in the universe: A review of the ‘Orch OR’ theory - ScienceDirect

47 respondents is a valid poll of physicists? Seriously?

Can you also point to anywhere in the poll question where it says that ONLY consciousnesses are capable of collapsing wavefunctions?

How is HGT not doing the same thing?

How is passing DNA across a sex pilus between two bacteria a top-down process using quantum physics? How is it a top-down quantum process when a bacterium takes in naked DNA from its surroundings?

So where does this happen in HGT???

We can watch a sex pilus join two bacterial cells, and then DNA is passed between them. This DNA can be replicated as an extra chromosome in the bacterium, or it can insert into another chromosome due to homologous recombination. Where do you think consciousness is involved in this process? How is this different from passing on a whole genome to a new cell, or the merging of two haploid cells?

Sedans don’t fall into a nested hierarchy.

Do you still not understand what a nested hierarchy is and why it is evidence for common ancestry?

Then how are humans able to design organisms that violate a nested hierarchy if it is a law of nature?

It doesn’t explain either.

First, how in the world do you determine where the gaps are in the fossil record? Why would you say that there is a gap between two specific species, but not two others? On top of that, how do you explain the fossils that DO fill those gaps, such as transitional hominid fossils? From where I sit, the only way that there would be gaps in the fossil record is if common ancestry is true. Only through common ancestry could we identify where future fossils should fit into the distribution of similarities and differences in species groups.

Separate creation of archetypes also does not explain the nested hierarchy because there is no reason why separately created archetypes would need to fit into a nested hierarchy.

[note: I branched this idea off into a new thread: How Does ID/Creationism Identify Gaps in the Fossil Record? ]

Why would that nature include separately created species that just so happen to produce the same pattern of homologous and divergent features that common ancestry would produce?

Why is ERV sequence necessary to trigger interferon production? What sequence in ERV specifically is required to trigger interferon, and how many ERVs have this sequence?

2 Likes

I’m going to abandon that topic, since it’s going nowhere. Just understand that everything you say is gibberish.

But that’s not an explanation. An explanation needs to provide a reason why that pattern would be expected rather than some other pattern or no pattern. “God” is not such a reason and offers no expectation of nested hierarchy.

His theory explains neither. You can’t just claim that and be done with it, and your quote from some unspecified source doesn’t explain either.

That would be “sedan cars”? No, but of course cars aren’t von Neumann machines. The latter have literal descent.

Enjoy it. But it’s still true, and the occasional typo does nothing to dispel its truth.

That was just one example out of many. The “reptile to bird” transition is as well documented. Archaeopteryx, all by itself, neatly fills the major gap. But you simply accept the evidence while closing your eyes to the conclusion.

More incoherence. In what way do donkeys and zebras change over time while horses do not? What is a “recognizable base form”? What is a “similar base form”?

1 Like

Hmm this reads a lot like what you said earlier, in post #340:

Now, I understand that you are a very busy person who may not have the time or patience to read a question posed to you and compose a response to match it, when you can instead just copy and paste a wall of text from someplace else, as indicated by the presence of a citation “[5]” that matches no actual reference, and the fact that said references start their count at eight, rather than one, as they normally would. Reference 5 seems to also have been the one that might have had some relevance with respect to my query, but, alas, it is missing. And because of your evident busy-ness preventing you from taking good care of your own copying and pasting, I do not interpret careless response dumping like this as outright rude quite yet.
Be that as it may, I for one seem to be blessed with slightly more time than that, so when I composed my question, it and its wording were chosen with some amount of care. Nevertheless, though my post was not ignored, the question contained therein was left unaddressed, and my curiosity remains unsated. Perhaps there is yet a chance for me to see it through, so I shall reiterate:

Please, propose a definition of “conscious observership” sufficient to differentiate experimentally between your claim, that it be among the things that induce wave-function collapse, being an accurate descriptor of natural fact and said claim not being an accurate descriptor of natural fact. I am making no assertion (yet) regarding the claim, nor raising a dispute over it. I am asking, merely, what is meant by it and how one would go about trying to take it seriously.

Once the terms of discussion have been clarified, we can then go on to discuss whichever research paper you find crucial to the subject matter. With lots of convincing I might even stoop down to discussing pop-sci books, too, but I shall make no strong promises of that. Prof. Aerts and Prof. Penrose are, of course, most welcome to join our discussion, but until they do, I doubt it would be productive for any of us to go on speculating on the intricacies of their respective worldviews, so I shall abstain from that and focus on the primary literature on quantum theory as understood in physics, of whosever authorship you deem appropriate, instead.

2 Likes