That doesn’t answer what you meant by “it”; you just copy-pasted the rest of your claim, which never had any relevance to the challenge you posted it as a reply to. Ironic how you would paste an answer without reading the question, and then, upon a request for clarification, re-paste it as if it were your interlocutors who don’t read the discussion they are part of.
And I never suggested that confirmation is what differentiates predictions from accommodations either. I also never asked you to show how to test for the fine-tuning constants. This is because I don’t know what “to test for fine-tuning constants” means and would never ask anything so silly. Model parameters aren’t bird races. Their existence is not in dispute. I’ll go ahead and quote myself for you, though, just so you have an opportunity to recognize your mischaracterization. I’d have thought you’d have caught the quote when you put it in your post, but evidently that is not how you compose them, so here goes:
And then go on to never actually help me fill in those blanks? Alright, too bad, I guess.
Incorrect. It would, if anything, show a world in which a possible configuration of life could have evolved without the cosmological constant having one unchanging value. The fine-tuning argument does not specify that the cosmological constant has to be one of its fine-tuning parameters in the first place. If it turned out that it isn’t constant after all, proponents of the argument willing to recognize such a fact would just strike it off their list of fine-tuning constants and go on arguing that the rest of them had to be fine-tuned for life to come about. And if some more fundamental parametrization was found from which the evolution of the cosmological constant could be accurately modeled, then the parameters of that new model would make it onto the list of fine-tuning constants and the argument would remain the same. Perhaps it is such proponents of fine-tuning arguments that you should have a chat about falsifiability with, if I may so suggest.
As mentioned earlier, this response does not actually produce a logical link between your “theory” and the “predictions” you say it entails. So you just wasted both of our time on irrelevant babble unrelated to the actual conversation. Clearly you do not read your interlocutors’ messages, and you make every effort to show as much, short of the much more efficient method of outright saying so.
No, it should not. This brief video - in which Leonard Susskind’s opinion gets done dirty, for reasons I shan’t speculate about - is not who made those statements in your message. You are. I did not ask the video what you meant by them. I asked you what you meant by them. Granted, the video makes a similar claim in an actually coherent wording. Unlike you, I should add. And I’ll be glad to ask its script’s authors what they meant by their variant whenever they come here to discuss it. For now, I’m talking to you, waiting for you to explain what you meant by the things you said.
No, it does not. Nowhere in this article is stated how one can test the statement “the cosmological constant is placed at a precise measurement of 10¹²⁰”. Nowhere in this article is stated how one can test the statement “the degree of precision becomes an astounding value of 10^(10¹²³)” either. Please, read the things you quote, so you have an opportunity to not lie about what they say.
Alright, let’s see.
So the first link you put is a redirect link starting at YouTube, for some reason. The redirect is to a pop-sci blog article which I couldn’t care less about, either. However, at the bottom, that article links to a pre-print of a 2017 paper that only ever made it into a philosophy of mind book, rather than any peer reviewed physics journal. It did not report any experimental studies, but it did at least suggest something. Neither the pop-sci article nor the primary source make a mention of superdeterminism. The pop-sci article does mention free will, but the primary source does not.
The second link you put is a link to a Wikipedia article. It begins with
The Penrose interpretation is a speculation…
and goes on to mention proposed testing methods. The latest such proposition it names was published back in 2003. You’d think if it was doable and interesting something would have come of it in the past twenty years, but the source you picked makes no mention of that. It also makes no mention of free will or superdeterminism, and neither does the 2003 publication.
Brevity is not what makes an article not a waste of time. Relevance is. And as with everything else I am responding to in this message, these have none.
Tomorrow is April 1st.
I predict that @Meerkat_SK5 has been working for a year to deliver the best April Fools Day leg pull in Peaceful Science history.
Sorry, I misunderstood what you meant when you said “Until we come by means to tweak any of them”. I mistook “tweak” for “test”.
Let me change up my response now.
Even if we tweaked them, it would not do anything to support God’s existence one way or another. God’s creating the world is not a matter of making it the case that this specific thing happens in the world rather than that specific thing. Instead, creation is a matter of making it the case that there is any world at all.
Moreover, the theory holds that the fact that there is any world at all is something that could not even in principle have obtained in the absence of God. For the common designer theory, if we’re talking about a view according to which the world might have existed apart from God, but simply happens not to do so, then we’re not really talking about my theory but rather about something that only superficially resembles it.
This is why the answer I gave you twice before continues to be my answer. Testing whether the constants may have varied throughout time and space is synonymous with testing for the existence of God, because God by definition is the self-collapse of the universal wave-function, which is the causally disconnected choice among a plethora of possible choices or values.
I think I have already responded to this above. I would just be repeating myself, which I am sure you are already tired of me doing anyways.
Sorry for avoiding filling in the blanks. I am not very good at syllogisms.
Yes yes yes. That’s what I meant.
Yes, that is possibly true for them since they are using a philosophical argument for God rather than a scientific one like me.
Actually, the more I think about it, the more I doubt that the prediction I provided is supposed to disprove my theory. As you suggested, we would have to tweak the cosmological constant to see if life of any kind would still be likely to arise even if the cosmological constant were very different.
Instead, it would just refute or weaken the scientific argument for God’s existence.
A precise measurement of 10¹²⁰ refers to a value that is incredibly large, roughly equal to the estimated number of particles in the observable universe. This number is so large that it is difficult to conceptualize, but it gives an idea of the scale of the universe and the challenges involved in studying it.
The cosmological constant is a value used in cosmology to describe the acceleration of the expansion of the universe. It is typically denoted by the Greek letter lambda (Λ). The value of the cosmological constant is usually expressed in units of energy per unit volume of space.
The cosmological constant is said to be “placed at” a certain value when scientists have determined its approximate value through various observations and measurements. The current best estimate for the cosmological constant is around 10⁻²⁹ g/cm³, which corresponds to a vacuum energy density of roughly 10⁻⁹ joules per cubic meter.
The degree of precision refers to the level of accuracy of a measurement or calculation. In the case of the cosmological constant, a high degree of precision means that scientists have been able to determine its value to a very small margin of error.
It is theoretically possible for the value of the cosmological constant to become an astounding value of 10^(10¹²³), but this would require a drastic change in our understanding of the universe and the underlying physics that govern it. This value is so large that it is difficult to imagine how it could occur in reality because no force in the history of cosmology has ever been discovered to be that finely-tuned.
As I mentioned before, finding different values for the cosmological constant in the past would only weaken the scientific argument for God’s existence rather than refuting God’s existence.
My mistake. I just realized that superdeterminism does not even support my theory in the first place because it presupposes a material cause or a mindless force that predetermined our decisions.
If you want an experiment that actually tested Penrose’s interpretation, here is a recent article on it:
Confirming Orch-OR theory’s predictions is what demonstrates free will.
God’s purpose for creating animals and humans the way they are is to survive and reproduce under a particular environment AND fill other environments around the globe.
As a result, we would expect to see a creationary phylogenetic tree that is very similar in form and function to the evolutionary tree.
However, we would also expect to see a creationist tree that traces life back to a number of unrelated populations (i.e. separate created kinds) that roughly resemble the forms of life we see today.
This means that finding examples of family trees based on anatomical features that contradict family trees based on molecular similarities would support the model.
For instance, there are instances of high gene-tree conflict in major clades, which correspond to rate increases in morphological innovation.
The study concluded that there is “an important link between genomic and morphological evolution at deep timescales. We suggest that episodic evolutionary and population events leave signatures of conflict within genomes that may offer important insight on the processes responsible not only for conflict but also for massive changes in phenotype across disparate lineages.” Phylogenomic conflict coincides with rapid morphological innovation | PNAS
This frequently happens at the family level as well:
These results would be expected if Darwinian evolution only explains modest differences among closely related species. On the other hand, the various similarities and differences across plants and animals of widely varying types would be primarily due to a universal common designer reusing the universal common blueprint for common purposes and fresh DNA sequences for innovations.
In other words, “the persistent failure of a single tree of life to emerge makes perfect sense: there is no evolutionary tree of life, because common descent isn’t the case. A common designer is.”
What are you talking about? Horizontal regulatory transfer leads to substantial innovation in one fell swoop, rather than through point mutations or gene duplications passed from parent to offspring.
So it can’t be common descent, because it does not rely on the principles of natural selection and random mutations, which produce a gradual process rather than a saltational one.
I gave you the reason already. The various similarities and differences across plants and animals of widely varying types would be primarily due to a universal common designer reusing the universal common blueprint for common purposes and fresh DNA sequences for innovations.
Moreover, if God’s purpose is to make sure animals survive and reproduce under particular environments around the globe through the process of HRT, then this would naturally produce nested hierarchies by default since we do find that more closely related species are more ecologically similar.
For example, a study published in the journal “Ecology” in 2012 examined the relationship between phylogenetic relatedness and functional similarity in a group of plant species in a grassland ecosystem. The study found that more closely related plant species were more similar in their functional traits, such as leaf area and photosynthetic rate, than more distantly related species. Moreover, phylogenetic relationships influence the strength of species’ interactions, which in turn structures community composition. https://www.pnas.org/doi/10.1073/pnas.1013003108
If this is what you mean by the standard meaning, then yes:
…the key distinction between the origin of life and other ‘emergent’ transitions is the onset of distributed information control, enabling context-dependent causation, where an abstract and non-physical systemic entity (algorithmic information) effectively becomes a causal agent capable of manipulating its material substrate. (https://royalsocietypublishing.org/doi/10.1098/rsif.2012.0869#d1e518)
DNA stores information using four nucleotide bases. The sequences of nucleotides encode information for constructing proteins from an alphabet of 20 amino acids.
After checking back our previous conversations on this point, I think I see what you are really asking for now. You are not asking for a common design model that equally explains the pattern of Hox clusters. Instead, you want a common design model that makes new or different predictions for specifically the pattern of Hox clusters within created kinds. I admit that I don’t have one just yet. All these are for future tasks, I guess.
Your conclusion doesn’t follow from your premise, so what you have there is not an argument at all. Note, by the way, that a phylogenetic tree does not have a function, though it does have a form, i.e. that of a branching tree.
But that contradicts your first expectation. Of course what we see today is not a number of unrelated populations, so your second expectation is not seen.
Again, we have a conclusion that doesn’t follow from the premise.
Without knowing what major clades you’re talking about, there is no way to know if that claim is true. Note, by the way, that “clade” refers to a group related by common descent, so you need some other term to refer to unrelated taxa.
I doubt that you ever looked at the actual publication. Am I correct?
Or that one either. Am I correct?
They would not. Tell me, do the papers you reference provide hypotheses to explain their results?
An unattributed quote that only makes an unsupported claim. Not helpful.
We were, supposedly, talking about genes, not regulatory elements.
Common descent is not logically dependent on natural selection and random mutations.
That’s not a reason. There is no reason to expect a nested hierarchy to arise from the process you describe.
Similarity and nested hierarchy are not the same thing. And what happens to closely related species is not relevant to a discussion of putatively unrelated species. You keep spouting the same gibberish. Stop, rethink, concentrate, and try to make sense of your claims.
…is irrelevant to our disagreement.
No, that’s not the standard meaning at all; it isn’t even a meaning of “genetic code”, and your previous statement makes no sense if that’s the meaning you intend.
Nobody was talking about that, and that doesn’t refer to the genetic code.
I’m afraid that you do not.
No, I want you to explain why your theory, whatever it is, is compatible with the observed distribution of Hox clusters among taxa that you think are independently created kinds. On the other hand, I think it’s clear evidence that all vertebrates are related by common descent, and that their Hox clusters are inherited from a common ancestor.
I’m not sure I can follow your reasoning here. I suggested you put it in a syllogism to make things easier, but alas, you say you are not good with those, and there is little else you offer in their place. Nevertheless, I feel compelled to ask once again:
If it turned out that what’s called the cosmological constant wasn’t a constant after all, but varied over time, how does that imply anything whatsoever about the existence of “the self-collapse of the universal wave function, which is the causally disconnected choice among a plethora of possible choices or values”? There are many non-constant values in nature. Some we used to think were constant at some point. For instance, the Earth-Sun distance. In the Copernican model, it is constant, Earth’s orbit circular. Was testing for that distance’s variation throughout time and space synonymous with testing for the existence of God? If yes, how does the current consensus on that question alter the strength of your God hypothesis? And if no, then what exactly is so special about the cosmological constant that makes testing for its constancy different from testing for the constancy of Earth’s orbital altitude?
In other words, what is the logical link between your God model and the entailments you rendered? How does God’s existence imply a quantity’s constancy? Why would your model’s falsity imply the same quantity’s non-constancy? Some quantities are (by all accounts) constant, and others are not. Is there a way to use your theory to predict which quantities are going to fall into which category, or do we only get to say the constant ones were predicted after confirming their constancy?
At the risk of sounding perhaps more hostile than I mean to be, cloaking the same argument in sciency jargon is not what makes it a scientific argument. Re-defining God into word salad made of said jargon, likewise, does not make arguments in its favour scientific. You point at a constant, assert that it is finely tuned, call your word salad a “theory”, and assert that it “predicted” that the constant would be a constant long after there was any doubt of it, therefore your “theory” is correct. What you call a philosophical fine-tuning argument would be structurally the same: point at a constant, assert that it is finely tuned, say that this can only be because God made it so, therefore God made it so. Techno-babble is the only thing distinguishing your version from the video’s.
Okay, yeah, sure. 10¹²⁰ is a big number… It’s like saying trillion ten times over, like the video did. Now where does “a precise measurement” come in? What was measured here?
Well, it’s a margin of error of about one part in 67. For astrophysics it’s way on the less impressive end of the spectrum, but for physics in general this is pretty good. Not sure where you get the 10⁻²⁹ g/cm³ from, considering Λ has units of inverse area and a value of 1.1 * 10⁻⁵² m⁻², but I do see what the deal with the 10¹²⁰ is, since that’s only two orders of magnitude off from its value in Planck units. That is, its value, not the margins of error, of course. The margins would, in the same units, also be in that ballpark, seeing as the relative error is on the order of 1.5% irrespective of unit choice.
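For anyone who wants to check those ballparks, here is a rough numeric sketch using approximate constants (exact CODATA values don’t change the orders of magnitude); the density conversion at the end is my best guess as to where figures like 10⁻²⁹ g/cm³ come from:

```python
import math

# Approximate constants (SI); good enough for order-of-magnitude checks
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m s^-1
hbar = 1.055e-34   # reduced Planck constant, J s
Lam = 1.1e-52      # observed cosmological constant, m^-2

l_p_sq = hbar * G / c**3              # Planck length squared, ~2.6e-70 m^2
print(Lam * l_p_sq)                   # Lambda in Planck units, ~2.9e-122

# Standard conversion of Lambda to an equivalent vacuum density
rho = Lam * c**2 / (8 * math.pi * G)  # ~5.9e-27 kg/m^3
print(rho * 1e-3)                     # ~5.9e-30 g/cm^3, i.e. of order 10^-29 g/cm^3
print(rho * c**2)                     # ~5.3e-10 J/m^3 as an energy density
```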
Why, thank you, I’ll be sure to use this reference when the time calls for it!
Well, you know, a very recent article actually concluded that Orch-OR was rather implausible under a variety of analyzed cases, assuming at least a first approximation of gravity-related dynamical collapse. That’s not to say there is no wiggle room left for the theory, but the experimental evidence appears, for now, to be rather unfavourable for it. Here is a recent article on it:
Yes, I’m aware. I merely re-used the same link because it amused me that @Meerkat_SK5 would use one so blatantly unfavourable to their conclusion. It demonstrates once again how apathetic they must be to the discussion and subject matter. Even as little as reading the title of the article they had a trusted source link them to was already more effort than they were ready to expend. I felt like the irony of me re-using it right after passive-aggressively criticizing it for being put down in place of the primary source was rather tame in comparison with citing unambiguously negative test results to back up testability claims in the same breath as one would assert fictitious positive results in the hopes of supporting some grander case.
Sorry – I’ve long since gotten to the stage of considering @Meerkat_SK5’s blather far too long and far too low in signal-to-noise ratio to pay much attention to it. The fact that you’d punctured the balloon of his quantum claims, which he’d been repeating ad nauseam for months, did get my attention, however.
Alright, let me make some changes to my hypotheses to properly address this.
Universal common designer hypothesis
A personal agent not only chose the right fine-tuning values for life to exist but also chose the right genetic code for life.
Personal agent: universal self-collapsing genetic code shown by the shared DNA among all living organisms (i.e., objective reduction).
Universal common design hypothesis
All living animals will have a common design that can be traced back to a personal agent. However, the differences between them will be due to the different design requirements that each need for their environment.
Design: To create and develop animals through the process of HRT to survive, reproduce, and pioneer different environments.
Here is why I am defining it this way and you can tell me why this still does not justify defining it that way.
The genetic code is supposed to be the set of rules by which information encoded within genetic material (DNA or RNA) is translated into proteins. These rules define the correspondence between the nucleotide sequence of DNA or RNA and the amino acid sequence of a protein.
Moreover, the genetic code can be thought of as digital information in the sense that it is composed of discrete units (nucleotides) that encode information (amino acids) in a specific sequence. The sequence of nucleotides in DNA or RNA can be seen as analogous to the sequence of 1s and 0s in digital code.
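To make that “set of rules” concrete, here is a minimal illustrative sketch of the mapping, using just a handful of the 64 codons in the standard table (nothing more than an example, not a complete implementation):

```python
# A tiny fragment of the standard genetic code; the full table has 64 codons.
CODON_TABLE = {
    "ATG": "Met",  # also the usual start codon
    "TTT": "Phe", "AAA": "Lys", "GGC": "Gly", "TGG": "Trp",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read a DNA coding sequence three bases at a time and look up each codon."""
    codons = [dna[i:i + 3] for i in range(0, len(dna) - len(dna) % 3, 3)]
    return [CODON_TABLE.get(codon, "???") for codon in codons]

print(translate("ATGTTTGGCTGA"))  # ['Met', 'Phe', 'Gly', 'STOP']
```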
More importantly, the wave-function collapse is a concept in quantum mechanics that describes the collapse of the wave function, a mathematical description of a particle’s state, when it interacts with an observer or a measuring instrument. The collapse of the wave function represents a discrete event, where the quantum system transitions from a superposition of multiple possible states to a single observed state.
Both the genetic code and the wave-function collapse represent discrete moments or units, which justifies defining it as a self-collapsing genetic code.
The genetic toolkit and regulatory evolution hypotheses I told you about are compatible with the observed distribution of Hox clusters among taxa.
For instance, Hox clusters are highly conserved among animal species, with some species having multiple clusters while others have only a single cluster. The distribution of Hox clusters among taxa is believed to reflect changes in the organization and regulation of the genome over evolutionary time.
According to the genetic toolkit hypothesis, changes in the expression or regulation of genes like Hox genes can lead to the evolution of new body plans and structures. Therefore, changes in the number or organization of Hox clusters in different taxa could reflect changes in the toolkit of genes that are used to generate these structures.
Horizontal regulatory transfer can help explain how these regulatory genes have evolved and diversified over time because it predicts that changes in the regulatory regions of genes, such as enhancers or silencers, can lead to changes in gene expression and the evolution of new developmental patterns. Therefore, changes in the number or organization of Hox clusters could reflect changes in the regulatory regions of Hox genes or other genes that interact with them.
For example, if a regulatory element controlling the expression of a Hox gene was transferred from one organism to another, it could potentially lead to the formation of new body plans or the development of new cell types. Over time, these new regulatory elements and the genes they control could be further modified and diversified through additional horizontal transfer events or through the accumulation of mutations.
Horizontal regulatory transfer can therefore contribute to the evolution and diversification of the genetic toolkit that controls cell differentiation and the development of body plans. It provides a mechanism for the acquisition of new regulatory elements and the modification of existing ones, allowing for the evolution of new patterns of gene expression and the emergence of new cell types and body plans.
Overall, both the genetic toolkit and regulatory evolution hypotheses are consistent with the observed distribution of Hox clusters among taxa and provide a framework for understanding the evolution of developmental processes and morphological diversity in animals.
We do. It is just in the form of family trees based on anatomical features that contradict family trees based on molecular similarities.
The taxonomic group examined in the article is the eutherian mammals, which is a higher taxonomic rank than the order level. Eutherian mammals include many different orders, such as Primates, Rodentia, Carnivora, and Artiodactyla, among others. The authors likely examined relationships within and between these different orders in their study.
The authors focus on the example of phyllostomid bats, a diverse family of New World bats that has been the subject of multiple phylogenetic studies using different genetic markers and analytical methods. They discuss how these studies have produced conflicting results, with some methods supporting different relationships among species or even different levels of taxonomic classification.
Yes, the authors discussed several factors that can contribute to phylogenetic incongruence, including incomplete lineage sorting, hybridization, and horizontal gene transfer.
HRT involves the movement of genetic material between unicellular and/or multicellular organisms other than by the (“vertical”) transmission of DNA from parent to offspring (reproduction). This is why it can’t be common descent either.
God is considered to be, by definition, a personal and necessary being. These two attributes sum up all the other attributes of God that have been invoked, such as omnibenevolent, omniscient, omnipotent, etc.
The cosmological constant is the only constant that truly demonstrates one of these attributes because the implications of it apply to ANY possible configuration of a universe or life form. This is fundamentally why our universe is considered to be fine-tuned for life by many physicists.
For instance, according to eternal inflationary theory, the acceleration of the expansion rate from inflation is supposed to be produced from an explosion or collision of quantum fluctuations of particles called the “cosmological constant” that permeates the entire multi-verse, where a billion (plus one) positive particles and a billion negative particles come into existence at once. Finally, these quantum fluctuations emerge from the universal wave-function.
The cosmological constant is placed at a precise measurement of 10¹²⁰, and when scientists trace the expansion back one second after the Planck scale of our universe, the degree of precision becomes an astounding value of 10^(10¹²³).
Hypothetically, this means that if our universe’s expansion rate had different values with larger amounts of dark energy, the sample size of those universes created in the expansion that try to clump together to form planets and stars, where life of any kind might evolve on (or evolve at all), would have most likely blown the cosmic material apart instead. If our universe’s expansion rate had different values with smaller amounts of dark energy, the sample size of those universes created in the expansion would have most likely collapsed back into a singularity before it ever reached its present size.
This suggests that this mind not only exists in all possible worlds or universes but he must exist in them since this conscious agent is the only one that can create and sustain every possible world (i.e. necessary). This means that if this constant were any different, we would not even be able to investigate whether life could have arisen under a different condition or not.
I would also like to point out that this constant reveals several other attributes this cause possesses, such as…
Is causally active everywhere (i.e., omnipresent);
Knows the position and momentum of every particle in the universe at a particular moment (i.e., omniscient);
Has the power to accelerate our universe forever (i.e., omnipotent);
Does not have defined locations in space and time (i.e., eternal);
Intended for any kind of life to survive, reproduce, and pioneer different environments (i.e., conscious).
If you ask, I will explain the logical link of these attributes as well.
I am afraid it is the latter.
It’s interesting that you say this, since the definitions and information I am using to support my theory came from Penrose and Hameroff’s research into their Orch-OR Theory. Here, take a look:
One longstanding view is that the act of conscious observation causes superposition to reduce, or collapse, to classical states, that consciousness causes collapse of the wave function. However, this view, termed the Copenhagen interpretation after the Danish origin of Neils Bohr, its early proponent, fails to consider the underlying reality of superposition, and puts consciousness outside science. But rather than consciousness causing collapse, as in the Copenhagen interpretation, Sir Roger Penrose has taken the opposite approach, suggesting that collapse causes consciousness (or is consciousness), a process in fundamental spacetime geometry, the fine scale structure of the universe, each OR event a qualia moment of subjective experience.…
… In the Copenhagen interpretation, postcollapse states selected by conscious observation are chosen randomly, probabilistically (the Born rule, after physicist Max Born). However in Penrose OR the choices (and quality of subjective experience) are influenced by-resonate with- what Penrose called noncomputable Platonic values embedded in the fine scale structure of spacetime geometry. These Platonic values, patterns, or vibrations in the makeup of the universe, may encode qualia, and pertain to mathematics, geometry, ethics, and aesthetics, and the 20 or so dimensionless constants governing the universe. These include the fine structure constant, the mass ratios for all fundamental particles, the gravitational constant and many more, all precise to many decimal points. If these numbers were slightly different, life and consciousness -at least as we know them- would be impossible. [emphasis added] Ch20-9780124201903_aq 1…1 (galileocommission.org)
The phrase “a precise measurement of 10¹²⁰” refers to a hypothetical value that is incredibly large, roughly equal to the estimated number of particles in the observable universe. This number is so large that it is difficult to measure or conceptualize directly.
However, it’s important to note that this value is not the result of an actual measurement, but rather a theoretical calculation based on various assumptions and estimates about the size and composition of the universe.
Therefore, I acknowledge that the phrase “a precise measurement of 10¹²⁰” is somewhat misleading, as it implies that a direct and accurate measurement has been made, when in fact it is a theoretical estimate based on current scientific knowledge.
I knew about this already. The reason why I sent it anyway was because it refuted a different model than the one Penrose proposed. See the fine print…
“While both Penrose and Diósi arrived at the same simple formula for the timescale over which this type of collapse would occur, their individual models differ. Penrose did not specify the dynamics of wavefunction collapse, whereas Diósi provided a full dynamical description. In doing so Diósi predicted that collapse should be accompanied by the emission of electromagnetic radiation – generated by charged particles within the system as they undergo a continuous Brownian motion related to the collapse mechanism.”
For more, I encourage you to watch the interview where his collaborator Hameroff explains why it does not refute it but actually supports it. He speaks about it at the 1 hour mark of the video:
You present a set of unrelated truisms: a correct if long-winded definition of “genetic code” and the wondrous wisdom that a wave-function collapse is the collapse of a wave function. Together they mean nothing.
You address nothing. There is no reason to expect design requirements to fit a nested hierarchy.
They are not. And again you spout pointless verbiage, combining the obvious with the nonsensical to no result.
“Highly conserved” makes sense only in a phylogenetic context that you reject.
Yet another non sequitur. And the next paragraph is yet another.
This suggests that you have no idea what a Hox cluster is or what it means to have several of them. You are merely embarrassing yourself here.
This too shows that you have no idea what you’re talking about.
So you admit that you haven’t read the paper.
That doesn’t answer my question. You haven’t read the paper, have you?
All of that is perfectly within the bounds of common descent.
Why what can’t be common descent?
I can’t figure out whether you really think you’re making an argument or are doing your best to obfuscate.
Not true. It is based on observations that you still did not address. Here it is again…
Every living creature on Earth uses the same code: DNA stores information using four nucleotide bases. The sequences of nucleotides encode information for constructing proteins from an alphabet of 20 amino acids. Why were these numbers chosen rather than some other numbers?
Patel showed how quantum search algorithms explain why these numbers were chosen. [22]
To summarize, if the search processes involved in assembling DNA and proteins are to be as efficient as possible, the number of bases should be four and number of amino acids should be 20.
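For readers wondering where the 4 and the 20 come from, my understanding is that Patel’s argument rests on the standard Grover-search relation between the number of oracle queries and the largest set that can be searched with certainty; a minimal sketch:

```python
import math

def grover_set_size(q):
    """Largest number of items N that a Grover-type quantum search can
    identify with certainty using q oracle queries, from the condition
    (2q + 1) * arcsin(1 / sqrt(N)) = pi / 2."""
    return 1.0 / math.sin(math.pi / (2 * (2 * q + 1))) ** 2

for q in (1, 2, 3):
    print(q, round(grover_set_size(q), 1))
# 1 -> 4.0   (four nucleotide bases)
# 2 -> 10.5
# 3 -> 20.2  (roughly the 20 amino acids)
```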
An experiment has revealed that this quantum search algorithm is itself a fundamental property of nature. [23]
From this, we can infer that a conscious agent chose those numbers among many possible numbers to develop organisms (i.e., self-collapsing genetic code).
What is your response?
It is not the design requirements themselves that predict nested hierarchies, but the purpose or teleology behind them. If the purpose is to make sure groups of created kinds fit and fill different environments around the globe, we would expect nested hierarchies.
We would expect to see nested hierarchies emerge when similar parts and functions are adapted to fit and fill different environmental niches. This is because the process of adaptation to different niches often involves the modification and specialization of existing traits, rather than the evolution of entirely new ones.
As a result, we can often trace the evolutionary history of a group of organisms by examining the nested hierarchy of shared traits that reflect their common design. For example, the nested hierarchy of shared anatomical features and genetic sequences provides evidence for convergent evolution. The resulting similarities in structure and function can create nested hierarchies that reflect convergent evolution across multiple lineages.
Overall, the use of similar parts and functions to fit and fill different environmental niches can create nested hierarchies that reflect convergent evolution across multiple lineages, providing a powerful framework for understanding the history and diversity of life on Earth.
I guess I am going to have to rely on Stuart Hameroff’s model:
It suggests that neurons and other cells fused together via gap junctions, which allowed chemical signaling to occur at synapses between axons, dendrites, and somas. This chemical signaling was facilitated by the unique arrangement of microtubules within the cells, which allowed for optimal integration, recurrent information processing, interference beats, and orchestration of Objective Reduction or self-collapse to happen.
As neurons formed networks and reached a critical point of around 10¹¹ tubulins in roughly 300 neurons, or axonemes in simple worms and urchins, the time interval between quantum events (t) became brief enough to avoid random interactions, which prompted the Cambrian evolutionary explosion.
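The threshold being appealed to here comes from Penrose’s objective-reduction criterion τ ≈ ħ/E_G, relating the collapse time to the gravitational self-energy of the superposition; a minimal sketch, with the E_G value below chosen purely for illustration rather than taken from Hameroff’s papers:

```python
hbar = 1.055e-34  # reduced Planck constant, J s

def or_timescale(E_G):
    """Penrose objective-reduction timescale: tau ~ hbar / E_G."""
    return hbar / E_G

# Purely illustrative: a gravitational self-energy of ~4.2e-33 J gives
# tau ~ 25 ms, the gamma-frequency timescale often quoted in Orch-OR discussions.
print(or_timescale(4.2e-33))  # ~0.025 s
```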
This creation process of the same four Hox clusters continued to happen for created kinds.
Now, it is up to you: is this not a possible scenario?
What makes you say that? And why is this relevant to your objection? Explain.
According to Theobald, transmutation can be confirmed through phylogenetics if morphology and molecular sequences are congruent.
He also pointed out that family trees based on anatomical features that contradict family trees based on molecular similarities would falsify the claim that nested hierarchies are universal.
This is why I quoted “the persistent failure of a single tree of life to emerge makes perfect sense: there is no evolutionary tree of life, because common descent isn’t the case. A common designer is.”
Yes, I read the first one and the other one I just read the abstract because of the paywall.
The process of evolution by the mechanism of HGT, which includes HRT and BGT.
Again, vertical and horizontal transfer are two different processes.
Please, explain. What does “a precise measurement of 10¹²⁰” mean? How is the cosmological constant “placed at” it? What do you mean by degree of precision, and how can it “become an astounding value of 10^(10¹²³)”? What do these statements mean in practice, i.e. how would we go about testing them?
I shall not. I asked you to explain the logical link between your “theory” and the “predictions” you rendered. Then I asked again. Then I asked again. You elected to never do this. I have no interest in addressing your philosophical ramblings, not after you said that your brand of fine-tuning was intended to be scientific instead.
Very well. Let the record show, then, that, despite your insistence on this being a scientific discourse, the one thing that is the point and purpose of scientific models is the one thing you admit yours is unsuited for.
You already pasted that.
Fine. What is it a theoretical calculation of then? We both agree it’s not the value of the cosmological constant, since we both agree it actually has dimension and is no raw number like that. What is 10¹²⁰ the value of, then? What’s been calculated here, aside from “a big number”?
An estimate of what? And based on what knowledge, exactly? I’m not even asking you to show me the math or anything. I’m merely asking to state what on earth you are even talking about.
It refers to a hypothetical scenario where the value of the cosmological constant is known with extremely high precision. The cosmological constant is a fundamental parameter in cosmology that represents the energy density of the vacuum of space and plays a key role in determining the expansion rate of the universe.
A precise measurement of 10¹²⁰ would imply that the value of the cosmological constant is known to an incredible degree of accuracy, with an uncertainty of only 1 part in 10¹²⁰. This level of precision would require significant advances in observational and theoretical techniques, and would likely require the use of next-generation telescopes and other advanced instrumentation.
The cosmological constant is not “placed” at a value of 10¹²⁰ or any other specific value. The value of the cosmological constant is determined through a combination of observational and theoretical methods.
In practice, scientists can test the value of the cosmological constant through a variety of observational and theoretical methods. Observationally, they can use data from cosmic microwave background radiation, the large-scale distribution of galaxies, and the brightness and redshift of distant supernovae to infer the value of the cosmological constant. Theoretical models of the universe, including the equations of general relativity, can also be used to predict the value of the cosmological constant based on various assumptions about the nature of dark energy and the vacuum of space.
To improve the precision of the measurement, scientists are constantly developing and refining their observational and theoretical techniques. This includes using advanced telescopes and other instrumentation to gather more precise data, as well as developing new theoretical models that take into account the latest developments in fundamental physics.
But, my particular hypothesis and predictions come from a biological context. Let me summarize the argument again.
Based on the well-tested and supported quantum mind theory, the definition of consciousness is a self-collapsing wave function, or “causally disconnected choice”. Now, here is the formation of the hypothesis:
Observations
The natural engineering principles of electron tunnelling in biological oxidation-reduction involve optimizing the tunneling distance and energy barriers, organizing the electron transport chain, maintaining quantum coherence, and creating a robust system that can operate in a variety of conditions. These principles are essential for the efficient transfer of electrons in biological systems, which is crucial for many biological processes, including cellular respiration and photosynthesis.
Patel showed how quantum search algorithms explain why the same code was chosen for every living creature on Earth. [22] To summarize, if the search processes involved in assembling DNA and proteins are to be as efficient as possible, the number of bases should be four, and the number of amino acids should be 20. An experiment has revealed that this quantum search algorithm is itself a fundamental property of nature. [23]
We can infer that a personal agent chose not only the right fine-tuning values but also the right genetic code to create and develop life on Earth.
Definition of personal agent: universal self-collapsing genetic code shown by the shared DNA among all living organisms (i.e., objective reduction).
Predictions
We should find non-random mutations in the non-coding regions of the genome, which have been labeled “junk DNA”, and in regions that do encode proteins but are primarily deleterious.
Both predictions have been and continue to be confirmed:
For strictly physics, yes, but not for biology.
It is a theoretical calculation of the value of dark energy.
The value of dark energy has been estimated through a combination of observational and theoretical methods, with the current best estimate being around 10⁻³⁵ s⁻², with an uncertainty of about 10%. This estimate is not related to the number 10¹²⁰ mentioned in the original question.
There are some theoretical models of the universe that predict much larger values of the cosmological constant, with some estimates reaching up to 10¹²³ or higher. However, these predictions are based on various assumptions and are subject to ongoing testing and refinement through observations and theoretical modeling.
The cosmological constant is a parameter in the equations of general relativity that represents the energy density of the vacuum of space. Its value has been estimated based on a combination of observational and theoretical methods.
Observationally, the value of the cosmological constant has been inferred from the observed accelerated expansion of the universe. This acceleration is thought to be caused by a repulsive force associated with dark energy, which is often identified with the cosmological constant. Observations of the cosmic microwave background radiation, the large-scale distribution of galaxies, and the brightness and redshift of distant supernovae have all been used to constrain the value of the cosmological constant.
Theoretically, the value of the cosmological constant has been predicted based on various assumptions about the nature of dark energy and the vacuum of space. For example, some models propose that the cosmological constant arises from the energy associated with virtual particles that briefly pop in and out of existence in the vacuum of space. Other models propose that the cosmological constant is related to the properties of the Higgs field, which gives mass to other particles.
Combining these observational and theoretical constraints, the current best estimate of the value of the cosmological constant is around 10⁻³⁵ s⁻², with an uncertainty of about 10%.
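For what it’s worth, one possible way (an assumption on my part, since the text never says so) to connect this 10⁻³⁵ s⁻² figure to the Λ ≈ 1.1 × 10⁻⁵² m⁻² quoted elsewhere in the thread is to read it as Λc², since multiplying an inverse area by c² gives an inverse time squared:

```python
c = 2.998e8        # speed of light, m/s
Lam = 1.1e-52      # cosmological constant, m^-2
print(Lam * c**2)  # ~9.9e-36 s^-2, i.e. of order 10^-35 s^-2
```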
It seems very likely that @Meerkat_SK5 has recently decided to run every reply through ChatGPT. I don’t think he’s even writing these himself. Note how everything begins by restating the obvious.