Somewhere in the multiverse, dark energy is helping stars and life form

The article said:
"But dark energy isn't the only parameter in the universe that seems friendly to support life. The speed of light, the charge of an electron, the mass of a proton, the strength of gravity and the strong force, among others, all seem finely balanced. Were their values even slightly different, the universe would be a changed and more inhospitable place, with stars and life unable to form and develop. What’s more, we don't know why they have the values that we measure. What we do know is that if they didn't have those values, then we wouldn’t be here to measure them."

If we take the speed of light, for example: given a spherical shape (for the universe) it can only be what it is - a ratio of 1 distance to 1 time. A very fine balance.
Maybe there's a lesson there.
 
It is an anthropic-multiverse update, but like earlier works it is still 2-3 orders of magnitude away, and the authors note that their model isn't fully conclusive and that they need better models. They concern themselves with strict but arguable hypothesis testing, yet the theory is good enough compared to the alternatives (such as unlikely fine-tuning). The arguable limit here could be 5 sigma for tension, as used in Hubble-rate estimates.

The original paper noted that dark energy has a high prior, as it should be around the Planck energy, but that the habitability requirement pushes it down about 120 orders of magnitude and not much more. This is against a theoretical background of an infinite number of lower energy scales.

If we take the speed of light, for example: given a spherical shape (for the universe) it can only be what it is - a ratio of 1 distance to 1 time. A very fine balance.
That is not how it looks when you figure big bang space expansion into it.
https://en.wikipedia.org/wiki/Scale_factor_(cosmology)

And it has nothing to do with fine-tuning.

This is all Fairy Tales for Physicists.
If it were, they couldn't do science in it. The anthropic multiverse has been researched since the 80s, and this paper shows the area advancing.
 
The argument goes: the current ratio of time to distance being 1:1 is for the reference time \(t_0\), meaning we're looking at how things are today. But as you go back in time, the scale factor \(a(t)\) is assumed to be less than 1. Distances between objects were smaller, and it is assumed that time ran as it does now, so the ratio of time to distance wasn't always 1:1.
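For reference, the standard relation behind that argument (a minimal sketch, using the usual normalisation that the scale factor equals 1 today) is

\[ d(t) = a(t)\, d(t_0), \qquad a(t_0) = 1, \qquad a(t) < 1 \;\text{for}\; t < t_0, \]

so a proper distance measured today was smaller by the factor \(a(t)\) at earlier times, while the time coordinate is taken to run as it does now.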

The error is the assumption that time runs parallel to space (as it might in some flat-universe models). If the universe is closed and time was still orthogonal to space, then the 1:1 ratio applied previously just as it does now, with space and time proportionally 'moving together'.
The speed of light is just a marker/symptom of expansion. Measured against a higher-dimensional imaginary time, the development of the universe may well speed up or slow down, but our ratio of time to distance remains as it is, i.e. the speed of light in our spacetime has always been the same.

Unless you free yourself from the general, somewhat lazy idea that time is a mysterious thing not understood, and start thinking outside the box, these ideas will not be understood. In many posts here I have suggested exactly what time is. To understand it, it is necessary (by analogy) to think of stuff in at least four spatial dimensions as a starter, and of our special 4D spacetime as an n-sphere.

Admittedly, the numbers governing the balance of an open versus closed universe are close and debated; however, the reality of time in a closed universe is ignored by many cosmologists.
 
And it has nothing to do with fine-tuning.
If you refer to my comment, I did not mean it in the sense that a margin of error was relevant; I was referring to the numbers required to 'close' the universe. My poor wording and lack of explanation.
My comments above should clarify and explain why I believe we are being led by the nose up the garden path with our eyes closed :)
 

Jzz

This is a very interesting discussion indeed. Studies such as D. A. Howell's paper "The Type Ia Supernova Rate" (Nature, 2001) discuss the frequency of Type Ia supernovae in galaxies. I am not entirely conversant with the contents of that paper, but on the other hand, using a fairly simple calculation based on the rate of Type Ia supernovae in the Milky Way, a fairly accurate estimate can be made of the frequency of Type Ia supernova events in the Universe:

Rate of Type Ia supernovae in the Universe ≈ (1 supernova per galaxy per 500 years) × 100 billion galaxies = 100 billion / 500 ≈ 200 million supernovae per year. This seems to be extremely good news for the proponents of a Universe that is expanding at superluminal rates. While some 200 million Type Ia supernovae may occur annually in the observable Universe, not all of them will be detectable by human astronomers, and only a fraction will be observable with spectroscopic analysis.
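A quick back-of-the-envelope check of that figure (a minimal Python sketch; the per-galaxy interval and the galaxy count are the round assumed values from the calculation above, not measured survey results):

# Rough Type Ia supernova rate for the observable Universe.
# Assumed round numbers: one Type Ia per galaxy per ~500 years,
# ~100 billion galaxies in the observable Universe.
per_galaxy_interval_yr = 500        # years between Type Ia events in a single galaxy
n_galaxies = 100e9                  # rough galaxy count

rate_per_year = n_galaxies / per_galaxy_interval_yr
print(f"~{rate_per_year:,.0f} Type Ia supernovae per year")  # ~200,000,000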

Due to brightness and distance limitations, supernovae in galaxies closer to us (within a few hundred million light-years) are much more likely to be detected spectroscopically than those at greater distances. For instance, supernovae in the Local Group of galaxies (including the Milky Way and nearby galaxies like Andromeda) are the most likely to be observed in detail. As of recent surveys, about 50-100 Type Ia supernovae are detected spectroscopically per year. This number is far lower than the total estimated above because spectroscopic follow-up requires significant resources, and not all detected supernovae are bright enough or close enough to be observed in detail.

Spectroscopic analysis is usually not the initial method used for detecting supernovae. Instead, it’s typically done after a supernova has been spotted using photometric methods (color filters). Once a transient event (like a supernova) is detected, telescopes with spectrographs follow up to obtain detailed spectra. Out of the 10-20 supernovae followed up spectroscopically, only those within a few hundred million light-years are likely to have high enough brightness and resolution for accurate validation and classification.

Typically, the number of supernovae validated spectroscopically will be lower than the initial follow-up count because many supernovae will be too faint or distant for a high-quality spectrum.

So how does this affect Dark Matter? If dark energy were not present, dark matter would represent approximately 84.4% of the total matter in the Universe.

Percentage of Dark Matter = Dark Matter / Total Matter × 100 = 27/32 × 100 ≈ 84.4%
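As a quick numerical check (a sketch using the usual rounded budget of roughly 27% dark matter and 5% ordinary matter in the total energy content):

# Matter budget with dark energy excluded.
dark_matter = 27.0       # percent of the total energy content (rounded)
ordinary_matter = 5.0    # percent of the total energy content (rounded)
total_matter = dark_matter + ordinary_matter

print(dark_matter / total_matter * 100)  # ~84.4 percent of all matter is dark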

Note that using the AND Theory, the energy content would be significantly affected, giving an estimate for Dark Matter of > 84.4%.
 

Jzz

I have opted for a new post instead of editing the earlier one, which was already quite lengthy. A Type Ia supernova typically reaches peak brightness over a period of 10 to 23 days. The HST needs an exposure on the order of 50 days to record light from objects two or three billion light-years distant. The point is: can such sweeping statements as "the universe is expanding at FTL speeds" be so widely accredited? Is it good science to credit such an important process as having a factual basis on what amounts to manufactured evidence? So much so that it can lead more factually based evidence, such as the existence of dark matter and the percentage of it present in the Universe, to a false result? It is not morally correct. While objects that are billions of light-years distant may be accurately depicted, the pinpointing of Type Ia supernovae at those distances is dubious in the extreme, and to base cosmic expansion theories on such non-existent evidence needs a rethink.
 
'Aethereal fine' coarse grain chunky from 'aethereal fine' smooth as silk, to 'aethereal fine' smooth as silk from 'aethereal fine' coarse grain chunky . . . on and on, offset parallel! From Chaos Theory, the coarse grain chunky plane is fundamentally in the smooth as silk plane though topping, the smooth as silk fundamentally in the coarse grain chunky plane though topping (sic). The Trojan dimension to the fundamental binary base2 'set' opposing equality: "aethereal fine" ('1') ('Unity').
 
