'Standard model' of cosmology called into question by new measurements

The space.com report calls attention to an important point here.

"There has been debate over whether this problem lies in the model itself or in the measurements used to test it," Braatz added. "Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem."

My note: plugging H0 = 74 into the cosmology calculator, http://www.astro.ucla.edu/~wright/CosmoCalc.html, gives a flat-universe age of 12.905E+9 years and an open-model age of 10.746E+9 years. Much is at stake with these different H0 values now. The Hubble time for the age of the universe can be reduced quite a bit from the conventional 13.8E+9-year age commonly reported.
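The flat-universe number can be checked in a few lines using the analytic age formula for flat ΛCDM. This is a sketch under stated assumptions: a matter density Ωm ≈ 0.286 (the calculator's default), with the exact digits depending on that choice.

```python
import math

def flat_lcdm_age_gyr(h0, omega_m=0.286):
    """Age of a flat LCDM universe in Gyr, via the analytic form
    t0 = 2 / (3 H0 sqrt(OL)) * asinh(sqrt(OL / Om))."""
    omega_l = 1.0 - omega_m                 # flatness: densities sum to 1
    hubble_time_gyr = 977.8 / h0            # 1/H0 in Gyr when H0 is in km/s/Mpc
    return (2.0 / 3.0) * hubble_time_gyr / math.sqrt(omega_l) \
        * math.asinh(math.sqrt(omega_l / omega_m))

print(round(flat_lcdm_age_gyr(74.0), 2))         # ~12.91 Gyr, matching the calculator
print(round(flat_lcdm_age_gyr(67.4, 0.315), 2))  # ~13.8 Gyr for Planck-like parameters
```

This makes the stake concrete: moving H0 from ~67 to 74 km s^-1 Mpc^-1 shaves close to a billion years off the inferred age.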

"Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision," said Braatz. (ref: https://phys.org/news/2020-06-distance-bolster-basic-universe.html )

Looks like the LCDM model used in BB cosmology needs more constraints now, including the age of the universe, the Hubble time.
 
The word "acceleration" wasn't used in the article. With an accelerating universe, wouldn't the redshifts indicate today that we are expanding faster than we were in the past, so that an effective expansion rate would be less after all, right?
 
The word "acceleration" wasn't used in the article. With an accelerating universe, wouldn't the redshifts indicate today that we are expanding faster than we were in the past, so that an effective expansion rate would be less after all, right?
I agree!
But that's the key point that supports the "Dark Matter" cult...
I find it funny that it is easier to believe in Dark Matter than that the Universe (or part of it) is retracting into the original singularity (Big Bang or multiple Big Bangs)...
 
To the naive reader this comes across as a somewhat weak paper from the Riess collaboration, the collaboration that pushes "the standard model needs revision" based on thin gruel. For one, the work has low statistical power, resting on only a few galaxies. For another, it is a really complex model that they improve on only slightly.

But foremost, these galaxies are really close and so are not telling us much.

Meanwhile, as recently mentioned here, the “Most Precise Tests of Dark Energy and Cosmic Expansion Yet Confirm the Model of a Spatially Flat Universe” and also promise to explain what is going on [ https://scitechdaily.com/most-preci...nfirm-the-model-of-a-spatially-flat-universe/ ]:

“The study makes use of data from over a million galaxies and quasars gathered over more than a decade of operations by the Sloan Digital Sky Survey.

The results confirm the model of a cosmological constant dark energy and spatially flat Universe to unprecedented accuracy, and strongly disfavour recent suggestions of positive spatial curvature inferred from measurements of the cosmic microwave background (CMB) by the Planck satellite.”

“We see tentative evidence that data from relatively nearby voids and BAO favour the high Hubble rate seen from other low-redshift methods, but including data from more distant quasar absorption lines brings it in better agreement with the value inferred from Planck CMB data.”

That is statistical support from 6 galaxies within z = 0.03 vs. 1,000,000+ galaxies within z = 2.3 (or half the universe's age).

The new void measurement gives the current expansion rate as 69 +/- 1.2 km s^-1 Mpc^-1, i.e. less than 2 % uncertainty. It uses BAO as a ladder-independent distance ruler: "BAO matter clustering provides a 'standard ruler' for length scale in cosmology." [ https://en.wikipedia.org/wiki/Baryon_acoustic_oscillations ] That also means the rate likely – though not assuredly – lies at or below 70 km s^-1 Mpc^-1, which IIRC is the limit above which the Planck collaboration suspected that we would need new physics.
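The standard-ruler idea can be sketched in a few lines: the BAO feature has a known comoving size (r_d ≈ 147 Mpc in ΛCDM fits), so the angle it subtends on the sky yields a distance with no ladder involved. The angle below is a made-up illustration, not a measured value.

```python
import math

R_DRAG_MPC = 147.0  # comoving BAO scale (sound horizon at the drag epoch), LCDM fit value

def comoving_distance_from_bao(theta_deg, r_drag_mpc=R_DRAG_MPC):
    """Standard ruler: a feature of known comoving size r_d subtending
    angle theta sits at comoving distance D_M = r_d / theta (small angles)."""
    return r_drag_mpc / math.radians(theta_deg)

print(round(comoving_distance_from_bao(4.5)))  # ~1872 Mpc for a hypothetical 4.5-degree feature
```

The key design point is that r_d is fixed by early-universe physics, so the inferred distance never touches Cepheids, supernovae, or any other ladder rung.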

The new maser measurement is twice as imprecise, on top of the low statistical support. And it still uses local distance-ladder-type measurements, which we know have 5-15 % errors for nearby galaxies [ https://en.wikipedia.org/wiki/Cosmic_distance_ladder ]. That seems to be the problem.

We'll see how further measurements and reception among involved scientists turn out. But the void measurement gives me hope that we may not have to wait for long until the question is more satisfactorily resolved.
 
"Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision," said Braatz. (ref: https://phys.org/news/2020-06-distance-bolster-basic-universe.html )

Looks like the LCDM model used in BB cosmology needs more constraints now, including the age of the universe, the Hubble time.

The current open question is a tension between the mostly low-z local-universe observations of the current expansion rate and the mostly high-z distant-universe observations. The integrative observations over several measurement types also scatter over that tension, but there is a tendency here too for the global, integrative observations to prefer a lower expansion rate. The local-universe measurements often rely on the cosmic distance ladder, which is an iffy construction - see more in my longish comment here.

This happens in a context where - before the supernova distance ladder measurements, and later the cosmic background spectrum observations - there was an open question about which type of cosmology applied, and the universe-age estimates differed by a factor of 2 [!]. For the younger ages there were stars that, from astrophysical modeling, appeared to be older than the universe, which is a much more serious tension. As I often note, LCDM cosmology solved these problems and is self-consistent - apart from this single remaining open tension, which did not appear until the last few years at that. Historically, as in similar cases such as measuring the universal speed limit, early observations are iffy and carry a lot of bias. (C.f. how, famously, the modern value of the universal speed limit lies outside the initial error range.) See the image here: https://sci.esa.int/web/planck/-/60504-measurements-of-the-hubble-constant .

The simplest explanation is that something similar is going on here, though we should of course mind that there are other explanations. The current most precise and statistically best supported expansion rate measurement - see my longish comment - indicates that the expansion rate "likely – though not assuredly – lies at or below 70 km s^-1 Mpc^-1, which IIRC is the limit where the Planck collaboration suspected that we needed new physics." Only if we get a higher rate would LCDM need to be modified and, yes, become less simple. The age of the universe would not move so much that stars would seem older than the universe, I think - LCDM is "morally" right (simple but explaining so much; having no real contenders), even if details are under scrutiny.
 
The word "acceleration" wasn't used in the article. With an accelerating universe, wouldn't the redshifts indicate today that we are expanding faster than we were in the past, so that an effective expansion rate would be less after all, right?

Technically the H0 [H_0] Hubble constant stands for the expansion rate at the current time [t = 0 in the astronomer's frame of reference, i.e. "now"]. It applies in the equation v = H_0*D, where D is the "proper distance" to a sufficiently distant galaxy you look at and v is the separation speed between the Milky Way and that galaxy [ https://en.wikipedia.org/wiki/Hubble's_law ].

When we apply that to the universe, we find that the expansion of a general relativistic universe depends on the inner state of the universe, so the Hubble "constant" is actually a function H(a), where a = the scale factor of the expansion [ https://en.wikipedia.org/wiki/Lambda-CDM_model#Cosmic_expansion_history ]. The current expansion behavior is very simple since dark energy, a constant energy density, dominates the inner state, and we get an exponential expansion since H(a) = H_0.
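A sketch of that function for a flat ΛCDM universe; the density parameters below are illustrative round numbers, not fitted values:

```python
import math

def hubble_rate(a, h0=70.0, omega_m=0.3, omega_r=9e-5):
    """H(a) for flat LCDM: H(a) = H0 * sqrt(Or/a^4 + Om/a^3 + OL),
    with OL chosen so the density parameters sum to 1 (spatial flatness)."""
    omega_l = 1.0 - omega_m - omega_r
    return h0 * math.sqrt(omega_r / a**4 + omega_m / a**3 + omega_l)

print(round(hubble_rate(1.0), 6))  # 70.0 today (a = 1), by construction
print(round(hubble_rate(100.0)))   # ~59: H(a) flattens toward H0*sqrt(OL), the exponential era
```

The a^-4 term is the radiation stretching, the a^-3 term is the "thrown ball" matter dilution, and the constant term is dark energy; which one dominates sets the expansion behavior of each era.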

I think that exponential is what you call the "effective expansion rate". Astronomers take great care in extracting H_0 out of the former expansion history. But of course, if they don't understand the differing behaviors sufficiently, they will run into ambiguities.

[The essential behavior is not general relativistic but the Newtonian physics of "a thrown ball" - the inner state of a matter-dominated, galaxies-"only" universe approximates a classical throw parabola. That was in fact the expansion behavior after the Dark Ages and before the dark energy domination we live in now. Before that you had of course radiation domination, which differed somewhat (since radiation stretches when the universe expands), and further back another exponential expansion physics, just a much faster one - inflation. I recommend Susskind's first Stanford MOOC for a graduate-level description.]
 
Maybe the problem is in the gravitational portion that their "standard model" doesn't include.

Is the "standard model" of cosmology the same as the "standard model" of particle physics?

The first question is hard to respond to, because if new physics is needed (and we don't know that) it affects the general relativistic physics that our cosmology is built on. The gravitational portion is built in, and changes to it are more difficult to envision than changes to the current components. For example, different expansion rates may depend on new, temporary matter particles in the early universe (and see my response to Helios that describes some of the physics behind expansion rates).

The terminology is appealing to cosmologists, who cover both, since high-energy physics implies particle physics to various extents. There are other names, but essentially they describe theories that are well tested today but which have some obvious or threatening remaining tensions between observations and/or theory - the expansion rate for the LCDM "standard model" (statistically significant) and lepton flavor universality for the Higgs "standard model" [ https://atlas.cern/updates/physics-briefing/addressing-long-standing-tension-standard-model ] (not sufficiently significant, just long-standing).
 
But that's the key point that supports the "Dark Matter" cult...

I find it funny that it is easier to believe in Dark Matter than that the Universe (or part of it) is retracting into the original singularity (Big Bang or multiple Big Bangs)...

Key point? "Cult"?

"... offers a comprehensive explanation for a broad range of observed phenomena, including the abundance of light elements, the cosmic microwave background (CMB) radiation, large-scale structure, and Hubble's law ... now universally accepted." [ https://en.wikipedia.org/wiki/Big_Bang ]

Singularity?

View: https://www.youtube.com/watch?v=P1Q8tS-9hYo
 
FYI, some here are hand waving against the new values for H0. 'The Megamaser Cosmology Project. XIII. Combined Hubble Constant Constraints', https://iopscience.iop.org/article/10.3847/2041-8213/ab75f0 : "Abstract: We present a measurement of the Hubble constant made using geometric distance measurements to megamaser-hosting galaxies. We have applied an improved approach for fitting maser data and obtained better distance estimates for four galaxies previously published by the Megamaser Cosmology Project: UGC 3789, NGC 6264, NGC 6323, and NGC 5765b. Combining these updated distance measurements with those for the maser galaxies CGCG 074-064 and NGC 4258, and assuming a fixed velocity uncertainty of 250 km s^−1 associated with peculiar motions, we constrain the Hubble constant to be H_0 = 73.9 ± 3.0 km s^−1 Mpc^−1 independent of distance ladders and the cosmic microwave background. This best value relies solely on maser-based distance and velocity measurements, and it does not use any peculiar velocity corrections. Different approaches for correcting peculiar velocities do not modify H_0 by more than ±1σ, with the full range of best-fit Hubble constant values spanning 71.8–76.9 km s^−1 Mpc^−1. We corroborate prior indications that the local value of H_0 exceeds the early-universe value, with a confidence level varying from 95% to 99% for different treatments of the peculiar velocities."

There is a real problem with measuring H0 using the CMB to support inflation and LCDM and then comparing with H0 values determined by other methods in astronomy today. The Hubble time is very different too, as I showed, even using the flat universe model. This is the real problem in cosmology when it comes to the different values reported for H0; it has been a problem since the days of Edwin Hubble.
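The 95%–99% confidence quoted in the abstract can be sanity-checked with a simple Gaussian tension estimate. The Planck-like comparison value and its error below are my own round numbers for illustration, not figures from the paper:

```python
import math

def tension_sigma(v1, s1, v2, s2):
    """Tension between two independent Gaussian measurements, in sigma:
    |v1 - v2| / sqrt(s1^2 + s2^2)."""
    return abs(v1 - v2) / math.hypot(s1, s2)

# Maser H0 = 73.9 +/- 3.0 vs an assumed CMB-like H0 = 67.4 +/- 0.5:
print(round(tension_sigma(73.9, 3.0, 67.4, 0.5), 1))  # ~2.1 sigma
```

A ~2σ Gaussian tension sits in roughly the 95% band, consistent with the lower end of the confidence range the authors quote.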
 
Quoting from the article - James Braatz (of the National Radio Astronomy Observatory):

"There has been debate over whether this problem lies in the model itself or in the measurements used to test it," Braatz added. "Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem."

(Say it ain't so, James!!) :)

Just to pile on, here is more of the same from another group (previously posted by me from another thread). These recent calculations based on direct observations are not new, just the newest. Let's not forget what those people from the University of Chicago also found with data from HST:


So with data from "direct observations", the Hubble Constant is coming in around 70-74ish. Likely this range will narrow with more observations. At any rate, it seems likely that the direct empirical data from observatories should be more precise than the previous calculations relying on 'you-know-what'. (That is how it works in most of the other sciences, at least.)

It would appear that some people need to get out of their boxes and see if there is anything new to consider.

Looks like the CMBR data needs some more re-re-re-evaluation. Again. When will it ever end?!

With the observational data, it would seem. Bring on the telescopes!!
 
For an excellent overview on the current state of the theories about the BB, and dark matter and energy, take some time and carefully read through the story below*. It is directly related to the story of this thread, and a whole lot more.

It is written by Dan Hooper; a brief on him, stolen from Wiki, is posted below:

Daniel Wayne Hooper is an American cosmologist and particle physicist specializing in the areas of dark matter, cosmic rays, and neutrino astrophysics. He is a Senior Scientist at Fermi National Accelerator Laboratory and an Associate Professor of Astronomy and Astrophysics at the University of Chicago. (probably not a half-wit)

* https://astronomy.com/magazine/news/2020/05/is-the-big-bang-in-crisis
 
Using new distance measurements, astronomers have refined their calculation of the Hubble Constant, a number that describes how fast the universe is expanding at different distances from a specific point in space.

'Standard model' of cosmology called into question by new measurements
I notice the article also says galaxies are nearer than possible under BBT, using VLBI etc. My guess is that current estimates of expansion distances are flawed anyway. They assume expansion and then calculate distance. That's not science. Hopefully sooner rather than later a parallax measurement, maybe from a combination of horizon and pioneer with earth-based observations, will give us a true distance. And give BBT the boot.
 
It is not clear whether BBT needs to get the boot. Many of the details people have tried to tease out of the limited data seem like so much sheer speculation, and that picture is now clearly in a state of increasing flux due to the observational data, which is only going to increase. The more reliable constraints established from the telescopes should help resolve some things. Precise modelling of such complex phenomena, and they don't get more complex, surely has its limits (as we are "seeing").

But I think the basic theory is sound, if for no other reason than that the designation of Hubble's Law*, and the Hubble Flow, are based on direct observation. The expansion of the universe clearly appears to indicate a BB of some kind. How it all went down is the Big Question (BQ?). The timelines, and most particularly all of the things that came out of the BB, like matter (+ antimatter?), ER, dark matter, dark energy, etc., will likely need some revision(s). Nobody (should have) ever said this was going to be easy!

And keep an eye on the LHC data. There are some unique observations coming out of there that could tidy things up, or make it all even less certain.


Time will tell...............or not!


* https://en.wikipedia.org/wiki/Hubble's_law
 
Maybe we should name the whole thing "The Self-Standard Model" and let everyone define their own model. Seems to be going in that direction anyway.......

With this approach, in the absence of supporting data, we can all be right!
 
FYI, some here are hand waving against the new values for H0. ...
There is a real problem with measuring H0 using the CMB to support inflation and LCDM and then comparing with H0 values determined by other methods in astronomy today. The Hubble time is very different too, as I showed, even using the flat universe model. This is the real problem in cosmology when it comes to the different values reported for H0; it has been a problem since the days of Edwin Hubble.

Ah, I was intending to respond to this earlier but ran out of time. First, since I know you are interested in a putative "missing mass" problem of forming solar systems, a new work points to a reasonable physics - planets generally form within one million years, so observing older disks will see a deficit of material which has already gone into planets.

"Baby planets are born exceptionally fast, study suggests. Planets are forming around young stars far faster than scientists expected, arising in a cosmic eye blink of less than half a million years, according to a new study. That finding could inform models of planet formation and help resolve a problem plaguing astronomers since 2018, when data indicated that planetary nurseries contained far too little material to actually create planets."

"To find out how much material is available for planet formation, researchers have used the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile to weigh the disks around young stars between 1 million and 3 million years old. Past studies found that some lacked the mass to form even a single Jupiter-size world. The results suggested astronomers were either overlooking some hidden reservoir of material or they were looking too late in the planet-forging process, after growing protoplanets had already vacuumed up much of the material.

The answer, says Łukasz Tychoniec, a graduate student at Leiden Observatory and lead author of the new paper, is that “we need to look earlier instead of [looking] for missing mass.”

Along with his colleagues, Tychoniec used images from ALMA and the Very Large Array (VLA) in New Mexico to study 77 protostars ..."

[ https://www.sciencemag.org/news/2020/06/missing-mass-planet-formation-found-young-disks-gas-and-dust ]

The planet formation time is not many orders of magnitude away from the rapid formation times proposed to predict the characteristics observed for smaller outer system bodies in the new “hot start” scenario [ https://www.universetoday.com/14665...-slowly-freezing-solid-for-billions-of-years/ ].

Second, I'm not sure what you mean by "new values", since all the mentioned values are within the usual span, nor what you mean by "this has been a problem", since the expansion rate estimates converged in the 90s - earlier values differed by a factor of 2 - only to recently diverge in some observations.

See my comment on how the most precise measurements now see the low-z values in some measurements go away when adding data and types of observations.

To add to that, it has now been shown that the modern tension can equally well be a tension in the cosmic background spectrum temperature. And here again the tension goes away when measuring the temperature and/or relying on high-z observations
[ https://astrobites.org/2020/06/27/h0-or-t0-tension/ ].

As with the temperature, the inferred expansion rate is connected to how you make these observations.
 
OOOPS !!!

(Looks like all that certainty just went down the cosmic drain.)

Statement context is missing. But yes, the "called into question" is neither erroneous nor very supportable from the work in the article.

So with data from "direct observations", the Hubble Constant is coming in around 70-74ish.

The paper I pointed to shows that the direct observation from the cosmic background comes in at 67 km s^-1 Mpc^-1, so the span is the full 67-75 (or more) tension. The precision data come in at 69-70 km s^-1 Mpc^-1, meaning that the problem lies in some observation sets (no new physics needed).

It's all very well to suggest extraordinary physics, but such claims need extraordinary evidence. Here we don't even see any robust evidence for anything new...


Hooper is a particle physicist who likes to suggest all sorts of particles (say, the eponymous hooperon), suggestions which regularly fail. The clue is in the title: "some astronomers" [likely meaning cosmologists]. It's mostly a rehash of the usual complaints of lack of progress (despite tremendous progress!).

Going from a recent tension, which looks like observational problems (but could be something else) to "crisis" is the Hooper type of hope.

It is not clear whether BBT needs to get the boot. Much of the details people have tried to tease out of the limited data seems like so much sheer speculation.

Expansion is not speculation. And the full model is precise to 1 %, which is far better than the tens-of-percent imprecision of volume chemistry models, say (seeing your handle).

Yet chemistry works.

[I merged several comments; I hadn't realized the thread had evolved into sort of trolling the consensus-accepted cosmology.]
 
I notice the article also says galaxies are nearer than possible under BBT, using VLBI etc. My guess is that current estimates of expansion distances are flawed anyway. They assume expansion and then calculate distance. That's not science. Hopefully sooner rather than later a parallax measurement, maybe from a combination of horizon and pioneer with earth-based observations, will give us a true distance. And give BBT the boot.

Redshift shows the universe is expanding. Scientists have tried to poke holes in big bang expansion for over a century - it can't be done.
 
I suggest that Hubble's (with or without Lemaitre) Law is NOT a law in the scientific sense and should be scrapped.

There seems little doubt you will be vilified for such blasphemy. Do they still burn people at the stake?

On what scientific principle(s) do you base this claim? The law is based on very sound empirical evidence.

(Can't say as I agree with this scrapping of a physical law, but it is a burnable offense to some people.)
 