No, the Big Bang theory is not 'broken.' Here's how we know.

The first point to note is that redshift in General Relativity (GR) is in fact model dependent, because there are multiple ways in which light can become redshifted. The rate of expansion of space, the curvature of space, and even variations in diffuse matter densities all have an impact on how light becomes redshifted, and these effects are not trivial.
Very nice post! If you don't mind, I would like to offer some viewpoints of my own that are more historical than otherwise:

I was surprised to learn that one of the first solutions to GR (de Sitter's) was a simplified model with no matter in the universe, showing redshift in a static universe, which was the mainstream model at the time; though given the lack of knowledge of other galaxies, that was understandable.

Since Hubble and de Sitter worked closely together at one or more conferences, I suspect that this early, but rejected, model affected Hubble, since he avoided ever claiming that redshift was due to the expansion of spacetime. He said he would leave theoretical views to the theorists. [I've never read an author connecting these dots, though many authors seem convinced Hubble discovered expansion. :)]

It seems that Doppler explained redshift well enough as the early assumption, starting with Lemaitre, though his model had space carrying the "extragalactic nebulae", as Hubble always called them. But Doppler fails when redshifts imply speeds > c, hence the term "cosmological redshift". Doppler still applies to their peculiar motions, of course.
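For readers following along, the two formulas being contrasted here (standard textbook forms, added for reference) are the special-relativistic Doppler shift, which can never correspond to a source speed above c, and the cosmological redshift, which simply compares scale factors and has no such ceiling:

\[ 1 + z_{\text{Doppler}} = \sqrt{\frac{1+\beta}{1-\beta}}, \qquad \beta = \frac{v}{c} < 1 \]

\[ 1 + z_{\text{cosmo}} = \frac{a(t_{\text{obs}})}{a(t_{\text{emit}})} \]

Since z grows without bound in the second formula, the naive recession "speed" inferred from Hubble's law can exceed c without contradiction, which is exactly why the separate term "cosmological redshift" was needed.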

GR is tricky because, without prior assumptions to simplify the mathematics, the Einstein field equations, like pretty much every other known system of partial differential equations, are prone to natural, irreducible chaotic behavior, in this case affecting the evolution of the metric. This is because differential equations by definition must have a single unique solution for each and every possible valid set of initial conditions.
Yes, and this is why I think it best to explain the BBT to the general public without starting from any initial condition, though many like to add sizzle to their articles by claiming it was a singularity, which is likely not even hypothetical given the inability to test those predictions, I think.

Thus, it seems wiser to me to explain the BBT beginning with Lemaitre's work, which took Slipher's redshifts and Hubble's distances and introduced the original BBT model (1927). IOW, start today and work backwards, which is what science has been doing ever since. Would you agree with this approach?

There are a lot of implications this has in every area of math and science but the first and foremost implication is that this means the true Einstein field equations can only ever be solved numerically.
What does "numerically" mean here? Iterations?

... for any initially expanding nontrivial universe that no maximal spatial volume can ever exist...
What do you mean by a "maximal spatial volume"? I would infer this to be an infinite universe, and infinities can be very problematic, which, ironically, is true for the universe at time's start - a singularity.

Thus, if light geodesics carrying information on these initial conditions pass into a region which has a higher density (overdensity) of stuff, we should expect that region of space to be expanding more slowly; that edge doesn't expand outwards as far as a geodesic passing through an underdensity (void), since the effects of expansion locally depend on the relative rate at which intervals of time pass. I.e., the faster time passes, the faster space expands; thus in a region with underdensities we get a feedback effect where space expands at a faster and faster rate, so the distance between points there keeps growing.
But doesn't the 1/100,000 level of anisotropy minimize the problem?

In essence, the crucial insight is that for information to be conserved we need the metric to carry information, most naturally in the form of echoes of the local past metric imprinting into the local change-in-volume tensor. Thus GR can only be internally self-consistent if information is conserved, and in an expanding universe this can only be satisfied if information is stored within the local variations in the rate of change of the metric itself.
Are you suggesting some sort of critical information is important at t=0?

In fact, thanks to the existence of the CMB dipole, we have even managed to perform a falsification test on the cosmological principle itself, since the principle predicts that the only kind of dipole which can exist in the sky is a purely local kinematic dipole associated with an observer's frame of reference. Thus, as pointed out by Ellis & Baldwin in 1984, this requires any dipole constructed from cosmologically distant sources to be identical in both magnitude and direction to the dipole in the CMB; if the two are not the same, then there is a cosmological component to the dipole (i.e., one due to the large-scale structure and distribution of matter and energy within the universe).
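For reference, the often-quoted Ellis & Baldwin (1984) prediction for the kinematic dipole in source counts, for a population with integral counts N(>S) ∝ S^{-x} and spectra S ∝ ν^{-α}, is

\[ \mathcal{D}_{\text{kin}} = \left[\, 2 + x(1+\alpha) \,\right] \frac{v}{c} . \]

Plugging in the CMB-inferred speed v ≈ 370 km/s and round illustrative values x ≈ 1, α ≈ 0.75 (my assumed numbers, not the exact survey values) gives a dipole amplitude of roughly 0.5%.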
I've been puzzled by how the dipole is viewed. If we have a good idea of the rate of the Hubble flow, and we know our speed through it, do the red and blue shifts of the dipole match, or are they off?

This was first rigorously tested by Nathan Secrest et al. (2021) using a sample of 1.36 million high-redshift quasars from CatWISE, and the results are in staggering disagreement: the dipole amplitude is more than twice the expected magnitude, giving a 4.9 sigma discrepancy from the CMB dipole.
Ok, that's a quicker answer than I expected. :) [I'll leave the above to help others understand what you're saying. ]

This is helpful, as it seems to have something to do with the multiverse view presented in Laura Mersini-Houghton's book, "Before the Big Bang", where overlaying quantum physics atop the landscape of string theory produces a somewhat finite number of universes (10^600?), and she says there are six tests for this theory, one being related to the dipole "void", I think she called it. I think she mentioned another test, but not all six, which was curious. [Perhaps I missed them, admittedly.]

Because of your ability to explain this complex field, it would be great if you could explain what science means by things like the "axis of evil", etc. I think I read about a quadrupole issue as well, but many here, like me, are far more familiar with tadpoles. ;) Few here have PhDs in anything, but many have BS degrees.

Worse, however, independent follow-up work testing these results has only raised the discrepancy with the CMB dipole, to 5.7 sigma. In this context, even without considering the many other lines of evidence challenging the standard model of cosmology (the Hubble tension, the axis of evil, many gigaparsec-scale large structures well beyond the size limit of structure formation in Lambda CDM, the mathematical and logical arguments presented in simplified form above, etc.), there is now overwhelming evidence to call the standard model of cosmology into serious question.
There certainly seems to be, at best, issues to resolve.

But given that Einstein, and perhaps every early cosmologist, assumed homogeneity, does this surprise you much? Quantum quirks were not well developed in the early decades, dark matter was only a suspicion raised by Zwicky, and DE was a total surprise.

In fact, dropping the cosmological constant automatically resolves all these tensions. After all, the axis of evil problem was due to the sheer improbability of a presumed kinematic dipole aligning with the higher multipoles, which themselves, in a universe where matter is relatively homogeneous and isotropic, should all be random in their alignments rather than all aligned along the same axis in all measurements. All the measurement tensions vanish when you bring in the enormous (dominant) systematic error, which effectively swamps any measurement claims by orders of magnitude at cosmological distances.
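To put a rough number on that "sheer improbability", here is a minimal Monte Carlo sketch of my own (an illustration, not taken from any of the papers discussed) estimating how often three independently random axes on the sky would all mutually align within 20 degrees:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_axis():
    """A random axis: a unit vector uniform on the sphere (sign irrelevant)."""
    v = rng.normal(size=3)
    return v / np.linalg.norm(v)

def axis_separation_deg(a, b):
    """Angle between two headless axes, in degrees (range 0-90)."""
    cos = abs(float(np.clip(np.dot(a, b), -1.0, 1.0)))
    return np.degrees(np.arccos(cos))

trials, tol_deg, hits = 200_000, 20.0, 0
for _ in range(trials):
    a, b, c = random_axis(), random_axis(), random_axis()
    if (axis_separation_deg(a, b) < tol_deg and
        axis_separation_deg(a, c) < tol_deg and
        axis_separation_deg(b, c) < tol_deg):
        hits += 1

# Typically a fraction of order 10^-3 -- already unlikely for one trio by
# chance, and that is before also demanding alignment with the CMB dipole.
print(f"fraction of trios aligned within {tol_deg} deg: {hits / trials:.1e}")
```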
So, what explains the redshift finding as it relates to distance?
 
I'm not sure why you think that revising their ΛCDM model to account for the fact it failed to correctly predict the subsequent JWST observations,
You may want to re-read the article. I see no evidence here of JWST falsifying the mainstream model. In fact, if I'm right, as stated in the prior post, that big galaxy looks pretty small, since Pop III stars have been deemed to be extremely massive in order to burn without the benefit of metals.
 
Helio in post #77 said, "In fact, if I'm right, as stated in the prior post, that big galaxy looks pretty small, since Pop III stars have been deemed to be extremely massive in order to burn without the benefit of metals."

Population III stars in the *early universe* are always an intriguing topic :)

Detection of Phosphorus, Sulphur, and Zinc in the Carbon-enhanced Metal-poor Star BD+44 493, https://ui.adsabs.harvard.edu/abs/2016ApJ...824L..19R/abstract, June 2016. "The carbon-enhanced metal-poor star BD+44°493 ([Fe/H] = -3.9) has been proposed as a candidate second-generation star enriched by metals from a single Pop III star." My observation: astronomers measuring BD+44 493 consider this a second-generation Milky Way star, descended from a Population III star. We have not seen Population III stars yet, so the explanation assumes they existed shortly after the Big Bang.

Hubble makes surprising find in the early universe, https://phys.org/news/2020-06-hubble-early-universe.html, June 2020. “New results from the NASA/ESA Hubble Space Telescope suggest the formation of the first stars and galaxies in the early Universe took place sooner than previously thought. A European team of astronomers have found no evidence of the first generation of stars, known as Population III stars, as far back as when the Universe was just 500 million years old. The exploration of the very first galaxies remains a significant challenge in modern astronomy. We do not know when or how the first stars and galaxies in the Universe formed.”

My note: space.com reported on this, 08-Jun-2020, 'The 1st stars in the universe formed earlier than thought', https://forums.space.com/threads/the-1st-stars-in-the-universe-formed-earlier-than-thought.31844/

It is fun to use stars that, at present, no one has seen through a telescope to explain problems in BB cosmology :)
 
Yes, and this is why I think it best to explain the BBT to the general public without starting from any initial condition, though many like to add sizzle to their articles by claiming it was a singularity, which is likely not even hypothetical given the inability to test those predictions, I think.

Thus, it seems wiser to me to explain the BBT beginning with Lemaitre's work, which took Slipher's redshifts and Hubble's distances and introduced the original BBT model (1927). IOW, start today and work backwards, which is what science has been doing ever since. Would you agree with this approach?

Well, yes, though it is notably trickier to do than it looks, since space in the nonlinear limit doesn't behave so simply. In particular, the rate of expansion in general becomes a higher-rank tensor without the simplifying assumptions; or rather, the rate of expansion adds a self-feedback effect into the metric itself, one which is directionally dependent in space and time in a way that is probably best described as an echo of the past light cone in a given region.

What does "numerically" mean here? Iterations?
Correct. Solving numerically means that you must iteratively solve the equations for every case separately.
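To make "iteratively" concrete, here is a toy sketch of my own (deliberately using the simplified flat FLRW metric being criticized above, only because it keeps the numerics short): a fixed-step Runge-Kutta integration of the first Friedmann equation, stepping the scale factor forward one time slice at a time.

```python
import numpy as np

# Assumed round cosmological parameters, roughly Planck-like.
H0 = 70.0 / 978.0   # Hubble constant: 70 km/s/Mpc in 1/Gyr (1 km/s/Mpc ~ 1/978 Gyr^-1)
Om, OL = 0.3, 0.7   # matter and Lambda density fractions in a flat universe

def a_dot(a):
    """First Friedmann equation for a flat universe: da/dt = a * H(a)."""
    return a * H0 * np.sqrt(Om / a**3 + OL)

a, t, dt = 1e-3, 0.0, 1e-3   # start from a tiny scale factor; dt in Gyr
while a < 1.0:               # iterate slice by slice until "today" (a = 1)
    k1 = a_dot(a)
    k2 = a_dot(a + 0.5 * dt * k1)
    k3 = a_dot(a + 0.5 * dt * k2)
    k4 = a_dot(a + dt * k3)
    a += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    t += dt

print(f"age of this toy universe: {t:.2f} Gyr")  # ~13.5 Gyr for these parameters
```

Full numerical relativity replaces this single ODE with the coupled Einstein field equations on a 3D grid, but the step-forward-one-slice-at-a-time structure is the same.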

What do you mean by a "maximal spatial volume"? I would infer this to be an infinite universe, and infinities can be very problematic, which, ironically, is true for the universe at time's start - a singularity.
Yeah, this one is a bit of a doozy. Yes, the limit in which the theorem was proved is the infinite spatial limit; the key thing to remember is that infinity (which is not a number) has some frankly weird properties, in that some infinities can be bigger or smaller than others. In this case what you are showing is that the rate of change in volume between two time slices always has the same sign (direction) as the same differential between those time slices. In essence, from the perspective of the light cone, you can show this is the second law of thermodynamics.

But doesn't the 1/100,000 level of anisotropy minimize the problem?
Unfortunately no, because the problem has to do with how the equations behave under perturbation from a given equilibrium (solution). In essence there are two types of equilibrium possible (technically three, though the third isn't relevant here). A stable equilibrium sits at the bottom of a valley-like concave depression if plotted on a graph, while the opposite, an unstable equilibrium, is the flipped converse: the peak of a hill instead.
For a stable equilibrium, under a small deviation the function will relax back to the equilibrium; thus you can mathematically apply perturbation theory to approximate the function's behavior to reasonable accuracy around that point by treating it as a small deviation from that known solution. This is what cosmologists have attempted to do, assuming that the Friedmann-Lemaitre-Robertson-Walker metric is such a stable solution.

However, what has been shown instead is that the Friedmann-Lemaitre-Robertson-Walker metric is actually the other type of equilibrium, an unstable one: under perturbation it will just continue to fall away from the equilibrium point forever, as the slope has been shown mathematically to fall forever to both sides with no other equilibria existing. As an analogy, think of a small ball perfectly positioned on top of an infinitely tall cone. This can in principle remain there so long as things are perfectly balanced, but create even the tiniest, most infinitesimal deviation and it will fall forever with no possibility of slowing or coming to a stop. That is what the equilibrium point in question actually is, mathematically speaking, so it doesn't matter how close to perfect isotropy the initial conditions were: the solution can only ever evolve towards larger and larger anisotropy either forward (or backward) in time from that equilibrium point.
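The ball-on-a-cone picture translates into a tiny numerical experiment (my own toy illustration, not the actual FLRW perturbation equations): give the stable system x' = -x and the unstable system x' = +x the same infinitesimal nudge and watch what happens.

```python
import math

dt, steps = 0.01, 1000          # integrate out to t = 10
eps = 1e-12                     # the "tiniest most infinitesimal deviation"

x_stable, x_unstable = eps, eps
for _ in range(steps):
    x_stable += dt * (-x_stable)      # stable: pulled back toward equilibrium at 0
    x_unstable += dt * (+x_unstable)  # unstable: pushed away, growing exponentially

print(f"stable equilibrium,   t=10: {x_stable:.3e}")    # ~ eps * e^-10 (decayed)
print(f"unstable equilibrium, t=10: {x_unstable:.3e}")  # ~ eps * e^+10 (blown up)
print(f"for reference, e^10 = {math.exp(10):.0f}")
```

No matter how small eps is made, the unstable case only delays the blow-up; it never avoids it, which is the point being made about the FLRW solution.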

Are you suggesting some sort of critical information is important at t=0?
In essence, kind of? But it's not really anything special about t=0, only the continuity and internal consistency of the solutions over spacetime for any possible choice of initial conditions, which in the context of Noether's theorem corresponds to the conservation of information; or, more specifically, the conservation of the information which keeps that solution unique among all other possible choices of initial conditions. In this sense the time interval we choose to start from is arbitrary. What matters is that any other solution within the domain of all possible solutions will still be unique, because differential equations by definition must have a unique solution for all possible choices of initial conditions (which also ensures uniqueness for all possible time slices forward or backward in time, or any equivalent transformation in space).
It's really that internal consistency and uniqueness which matter here, not the choice of time. The choice of t=0 is a convention in mathematics, made because it is the most mathematically convenient; you could just as easily (and from a cosmological perspective would probably prefer to) choose lookback time for your initial conditions, but mathematically, as long as you are consistent, they should give the same answers.
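The "any time slice works as initial conditions" point is a generic property of well-posed ODEs, and a throwaway numerical example of my own (nothing cosmological about it) shows the idea: integrate forward, then treat the late-time state as fresh initial conditions and integrate backward, and you land back on the same unique solution.

```python
import math

def f(x, t):
    """An arbitrary toy ODE: dx/dt = -0.5*x + sin(t)."""
    return -0.5 * x + math.sin(t)

def rk4_step(x, t, dt):
    """One classical Runge-Kutta step; works for positive or negative dt."""
    k1 = f(x, t)
    k2 = f(x + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = f(x + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = f(x + dt * k3, t + dt)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

dt, n = 0.01, 500
x, t = 1.0, 0.0                  # "initial conditions" chosen at t = 0
for _ in range(n):               # forward to t = 5
    x = rk4_step(x, t, dt)
    t += dt

for _ in range(n):               # now treat the t = 5 slice as the start, run backward
    x = rk4_step(x, t, -dt)
    t -= dt

# Up to truncation error we recover the original state: the same unique solution.
print(f"state at t=0 after the round trip: {x:.10f} (started from 1.0)")
```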

I've been puzzled by how the dipole is viewed. If we have a good idea of the rate of the Hubble flow, and we know our speed through it, do the red and blue shifts of the dipole match, or are they off?

Ok, that's a quicker answer than I expected. :) [I'll leave the above to help others understand what you're saying. ]
Glad I could help answer your questions in a timely manner. Yeah, this paper is one which I think may ultimately be as important at challenging either determinism or locality as the Bell tests are in quantum mechanics.

This is helpful, as it seems to have something to do with the multiverse view presented in Laura Mersini-Houghton's book, "Before the Big Bang", where overlaying quantum physics atop the landscape of string theory produces a somewhat finite number of universes (10^600?), and she says there are six tests for this theory, one being related to the dipole "void", I think she called it. I think she mentioned another test, but not all six, which was curious. [Perhaps I missed them, admittedly.]

Um, yeah... I generally try to avoid the whole multiverse argument with as close to an infinitely long pole as possible, so I don't have much to say here beyond being very skeptical of string theory as a whole, given the lack of testable predictions or clear theoretical arguments justifying the model beyond mathematical beauty.

Because of your ability to explain this complex field, it would be great if you could explain what science means by things like the "axis of evil", etc. I think I read about a quadrupole issue as well, but many here, like me, are far more familiar with tadpoles. ;) Few here have PhDs in anything, but many have BS degrees.

Sorry for the jargon. The "axis of evil" has to do with the alignment of the dipole with the higher CMB multipoles under a series decomposition. Effectively, the naive expectation under their model would be that these various multipoles should be random, mostly linked to more local clustering effects and our local frame of reference; instead, the dipole, quadrupole, and octupole moments are all closely aligned, which has near-zero probability of occurring unless each has a significant nonzero cosmological component.
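For anyone curious about the jargon: the decomposition in question is the standard spherical-harmonic expansion of the CMB temperature map,

\[ \frac{\Delta T}{T}(\theta, \phi) = \sum_{\ell=1}^{\infty} \sum_{m=-\ell}^{\ell} a_{\ell m}\, Y_{\ell m}(\theta, \phi), \]

where the ℓ = 1 term is the dipole, ℓ = 2 the quadrupole, and ℓ = 3 the octupole. The "axis of evil" is the observation that the preferred axes of these low-ℓ multipoles line up with one another far more closely than independent random axes should.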

There certainly seems to be, at best, issues to resolve.

But given that Einstein, and perhaps every early cosmologist, assumed homogeneity, does this surprise you much? Quantum quirks were not well developed in the early decades, dark matter was only a suspicion raised by Zwicky, and DE was a total surprise.

Yep, that is how science (ideally) works: we don't start out knowing the unknown unknowns; we have to discover them, learning through informed experimental/observational trial and error and iteration of hypotheses. Concern about a field should only really arise when there is evidence that different ideas and models are not getting a fair and equal test. The best (albeit still limited) way we have to assess this kind of field bias is to look at the distribution of positive and null papers for an experimental test, since statistically you should expect a particular distribution of positive, inconclusive, and null results even if the model in question is correct; if the distribution is significantly skewed one way or the other, that is a red flag that some kind of publication bias is at play. The same check works for incorrect models too: for example, some fields and journals have a disproportionately high number of unexpected novel results. It's also important to keep careful track of assumptions, especially implicit ones. For example, if you average measurements over the sky to increase your effective sample size, you are implicitly treating all directions as equivalent, i.e., assuming the cosmological principle, unless you allow for model variation.

So, what explains the redshift finding as it relates to distance?
This is a tricky question, because there are multiple factors to account for in a given model. You will want to account for identified cosmological and kinematic redshifts and blueshifts, what lies along the line of sight, whether the source is deep in a gravity well, whether it is dusty/obscured, etc. There are lots of factors to keep track of.
 

Pax

Per Space.com, the Big Bang theory is the leading explanation of how the universe began. At its simplest, it says the universe as we know it started with an infinitely hot, infinitely dense singularity, then inflated — first at unimaginable speed, and then at a more measurable rate — over the next 13.8 billion years into the cosmos that we know today.

My question: Where did the infinitely hot, dense singularity come from? What was there before it? It seems Hawking had some questions in his last thoughts, per Stephen Hawking's Final Theory About The Big Bang (scitechdaily.com). How many other universes will we find when we get a better space telescope and find the edge of ours? Infinity IS hard to think about!

It is common to use terms like "infinitely hot, infinitely dense singularity, then inflated — first at unimaginable speed," but that is sort of a cop-out.

What exactly is "infinitely hot"? Based on our understanding of "hot", it involves energy - so are we saying there was infinite energy? If energy and matter are two sides of the same coin (E = mc^2), then was there infinite mass?

What exactly is "infinitely dense"? If you remove all the empty space between matter, it gets much denser, but that is still finite. In fact, you can't describe anything that has volume if you invoke infinite density.

Of course, "unimaginable speed" actually denotes a lack of imagination more than a physical property. As is often noted, "this is the period when our science breaks down," but actually it is our math that breaks down, with things like division by zero or infinity. This also invokes some other questions: Can you have unimaginable speed without time? What are the time and/or distance associated with unimaginable speed?

I recognize that these terms are the result of extrapolating the math and logic of the BBT, but in a science in which we can compute from Big Bang nucleosynthesis the baryon/photon number ratio as a small number of order 6 × 10^-10, it seems inconsistent to invoke descriptive words like "infinity" and "unimaginable" for something so critical to our origin story.
Pax
 

Catastrophe

"Science begets knowledge, opinion ignorance.
Helio,

and this is why I think it best to explain the BBT to the general public without starting from any initial condition, though many like to add sizzle to their articles by claiming it was a singularity, which is likely not even hypothetical given the inability to test those predictions, I think

Thank you, Helio. You have made my day, or, ;) maybe an infinitely long extended existence ;)

Cat :) :) :)
 
Pax, the BB theorists sort of cop out on the things you speak of by hiding behind the Heisenberg uncertainty principle: they say that their theory, which extrapolates everything back toward a single point, doesn't get all the way to that point, and has to stop explaining anything when the extrapolation gets back to the Planck size and Planck time. So they don't have to explain where anything comes from or what it did at earlier times when it was smaller.

But that doesn't really address some of the things you are asking about. The main issue is how something that dense could expand against its own gravity, and how time could pass given the time dilation calculated according to general relativity theory.

The theorized answer is "inflation", which is just postulated to do whatever is necessary to make the theory work, so that the extrapolation backwards to that hyper-dense, hyper-hot condition is somehow able to have gone "forward" in time from then to become what we observe now.

There are so many holes in the BBT that are plugged with such unexplained assumptions that I find it very unconvincing. I suspect that we are just not thinking clearly about how things might actually work if we abandon the "principle" that all of space is uniformly like it is here and has always changed uniformly in the past.

Recent, better observations of the distribution of things in the universe are starting to challenge this "cosmological principle": that everything is, always has been, and always will be uniform enough to justify the simplifying assumptions everybody has made to try to solve Einstein's field equations of general relativity.
 

Pax

<<. . . .stepping back onto his soapbox. . . . .>>

. . . . .And another thing . . . . .
The idea of a singularity as the origin of the BBT: let's look at this in light of other aspects of cosmology. Specifically:

The Cosmological Principle is derived from the Copernican principle and it implies that the entire Universe is isotropic and homogeneous. Isotropy means the Universe looks the same to all observers and the Universe looks the same in all directions as viewed by a particular observer. Homogeneous means that the average density of matter is about the same in all places in the Universe and the Universe is fairly smooth on large scales.

OK, this is actually an assumption, but it is necessary in order to do much of anything else in cosmology. If it is not accepted, then most of the rest of what we think we know has to be discarded.

That there is no "point of origin" of the BB is also an essential element of our BB theory. This is the basis of the Cosmological Principle and is used to blunt any discussion of the direction of the BB from us. This blends into the idea that the BB was an "expansion" and not an explosion.

However, how does that allow for the idea that it all started from a singularity - usually described as being very, very small?

If it was small once and got bigger, then it got bigger from that small point. That is obviously circular logic, and it is also stating the obvious. In other words, the singularity WAS the point of origin, and the BB began or happened at wherever that singularity was at the moment of the BB.

Repeating that there was no origin does not make it so if you also contend that there was a singularity at the beginning.

If, as is often said, the BB "happened everywhere", then it can't have been the result of a small singularity.

If, as is often said, the BB originated from a singularity, then it had to have a point of origin - which defies the Cosmological Principle.

Which is it?

You can see that it is here that we get really confused. Existence means that something was/is there. We can say the sun "exists".

Our model says that this singularity "existed" and we must and do accept that as being intuitively true. Existence must also have a place where it exists. To say it did not have a place is to say it did not exist but we have accepted that it did exist.

On the other hand, if we say that it existed everywhere, then how could it have been a singularity?

We are left with the conclusion that the BB began from a singularity and that it had to occur at some specific location - which defies the Cosmological Principle - and then we have to ignore that dichotomy.

Repeating that there was no origin "place" does not make it so if you also contend that there was a singularity at the beginning.

It may well be that this is really an argument of semantics. It may be that we are just not using words precise enough to distinguish differences in existence, direction, expansion, singularity, and others as they pertain to the different aspects of this issue. What is important is to acknowledge that making simplistic statements does not resolve such a complex event.

<<. . . .and thus, he steps down from his soapbox. . . . .>>
Pax, the BB theorists sort of cop out on the things you speak of by hiding behind the Heisenberg uncertainty principle: they say that their theory, which extrapolates everything back toward a single point, doesn't get all the way to that point, and has to stop explaining anything when the extrapolation gets back to the Planck size and Planck time. So they don't have to explain where anything comes from or what it did at earlier times when it was smaller.

But that doesn't really address some of the things you are asking about. The main issue is how something that dense could expand against its own gravity, and how time could pass given the time dilation calculated according to general relativity theory.

The theorized answer is "inflation", which is just postulated to do whatever is necessary to make the theory work, so that the extrapolation backwards to that hyper-dense, hyper-hot condition is somehow able to have gone "forward" in time from then to become what we observe now.

There are so many holes in the BBT that are plugged with such unexplained assumptions that I find it very unconvincing. I suspect that we are just not thinking clearly about how things might actually work if we abandon the "principle" that all of space is uniformly like it is here and has always changed uniformly in the past.

Recent, better observations of the distribution of things in the universe are starting to challenge this "cosmological principle": that everything is, always has been, and always will be uniform enough to justify the simplifying assumptions everybody has made to try to solve Einstein's field equations of general relativity.

I agree. The BB is just an extrapolation of a series of assumptions based on a guess at what happened. It begins with the idea that we can reverse current conditions over a period of 13.7 billion years, and over the associated distance, in a linear fashion - that is, we assume totally consistent compliance with all the currently known laws of physics, movements of bodies, and our standard model over all that time and distance.

To accomplish that, the BBT uses 26 "dimensionless constants" in the math. These are unitless numbers, essentially fudge factors, inserted into the math to make it work or to approximate observations. They are ALL developed out of other assumptions.

In other words, the BB theory is the best we have right now, but it is a house of cards with numerous inconsistencies and anomalies that point to the distinct possibility that we got it all wrong or, at a minimum, don't have it all right.

This becomes increasingly apparent as we improve our ability to look more closely. We can see black holes and have taken pictures of their emissions. There are really large planets much closer than we thought. Really young galaxies (redshift > 11) are much larger than we thought. Black holes can be in very young galaxies. The star formation rate is much faster than we thought. And this is just in the past 2 years.

We are quickly becoming more aware of what we don't know. :oops: Even what we thought was known sometimes turns out to be wrong.

What we know about what is unknowable is becoming more known all the time.

We will definitely know more in the future, but we first have to admit to ourselves that what we believe we understood of what we think we have discovered may not be what we think it is and, in fact, there might be a lot more that we don't understand than we know. But, of course, you knew that. :)
 
As an engineer with a lot of fluid dynamics background, plus a lot of interest in climate and weather, I tend to think in terms of nonuniform fields of flows, pressures, condensations, etc.

So, when I look at the solutions of the general relativity field equations being "simplified" based on the assumption that the whole universe is essentially uniform, always has been, and always will be, I tend to wonder what kinds of cosmological processes might really occur that are nonuniform. Could the expansion we see be only part of a much vaster dynamic that has compression elsewhere? Can space itself (whatever all those "fields" it is composed of) do more than just "bend"? Can it actually "flow" in some way?

If you start the computerized solutions of the Field Equations with nonuniform boundary conditions, what sorts of dynamics of "space" can be modeled?
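Not GR, of course, but as a flavor of what "start the numerics with nonuniform conditions and see what the field does" looks like, here is a minimal finite-difference sketch of my own for a 1D wave equation seeded with a lumpy, nonuniform initial field (a stand-in illustration only):

```python
import numpy as np

# 1D wave equation u_tt = u_xx (wave speed 1) on a periodic grid,
# evolved with a standard leapfrog finite-difference scheme.
n = 200
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
dt = 0.5 * dx                    # CFL factor 0.5 keeps the scheme stable

u = np.exp(-20.0 * (x - np.pi) ** 2) + 0.3 * np.sin(3.0 * x)  # nonuniform initial field
u_prev = u.copy()                # start from rest (zero initial velocity)

for _ in range(500):
    lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
    u_next = 2.0 * u - u_prev + dt**2 * lap
    u_prev, u = u, u_next

print(f"field range after evolution: [{u.min():.3f}, {u.max():.3f}]")
```

Numerical relativity codes do the same kind of grid evolution, except the "field" is the metric itself and the update rule comes from the full Einstein equations.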
 
We will definitely know more in the future, but we first have to admit to ourselves that what we believe we understood of what we think we have discovered may not be what we think it is and, in fact, there might be a lot more that we don't understand than we know. But, of course, you knew that. :)
Indeed. The important point in addressing the extrapolations, as you correctly say, is for folks to understand the difference between objective-based arguments and subjective-based ones. The difference is science vs. philosophy, or at best, metaphysics.

Too often the boundary lines between them are blurred with word salads, because the dressing tastes so good -- singularities, multiverses, infinities, trillions of degrees, quantum uncertainties, etc. Such rhetoric often adds sizzle to articles, and so nicely that one would swear they're T-bone steaks right off the grill, but they are not even bologna. ;)
 
As an engineer with a lot of fluid dynamics background, plus a lot of interest in climate and weather, I tend to think in terms of nonuniform fields of flows, pressures, condensations, etc.

So, when I look at the solutions of the general relativity field equations being "simplified" based on the assumption that the whole universe is essentially uniform, always has been, and always will be, I tend to wonder what kinds of cosmological processes might really occur that are nonuniform. Could the expansion we see be only part of a much vaster dynamic that has compression elsewhere? Can space itself (whatever all those "fields" it is composed of) do more than just "bend"? Can it actually "flow" in some way?

If you start the computerized solutions of the Field Equations with nonuniform boundary conditions, what sorts of dynamics of "space" can be modeled?
Yeah, the paper by Matthew Kleban and Leonardo Senatore looked into the large-scale limit for an inhomogeneous and anisotropic universe, and frankly there are some remarkable and fascinating results in terms of irreducible nonlinearities which naturally emerge if one doesn't artificially constrain the metric via the "cosmological principle", results which cosmologists really need to take more account of.


Frankly, while the proof and some of the explored dynamics are fascinating, the paper is still held back by the authors only examining it in the context of an argument for inflation being inevitable, which looks quite weak compared to the real meat of the proof, which they failed to fully explore or tackle: the mathematical formulation of the very theorem they proved.
Ultimately, in the context of information theory, you can recognize that the theorem they call the "no big crunch theorem" is really just the second law of thermodynamics, in the sense of information from a particular point in spacetime propagating outwards.

Another point whose implications they seem to have missed exploring is showing that you always have irreducible and asymmetric off-diagonal metric tensor elements. That can serve as a framework for deeper mathematical insights if you recognize that it is mathematically analogous to the property which causes Fermi-Dirac statistics to exhibit the Pauli exclusion principle, which, in the context of information, gives a framework that looks promising for explaining their results theoretically via Noether's theorem, with information as the conserved quantity conjugate to a symmetry of logical consistency and causality. In this context the cosmological principle can be seen to be logically forbidden for any and all nontrivial metrics, because it necessarily violates information conservation and causality by "forgetting" inconvenient information which makes the math harder. After all, by showing the anisotropies are not negligible, it equivalently shows the Friedmann–Lemaître–Robertson–Walker metric solution is an unstable equilibrium, and thus any possible solution will always diverge from it under any deviation from the equilibrium solution.

The biggest missed significance of this, IMO, is however that you are naturally coupling local parameters to universal parameters, which they were so close to noting but missed, as they were thinking in the context of absolute reference frames (a problematic concept in GR). If you reframe time slices as the local observer frame of reference corresponding to the cosmological horizon of any such observer, you can link the properties of all such frames of reference to a universal constraint, which in this case is the total entropy of space, proportionally linked to the total spatial volume in any such time slice from any frame of reference. Anyway, this means that GR in an expanding universe becomes a nonlocal hidden-variable theory, which means you can reproduce Bell-inequality observations without breaking causality or internal consistency. Given the above insights, it seems natural to hypothesize that the metric is really the sum of such coupled pairings, most likely represented in some variation on ER=EPR built from causal self-consistency symmetry.

If this is the case, one possible testable hypothesis is that the off-diagonal contributions should become significant again at small distances, resulting in a repulsive effect counterbalancing the attractive pull of gravity and thus weakening its magnitude. This may therefore weaken the escape velocity in the extreme relativistic limits of dense compact objects, conceivably reducing if not preventing the formation of an event horizon, as the escape velocity might instead only asymptotically approach the speed of light as the mass per unit volume becomes large. Ergo, you might then expect there to be no hard transition between neutron star and black hole; rather, it would be a gradual, approximate transition occurring without the formation of a true causal discontinuity.

Thus this predicts both that micro black holes are more unstable, or impossible, and that massive compact objects above the Tolman–Oppenheimer–Volkoff limit should exist. (The latter may be a perfect candidate for explaining some of the peculiarities of many FRBs, IMO.)

Also, thinking about it, since the masses of known exotic neutron stars like magnetars are poorly constrained, I wonder if this might help explain some of these phenomena, as we only see the evidence of their violent outbursts.

If I had access to an appropriate supercomputer to test it, I have a strong suspicion this could even be used to naturally derive quantization of metric interactions, and at the very least a limit-based argument appears to suggest that this should result in a non-negligible, nonlocal gravitational component in the diagonal elements in addition to the off-diagonal components. If this is the case, then the limit argument suggests we should naturally expect a transition to a 1/r^3 relationship like in Modified Newtonian Dynamics (MOND). Given that MOND is more or less an explanation supposed to fit experimental observations (i.e., we observe MOND-like behavior out in the universe but we don't know why), this might help fix that. And since the nonlinear, inhomogeneous, and anisotropic solutions to the Einstein field equations have already been shown to be able to trivially explain "dark energy", this would allow you to reduce cosmological models to purely "normal" matter, with the only remaining independent variables being the initial distribution of this matter/energy and the total size of the actual universe.
 
(...) for folks to understand the difference between objective-based arguments and subjective-based ones. The difference is science vs. philosophy, or at best, metaphysics.

A laudable goal. Remedial instruction in elementary logic would also help.

If A is false, then B must be true.

A number of forum participants run with this while never even bothering to show why A should be relevant or how A and B are connected.

It goes like this. Alice says, I see no evidence for extraterrestrials piloting flying saucers in our skies.

Bob says, Ah-ha! You believe we are alone in the universe! Such arrogance and such a parochial outlook! Because you are wrong, the opposite must be true and ETs on Earth must be real.

The illogic hurts. Apart from the obvious strawman (Alice never said what Bob claims she said), even if she had said that we are alone in the universe (a bold and unsupported statement), it would not make the presence of ETs here one iota more likely, as the two statements are not connected in any way that would allow for drawing an inference.
 
It goes like this. Alice says, I see no evidence for extraterrestrials piloting flying saucers in our skies.

Bob says, Ah-ha! You believe we are alone in the universe! Such arrogance and such a parochial outlook! Because you are wrong, the opposite must be true and ETs on Earth must be real.
Yes, this is all too common. Yep, Alice is now an "ET denier". It is a strong ad hominem, used to discredit the person in an attempt to discredit the correct logic the person has stated. When wondering which one is likely in error, simply look at who is trying to cancel (personally) whom.
 
Regarding the Big Bang theory: has anyone really looked at black holes and wondered what they are doing sucking in all the planets in their sphere of influence?
Could the black hole be a wormhole to another, interdimensional universe, where it spews out the planets once again rotating in their original orbits, creating solar systems, planets, stars, and suns in what was a partially empty universe? Or, to keep things in balance, could this interdimensional universe be creating its own black holes and creating new planetary systems in our universe, creating a great burst of light when they suddenly appear?
 
Yes, a black hole will suck in anything that gets too near, then rip it apart, ionize each atom into elementary particles, and add them to its mass.
Yes, a black hole might be a portal into another black hole elsewhere in the universe. But nothing but elementary particles can enter the black hole, so no probe or astronaut, nor any planet, solar system, or galaxy, could ever enter a black hole, come back out, and resume its form. There is no way to pass the information.
 
Regarding the Big Bang theory: has anyone really looked at black holes and wondered what they are doing sucking in all the planets in their sphere of influence?
Could the black hole be a wormhole to another, interdimensional universe, where it spews out the planets once again rotating in their original orbits, creating solar systems, planets, stars, and suns in what was a partially empty universe? Or, to keep things in balance, could this interdimensional universe be creating its own black holes and creating new planetary systems in our universe, creating a great burst of light when they suddenly appear?
If we are living within the horizon of the ultimate one, or even a penultimate one, as even some physicists have hinted, whether joking or not, we aren't doing too badly. We being our own "information", the "information" often mentioned by Stephen Hawking, among others, that could exist on both sides of the horizon of a black hole: possibly existing in two places at once, two positions at once, the same information regarding that horizon, to the inside and to the outside of it. It wouldn't be the first time, for all of us, that we were existing both inside a horizon and outside of it, looking at it from a constant distance . . . it keeps its distance ever constant to us simply because we exist both inside and outside of it at all times while on the surface of the Earth.
 
Regarding the Big Bang Theory has anyone really looked at Black Holes and wonder what they are doing sucking in all planets in its sphere of influence.
Keep in mind that the "suction action" is still limited by its mass. A black hole the mass of the Sun, for example, would have the exact same "sucking action" as the Sun has now. This BH would, however, have a radius of only a few miles, so objects coming close would be greatly affected.
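For the curious, the "few miles" figure is just the Schwarzschild radius, r_s = 2GM/c^2; a quick check (standard physics, my own snippet):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg

r_s = 2 * G * M_sun / c**2   # Schwarzschild radius of a 1-solar-mass black hole
print(f"r_s = {r_s / 1000:.2f} km ({r_s / 1609.34:.2f} miles)")  # ~2.95 km, ~1.8 miles
```

At any distance larger than the Sun's current radius, the gravity of such a black hole would be indistinguishable from the Sun's; only well inside that radius does the difference show up.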
 
Yes, a black hole will suck in anything that gets too near, then rip it apart, ionize each atom into elementary particles, and add them to its mass.
Yes, a black hole might be a portal into another black hole elsewhere in the universe. But nothing but elementary particles can enter the black hole, so no probe or astronaut, nor any planet, solar system, or galaxy, could ever enter a black hole, come back out, and resume its form. There is no way to pass the information.
The term "spaghettification" applies to BHs. For SMBHs, interestingly, the outer region of the EH has a much gentler gravity gradient, so astronauts would not be similarly stretched and would survive entry; but, like the Hotel California, you'll never get out, and there is no wine there. ;)
 
Surely some people here have read Stephen Hawking's 'A Brief History of Time', where a physicist-philosopher is giving a lecture on physics and an old lady disputes him. She tells him the universe sits on the back of a turtle, and he, sure he's got her in a corner, asks her what's underneath the turtle. Her comeback left him speechless: "It's turtles all the way down!"
 
Surely some people here have read Stephen Hawking's 'A Brief History of Time', where a physicist-philosopher is giving a lecture on physics and an old lady disputes him. She tells him the universe sits on the back of a turtle, and he, sure he's got her in a corner, asks her what's underneath the turtle. Her comeback left him speechless: "It's turtles all the way down!"
Yes, that's a great story. I think it was Eddington that gave that account.
 
Helio:
"The term "spaghettification" applies to BHs. For SMBHs, interestingly, the outer region of the EH has a much gentler gravity gradient so astronauts would not be similarly stretched, and would survive entry, but, like the Hotel California, you'll never get out, but there is no wine there."

This is correct; the tearing apart of objects and atoms happens at different radii depending on the size of the black hole. For a sufficiently large one, you would not be aware of crossing the event horizon. Severe tidal forces would not occur until later in the voyage, nearer the singularity.
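A rough Newtonian estimate of my own backs this up: the tidal stretch across a body of height h at radius r is about 2GMh/r^3, and evaluated at the horizon r_s = 2GM/c^2 it scales as 1/M^2, so bigger black holes are gentler at the horizon.

```python
G = 6.674e-11       # m^3 kg^-1 s^-2
c = 2.998e8         # m/s
M_sun = 1.989e30    # kg
h = 2.0             # astronaut height, meters

def tidal_at_horizon(M):
    """Newtonian tidal acceleration across height h, evaluated at the Schwarzschild radius."""
    r_s = 2 * G * M / c**2
    return 2 * G * M * h / r_s**3

# Stellar-mass hole: catastrophic stretching already at (and well outside) the horizon.
print(f"10 M_sun:  {tidal_at_horizon(10 * M_sun):.2e} m/s^2")
# Supermassive hole (assumed 4e6 M_sun, roughly Sgr A*-sized): gentler than Earth's gravity.
print(f"4e6 M_sun: {tidal_at_horizon(4e6 * M_sun):.2e} m/s^2")
```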
 

Catastrophe

"Science begets knowledge, opinion ignorance.
OK, this is actually an assumption, but it is necessary in order to do much of anything else in cosmology. If it is not accepted, then most of the rest of what we think we know has to be discarded.

For some time there has been an alternative to, or modification of, the cosmological principle, an issue I have not seen mentioned here. Please correct me if I have missed any such mention.

There are models of universes which are homogeneous but not isotropic... If the assumption is relaxed, so that the former condition holds but not the latter, then the allowed solutions of the equations of general relativity are called Bianchi models, after the Italian mathematician Luigi Bianchi.
Oxford Dictionary of Astronomy, ed. Ian Ridpath, OUP, 2012.

See also:

Bianchi universes - Scholarpedia, http://www.scholarpedia.org › article › Bianchi_univer... (12 Apr 2017): "Bianchi universes are the class of cosmological models that are homogeneous but not necessarily isotropic on spatial slices, ..."

Bianchi classification - Wikipedia, https://en.wikipedia.org › wiki › Bianchi_classification: "In mathematics, the Bianchi classification provides a list of all real 3-dimensional Lie algebras (up to isomorphism). The classification contains 11 ..."

Bianchi cosmology - Oxford Reference, https://www.oxfordreference.com › view › authority.2...: "The study of universes which are homogeneous but not isotropic. Standard cosmological models assume that the Universe, on cosmological scales, ..."

Bianchi type I cosmological models - Caltech thesis by K. C. Jacobs (1969, cited by 19), https://thesis.library.caltech.edu › ...: "Following this introduction we investigate in great detail anisotropic cosmologies and cosmological models of Bianchi Type I. Our primary goal ..."
Cat :)
 
