Hmm, this seems to have been an interesting thread. There are some good questions here, particularly the bit on whether we can treat time as passing at the same rate everywhere (quite simply, you cannot; it's called General Relativity for a reason). There are some important nuances here that I think even most physicists miss, in part because they don't recognize the implications of some of the underlying assumptions they make in their models.
To start, I should introduce what may be the most underappreciated paper in all of physics: "Inhomogeneous anisotropic cosmology" by Matthew Kleban and Leonardo Senatore. Even the authors did not recognize all of the mathematical consequences of their proof, though to be fair some were at least mentioned as possible future work.
The proof, which they call the no big crunch theorem, has a number of very curious properties. It is derived in the limit of nontrivial flat or open manifolds with 3 spatial dimensions and 1 time dimension, initially expanding (or contracting), with nontrivial inhomogeneous and anisotropic initial conditions.
This means we are not allowed to do what cosmologists, and frankly physicists in general, usually do: assume the system is sufficiently symmetric that we can neglect half of the off-diagonal terms by assuming they are symmetric. Keep this in mind, as it will be very important.
The clever part of the proof is that they take a metamathematical approach and use a proof by contradiction to show that, in this limit, the two conditions needed for a universe to reach a maximum spatial volume at some finite time can never both be satisfied, i.e. they are mutually contradictory.
This arises as a consequence of the interaction between gravity and expansion, specifically through the effect on time: in a universe with inhomogeneities, as you might gather, mass is going to gather into over-densities. The clever bit of the proof is that they show, through computational simulations on supercomputers, that this attraction will always produce more under-densities than over-densities.
The way expansion comes into play, however, is that even if we naively assume an initially constant rate of expansion, the rate at which time passes in general relativity is by definition relative, with only the total magnitude of the four-velocity vector being fixed at the arbitrary constant we define as c, conventionally called the speed of light or, more appropriately, the speed of causality. As anyone familiar with special relativity knows, acceleration changes the direction of this vector, causing length contraction and time dilation. This remains true in general relativity, where additional effects arise from the local metric curvature.
Suffice to say that, because of the fixed relative length of the four-velocity vector associated with any frame of reference, regions of space that deviate from flatness will experience different measures of spatial length and different rates of time passing.
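As a quick refresher (standard GR bookkeeping, nothing specific to the paper), the four-velocity normalization and the time dilation it produces for a static observer in a Schwarzschild metric look like:

$$g_{\mu\nu} u^{\mu} u^{\nu} = -c^2, \qquad \frac{d\tau}{dt} = \sqrt{1 - \frac{2GM}{r c^2}}$$

The first equation is what fixes the "length" of the four-velocity; the second shows how proper time $\tau$ slows relative to coordinate time $t$ as curvature (here set by $M/r$) increases.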
Because the rate of expansion in any interval of space depends on local conditions, the rate of expansion can never be constant unless the universe in question is the trivial null (empty) solution. That is to say, the rate of expansion anywhere is a local property. We will return to this later, as there are some profound implications, but for now I'll continue describing the theorem in the context of the authors' work.
More importantly, because the tilting of the four-velocity in any region of space depends on its curvature, the rate of expansion is slowest in and around massive bodies, an effect which grows stronger the more curvature (gravity) is present. This means over-densities will experience less time, and thus less expansion, than the under-densities, which we know will always outnumber over-densities at a nonlinear rate, since this depends on volume.
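To put a rough number on the effect (a textbook figure, not from the paper): at the surface of the Sun, $2GM_\odot/(R_\odot c^2) \approx 4\times10^{-6}$, so a clock there runs about 2 parts per million slower than one far away. Tiny per unit time, but over cosmological timescales these differentials accumulate, which is the whole point.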
This results in a positive feedback effect with some strange consequences: the local rate of expansion will now be a nonzero vector quantity, with the overall rate of expansion being some kind of higher-rank differential tensor. By formalism, I suspect it is a rank-3 tensor built from stacks of the various time derivatives of the full Einstein field equations, since you are effectively stacking matrix time derivatives. Suffice to say this will be an absolute monster to calculate, but what matters for now is that in the inhomogeneous and anisotropic case these elements are all irreducibly nonzero and asymmetric. This will have big implications, so keep this in mind.
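Purely as a notational sketch of what I mean by "stacking matrix time derivatives" (my own guess at a formalism, nothing from the paper itself): collect the successive time derivatives of the metric into a three-index object,

$$\mathcal{E}_{\mu\nu}{}^{(k)} \equiv \partial_t^{\,k}\, g_{\mu\nu}, \qquad k = 0, 1, 2, \ldots$$

where each layer $k$ is a full $4\times4$ matrix and the stack over $k$ supplies the third index.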
The final formalism they reach for their proof ultimately shows that for any initially expanding universe, in the limiting cases considered, no maximum *total* spatial volume can exist in *any* time-slice of spacetime. Thus any initially expanding universe will grow forever and, conversely, any initially contracting universe will contract forever, these two outcomes being time-reversed counterparts. This is where the authors left things, aside from a mention that you can likely represent a sufficiently large closed universe in the same manner mathematically, because what matters is, after all, the rate of expansion compared to the concentration of matter. From this point on the paper shifts focus to inflation, which I would argue distracts from something even more fundamental.
What first caught my eye in this paper is how suspiciously similar the effect of these off-diagonal elements is to what cosmologists call "dark energy". Notably, there is literature finding that, under the typical assumptions cosmologists make, these terms are indistinguishable from dark energy, except that unlike dark energy they will always be directionally dependent. (Which is exactly what has been found by research that doesn't implicitly or explicitly assume the cosmological principle is valid.)
Perhaps more important here is the last formalism the authors use, as it looks suspiciously similar to the famous second law of thermodynamics if we swap out total spatial volume for the quantity known as entropy. More importantly for this context, they showed that this total spatial volume is related to a scalar quantity which effectively determines the end state of any universe. These two properties being present simultaneously, as the criterion for the Einstein field equations to be self-consistent for all possible choices of initial conditions, screams for information theory to be applied.
After all, in information theory, information is defined as that which is needed to completely describe the state of a system. Entropy can be derived naturally within information theory, and for physical systems we know it generally corresponds to thermodynamic entropy. Notably, we know from both theory and observation that solutions in general relativity are path dependent. To cosmologists this is inconvenient and thus generally ignored by assuming the path dependencies cancel out; however, as we see from the proof above, this is *never* a valid assumption.
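For reference, the two textbook definitions I'm leaning on here, Shannon's information entropy and the Gibbs entropy of statistical mechanics, differ only by units (a factor of $k_B \ln 2$ for the same probability distribution):

$$H = -\sum_i p_i \log_2 p_i, \qquad S = -k_B \sum_i p_i \ln p_i$$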
In fact, from a thought experiment about how information propagates in an expanding universe, we can see that the entropy link here must involve the total volumetric light cone: if we look at any local time-slice of space, all the information, including that which has since left the local cosmological horizon, must still be accounted for, as otherwise you are no longer solving the same system of differential equations. Here I noted the similarity to Hawking's work, except that here we have a volume dependence rather than an area dependence. Curious, no? What I eventually recognized is that time-slices are a local, frame-dependent property, so we can think of this volume as a sum over all possible paths. But what happens in the limit where all paths are equal, i.e. the metric is "flat"?
For any given time-slice we can construct a volumetric horizon surface, and thus we get a path integration along a volumetric surface, which in the flat-space limit reduces to a 3-dimensional line integral. From vector calculus this should instantly be recognizable as Stokes' theorem, which lets us convert our integral into a surface integral of the curl over the surface's area vector element. Thus the Hawking-radiation entropy per unit time-slice for a cosmological event horizon is likely a special case of a more general volumetric entropy per unit time-slice.
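For reference, the two standard results in play here are Stokes' theorem and the Bekenstein-Hawking horizon entropy:

$$\oint_{\partial S} \mathbf{F}\cdot d\boldsymbol{\ell} = \iint_{S} \left(\nabla\times\mathbf{F}\right)\cdot d\mathbf{A}, \qquad S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}$$

The second is the area law that I'm suggesting falls out as the flat-space limiting case of a more general volume law.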
You would still need to integrate over all possible time-slices, so it's not quite finished, of course. In fact, to finish this correctly we would have to extend this to the total volume of the universe, not just one's local horizon of the observable universe, integrating over the set of all possible cosmological horizons to get the actual volume-entropy relationship for a given universe. Naturally, that will be an extremely nonlinear mathematical nightmare with no analytical solutions, but it does provide a kind of limiting-case argument.
Regardless, the no big crunch theorem appears to be a more general form of the second law of thermodynamics, in which the total volume of a universe is linked to a cosmological constraint on all metrics. Notably, this gives us not only the laws of thermodynamics and the arrow of time; because it couples local and nonlocal universal properties, we also meet the criterion of a nonlocal hidden-variable theory, which means the Einstein field equations become naturally consistent with Bell's inequality.
Next we can go back to a related insight: the off-diagonal elements of the metric are in general irreducibly nonzero and asymmetric. If you remember quantum mechanics, you might be familiar with an integration of this form, namely the integration over antisymmetric wavefunctions in the Schrödinger equation, which is what leads to the Pauli exclusion principle and Fermi-Dirac statistics.
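The textbook version, for a two-fermion state built from single-particle states $\phi_a$ and $\phi_b$:

$$\psi(x_1, x_2) = \frac{1}{\sqrt{2}}\left[\phi_a(x_1)\phi_b(x_2) - \phi_b(x_1)\phi_a(x_2)\right]$$

Antisymmetry means $\psi$ vanishes identically when $a = b$: no two fermions can occupy the same state, which is the exclusion principle.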
It is thus possible to recognize that if the metric is to be quantized, something similar must occur, this time in relation to elements of the metric, leading to a surprising conclusion: not only does the rate of expansion in the metric evolve, it does so as an automatic consequence of applying conservation of information to the Einstein field equations, in such a way that the metric at all possible points in spacetime (within such a quantum limit) must be *uniquely* defined.
So in essence all observers must be uniquely special, and we can likely represent the metric (or at least its off-diagonal elements) in the quantum limit as a sum over spinor couplings of possible (and/or impossible) causal interactions.
What we get, in essence, is that for the full Einstein field equations to be internally self-consistent, information conservation and causality must hold for all possible initial conditions. This is only possible if the Einstein field equations include nonzero nonlocal elements which, when factored in, result in the metric at all points in spacetime being uniquely defined, consistent with a sum over spinor states corresponding to all possible informational interaction pairs.
What in physics is both nonlocal and involves coupled interaction pairs consistent with spinors obeying Fermi-Dirac statistics? I know of only one thing that meets both criteria: quantum entanglement.
Thus, in effect, if you treat the spinor couplets as imaginary wormholes, you can derive that the criterion for internal consistency is really the same as the ER=EPR conjecture.
Without this condition the Einstein field equations cannot be internally self-consistent, and thus we can eliminate any and all possible variations of the so-called cosmological constant as logically invalid for any and all possible nontrivial solutions.
So sure, you can solve Lambda-CDM, but that will never be an actual solution to the Einstein field equations, and thus to general relativity, because the ignored terms blow up to dominate the metric at cosmological distances. This also, notably, resolves the so-called information paradox in a frankly trivial manner, while completely eliminating "dark energy".
And all it requires is actually solving the real Einstein field equations, not the convoluted, and now mathematically falsified, simplified model, which quite frankly just ignores 6 of the 16 linearly independent differential equations that compose the Einstein field equations, because those terms make the mathematics irreducibly nonlinear.
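To be explicit about the counting (standard bookkeeping, framed the way I'm using it): the field equations

$$G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}$$

form a $4\times4$ system of 16 component equations, and it is precisely the symmetry assumption $g_{\mu\nu} = g_{\nu\mu}$ that discards 6 of them, leaving the 10 everyone actually solves.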
And it's kind of dumbfounding how the metric appears to quite literally self-quantize just by our no longer dropping terms (while eliminating the wild goose chase that is dark energy, and quite possibly more). It effectively implies that Einstein unknowingly derived quantum gravity in 1915; we just didn't notice because the full mathematics needed was well beyond what humans are capable of computing in a lifetime.
Oh, I should also note that if the minimum gravitational contribution applies not only to the off-diagonal nonlinear terms but also to the nonlocal diagonal terms, you can potentially derive MOND-like spline gravity as a consequence of the relative strengths of the classical local contributions and the nonlocal quantum contributions, since gravity becomes the integrated sum of the relative flux of entanglement/decoherence between any given qubit elements in the universe. The magnitude of this contribution would be a function of the true size of the universe and its total matter content, but it suggests you might be able to eliminate some, or possibly even all, dark matter from cosmological models, in which case you can explain the universe purely through the full nonlinear Einstein field equations, with normal matter and energy as the sole components. Occam's razor strikes again.
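For concreteness, the "spline" shape I have in mind is the standard MOND interpolating function (textbook MOND, stated here for comparison, not derived from anything above):

$$\mu\!\left(\frac{a}{a_0}\right) a = a_N, \qquad \mu(x) = \frac{x}{1+x}, \qquad a_0 \approx 1.2\times10^{-10}\ \mathrm{m/s^2}$$

which interpolates smoothly between Newtonian gravity ($a \approx a_N$) at high accelerations and the deep-MOND regime ($a \approx \sqrt{a_N a_0}$) at low ones.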
Yes, this is computationally daunting to say the least, and quite possibly (probably) means we will never be able to know the end fate of the universe, but it doesn't matter whether the physics is pretty; it only matters whether it's right.