Here's a set of testable hypotheses about dark matter for you.

The “second” is a normalization unit for time, and the normalization factor used to convert unnormalized time into seconds is `1/(d_c * t_c)`, where `d_c` is the constant distance light travels in a second and `t_c`, which asymptotically approaches zero, is the time experienced by light in a vacuum during that second.

- This definition of the second is consistent with the accepted understanding of relativistic phenomena approaching light-speed, and it is understandable that human observers (i.e. scientists) have inadvertently normalized against light, because that is the limitation of their perception.
- The normalization factor would explain the observed phenomenon that, when measured with seconds, nothing can exceed the speed of light. It is a mathematical consequence of normalization itself and does not imply that faster-than-light movement is impossible.
- **If observations of the distribution of dark matter in the universe were calculated using seconds and dark matter were perfectly correlated with __unnormalized time__, then the observed distribution of dark matter would be uniform everywhere as a mathematical consequence.**
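The normalization described above can be sketched numerically. This is purely illustrative: the value of `t_c` is not specified anywhere in the text (it is only said to approach zero), so the values below are hypothetical placeholders chosen to show the factor's behavior.

```python
# Illustrative sketch of the proposed normalization factor 1/(d_c * t_c).
# d_c: distance light travels in one second (the defined value, in metres).
# t_c: the hypothesized time experienced by light during that second;
#      the text states only that it approaches zero, so values here are
#      placeholders.

D_C = 299_792_458.0  # metres light travels in one second


def normalization_factor(t_c: float) -> float:
    """Factor converting unnormalized time into seconds: 1 / (d_c * t_c)."""
    return 1.0 / (D_C * t_c)


def to_seconds(unnormalized_time: float, t_c: float) -> float:
    """Apply the factor to a span of unnormalized time."""
    return unnormalized_time * normalization_factor(t_c)


# As t_c shrinks toward zero, the factor diverges, so a fixed unnormalized
# interval maps to an ever-larger count of seconds.
for t_c in (1e-3, 1e-6, 1e-9):
    print(t_c, to_seconds(1.0, t_c))
```

Note that the divergence as `t_c → 0` is what the text relies on when it argues that the speed-of-light ceiling is an artifact of the normalization rather than a physical limit.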

Applying this normalization factor to Einstein’s formula for mass-energy equivalence, `E = mc^2`, and solving for `t` yields:

`t = (d_c^3 / c*) * sqrt(m/E)`

where `c*` is the unnormalized speed of light, which asymptotically diverges to infinity.

- This formula can be interpreted as the time experienced by a *cube filled with light* during a second, adjusted by a factor of the mass-energy ratio of the inertial frame, and the cubic term implies the use of a three-dimensional coordinate system upon which to map that mass and energy.
- **If light experiences more time when passing by massive objects and light must maintain its angular momentum (i.e. always travel `d_c` in a second), then the light must curve to form an arc that intercepts its straight-line path.** The curvature of this arc can be expressed in terms of its localized, unnormalized speed and the unnormalized speed of light in a vacuum. This equation predicts – and could be verified against – the curvature of light used to confirm the theory of general relativity, without any prior knowledge of how physicists arrived at that theory.
- **If dark matter were perfectly correlated with unnormalized time, then this equation would provide an analytical solution that should match current observations of dark matter that have been transposed from four-dimensional space-time to a three-dimensional Cartesian coordinate grid and adjusted for time normalization.**
- Other tests of this hypothesis abound, because the formula suggests that, by substituting unnormalized time for seconds in any formula of the Standard Model, one could arrive at identical solutions for phenomena that previously required the use of space-time as a concept.
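The proposed formula can be evaluated directly once values are chosen. This is a minimal sketch under stated assumptions: the text gives no measured value for the unnormalized speed of light `c*`, so the values below are hypothetical, chosen only to show how `t` behaves as `c*` grows without bound.

```python
import math

# Illustrative evaluation of the hypothesized relation
#   t = (d_c^3 / c*) * sqrt(m / E)
# d_c is the distance light travels in one second; m and E are the mass
# and energy of the inertial frame. c* (c_star) has no given value in the
# text, so the values below are placeholders.

D_C = 299_792_458.0  # metres light travels in one second


def unnormalized_time(m: float, e: float, c_star: float) -> float:
    """Hypothesized time experienced by a 'cube filled with light',
    scaled by the mass-energy ratio of the inertial frame."""
    return (D_C ** 3 / c_star) * math.sqrt(m / e)


# As c* grows without bound (as the text claims it does), t shrinks
# toward zero for any fixed mass-energy ratio.
for c_star in (1e9, 1e12, 1e15):
    print(c_star, unnormalized_time(1.0, 9e16, c_star))
```

Here `m = 1.0` kg and `E = 9e16` J were picked so that `m/E` matches `1/c^2` to first approximation; any other pair would serve equally well for illustration.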

I may not be a physicist, but I am an adherent of the scientific method. I recognize that the formula I am providing as part of these hypotheses will be hard to swallow, because it suggests that gravitational effects do not imply distortions in time, and that unnormalized time (i.e. what we have observed as dark matter but, under this hypothesis, have measured incorrectly) may actually be the source of effects we have previously understood to be gravity and other forces.

Under such a hypothesis, black holes are characterized as regions of infinite (but not uniformly infinite) unnormalized time. Because we have dispensed with the speed of light as a universal maximum, we can consider a far simpler explanation for quantum superposition: when a particle moves faster than light from point A to point B and then back to point A, it will appear to be in multiple places at once until the light catches up to the observer. Quantum gravitational effects would likewise be due to unnormalized time.
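The light-delay argument in the superposition claim can be checked with simple kinematics. This toy calculation assumes a one-dimensional layout (observer, then A, then B along a line) with hypothetical distances, and simply compares when the light signals from each event reach the observer.

```python
# Toy light-delay calculation for the superposition claim above.
# Layout (hypothetical): an observer sits at distance L from point A;
# point B lies a further distance d beyond A on the same line. The
# particle travels A -> B -> A at speed v. We compare the arrival times
# of the light signals (images) from each event at the observer.

C = 299_792_458.0  # speed of light, m/s


def arrival_times(L: float, d: float, v: float):
    at_a_start = 0.0 + L / C        # image of the particle departing A
    at_b = d / v + (L + d) / C      # image of the particle arriving at B
    back_at_a = 2 * d / v + L / C   # image of the particle back at A
    return at_a_start, at_b, back_at_a


L, d, v = 1.0e9, 1.0e9, 10 * C  # hypothetical distances (m) and speed
start, at_b, back = arrival_times(L, d, v)

# For any v > c, the "back at A" image arrives before the "at B" image,
# so the observer briefly sees the particle in two places at once.
print(back < at_b)  # True when v > c
```

Algebraically, `back < at_b` reduces to `d/v < d/c`, i.e. it holds exactly when `v > c`, which is why the appearance of multiplicity in this toy model depends on faster-than-light travel.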

As scientists, our job when confronted with verifiable hypotheses that have explanatory and empirical weight (__and both of those are substantial in this case__) is to be open to the possibility that our understanding has been incomplete, to test those hypotheses, and to grow collectively from the results. If you are able to verify them, I have additional model predictions that I could share, and verification should be quick for anyone with a graduate background in physics, which I do not have.