Discussing the effects of "inflation" on the measurement of temperature seems too hard to corral into a simple kinetic-physics discussion. So, I want to step away from the 10 meter cubic box used in the previous thought experiments and instead use the cosmic microwave background radiation to think about some potentially relevant factors. That has the advantage of being a real phenomenon with actual observations that don't have to be imagined. But the interpretation of that phenomenon has involved a lot of imagination. So, my process is to look for alternative assumptions, see what differences they would make in the conclusions, and then see if those assumptions can be tested.
As I understand the theory by which the expansion of the universe produced the background microwaves, the whole early universe was filled with what was basically hydrogen plasma: protons and free electrons. Expansion of space itself let the plasma expand and thus cool until hydrogen atoms formed, the protons capturing electrons into quantum-state orbitals to make neutral hydrogen atoms. Neutral hydrogen cannot absorb photons whose energy falls below the energies needed to excite its electron between quantum states or to strip it from the proton entirely. So, once electrons became bound to protons to make electrically neutral hydrogen atoms, the photon radiation that had been continually emitted and reabsorbed in the universe was able to travel without being absorbed. That radiation spectrum is thought to represent a blackbody temperature of about 3,000 K. A factor of about 1,090 of additional expansion of space since that point in time, roughly 13.8 billion years ago, is thought to have stretched the wavelengths of the photons traveling through it, so that the resulting wavelengths match the blackbody radiation spectrum of a temperature of 2.725 K as we see that radiation today.
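As a quick arithmetic check on those figures: blackbody temperature scales inversely with the linear expansion factor, so 2.725 K today times an expansion factor of about 1,091 should give back roughly the 3,000 K emission temperature. A minimal sketch (the round values are approximate, not measured to this precision):

```python
# Blackbody temperature scales inversely with the expansion factor:
# T_then = T_now * (1 + z), where z ~ 1090 is the redshift of last scattering.
T_now = 2.725            # K, CMB temperature observed today
z = 1090                 # approximate redshift of last scattering
T_then = T_now * (1 + z)
print(f"Temperature at last scattering ~ {T_then:.0f} K")  # about 3,000 K
```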
Also, stars are not thought to have formed yet at that time.
Working from that, I have several questions.
First, why could stars not have formed while the universe was filled with hydrogen plasma, since that is what the earliest stars were made of to begin with? Plasma has mass, and mass concentrations create stars when they collapse, making the hydrogen plasma dense enough and hot enough to fuse into helium. The average density back then would have been about 1,090 cubed (roughly a billion) times what it is today, since density scales with the cube of the linear expansion factor, and it would obviously have had variations. How do we know that such plasma concentrations could not have developed some stars doing fusion while the universe was still opaque to photons?
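To put a rough number on the densities involved in this first question: taking today's mean baryon density to be on the order of 0.25 hydrogen atoms per cubic meter (an assumed round value, not something stated above) and scaling by the cube of the expansion factor gives the mean density at last scattering. A minimal sketch:

```python
# Rough scale for the mean baryon density at last scattering.
# Assumed round value: today's mean baryon density ~0.25 H atoms per m^3.
n_now = 0.25                 # m^-3, assumed present-day mean baryon density
expansion = 1091             # linear expansion factor since last scattering
n_then = n_now * expansion**3
print(f"Mean density then ~ {n_then:.2g} per m^3 "
      f"(~{n_then / 1e6:.0f} per cm^3)")
```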
Second, what about the ionization temperature? I found this: "[h]ydrogen is present almost entirely in the form of neutral hydrogen (H I) for temperatures below about 7000 K, but above that temperature there is a rapid transition so that above about 10,000 K the hydrogen is present almost completely as ionized hydrogen (H II). In the region 7000 - 10,000 K the gas is a mixture of H I and H II." See http://csep10.phys.utk.edu/OJTA2dev/ojta/c2c/ordinary_stars/harvard/ionization_tl.html
How does the 3,000 K temperature of the CBR at emission fit this picture for hydrogen ionization as measured in our part of the universe at our current time? Does the BBT make some adjustment to the hydrogen ionization energy for differences in the electron energy states of hydrogen atoms as a function of how much inflation those atoms have experienced? I have not heard of one, but it seems like a valid question to think about. If inflation is indeed expanding everything, including atoms, then it is logical to ask whether that would change the measured properties of those atoms.
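For what it's worth, the standard tool relating ionization temperatures like those quoted above to the gas they describe is the Saha equation, which makes the neutral/ionized transition depend on density as well as on the 13.6 eV ionization energy. Stellar atmospheres are enormously denser than the early universe, which is the conventional way the 7,000 - 10,000 K stellar figure and the ~3,000 K recombination figure are reconciled, without any change to the ionization energy itself. A hedged sketch, where today's mean baryon density of ~0.25 per cubic meter is an assumed round value:

```python
# Sketch: hydrogen ionization fraction from the Saha equation at the mean
# cosmic baryon density, showing the neutral/ionized transition near 3000-4000 K
# at those low densities. Assumed value: ~0.25 baryons per m^3 today.
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
m_e = 9.1093837e-31           # electron mass, kg
h = 6.62607015e-34            # Planck constant, J s
chi = 13.6 * 1.602176634e-19  # hydrogen ionization energy, J
T_CMB_NOW = 2.725             # K, CMB temperature today
N_BARYON_NOW = 0.25           # m^-3, assumed present-day mean baryon density

def ionization_fraction(T):
    """Solve the Saha relation x^2 / (1 - x) = S / n for hydrogen at
    temperature T, using the mean cosmic baryon density at the epoch when
    the radiation temperature was T (density scales as (T / T_now)^3)."""
    n = N_BARYON_NOW * (T / T_CMB_NOW) ** 3
    S = (2 * math.pi * m_e * k_B * T / h**2) ** 1.5 * math.exp(-chi / (k_B * T))
    r = S / n
    # quadratic x^2 + r*x - r = 0; take the positive root, which lies in (0, 1)
    return (-r + math.sqrt(r * r + 4 * r)) / 2

for T in (6000, 4000, 3000):
    print(f"T = {T} K: ionized fraction ~ {ionization_fraction(T):.3g}")
```

At cosmic densities the gas is almost fully ionized at 6,000 K but almost fully neutral by 3,000 K, whereas the much denser stellar atmospheres described in the quote shift the same transition up to the 7,000 - 10,000 K range.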
That is all I have time for right now. It would be helpful if somebody with more familiarity with the assumptions and the resulting calculations used in the BBT would address the two questions I have just posted.